Article type: Research paper
Author
Department of Civil Engineering, Faculty of Engineering, Shiraz Branch, Islamic Azad University, Shiraz, Iran
Abstract
Keywords
Article title [English]
Author [English]
Precipitation is known as one of the most important factors in the hydrologic cycle. Hence, accurate knowledge of rainfall, which is usually gathered through a group of rain gauges, is necessary for any hydrologic simulation on a watershed. Therefore, rain gauge network design, including determination of the optimum number of stations, is one of the fundamental concerns in any hydrologic study. Several indices are employed to express the accuracy of a rain gauge network. Among them, entropy is a widely used concept for assessing the accuracy of the data collected by a rain gauge network. In earlier studies, entropy-based equations were applied under the assumption that rainfall data are normally distributed, an assumption that may have led to inaccurate results. In this study, the validity of this assumption was examined using the actually observed (non-normal) monthly precipitation data of 24 rain gauges in Fars Province, Iran. A traditional rain gauge network was first designed based on the entropy concept and the observed data. This network was then compared with one designed by a procedure in which the Box-Cox transformation is used to normalize the data beforehand. The results indicate that the station selection priorities change drastically, and the entropy growth rate decreases, as the number of rain gauge stations whose transformed data are used increases.
Keywords [English]
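The following is a minimal sketch, not the authors' code, of the two steps the abstract describes: Box-Cox normalization of monthly rainfall series, followed by a greedy, entropy-based ranking of candidate rain gauges. The Gaussian joint-entropy formula, the small shift added to handle zero-rainfall months, and the synthetic data standing in for the 24 Fars gauges are illustrative assumptions; only scipy's boxcox and standard NumPy routines are real library calls.

import numpy as np
from scipy import stats

def joint_entropy(data):
    # Joint entropy (in nats) of the columns of `data`, assuming they are
    # jointly Gaussian: H = 0.5 * ln((2*pi*e)^k * det(Sigma)).
    k = data.shape[1]
    cov = np.cov(data, rowvar=False).reshape(k, k)
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (k * np.log(2 * np.pi * np.e) + logdet)

def rank_stations(rainfall, normalize=True):
    # Greedily order stations by the joint-entropy gain they contribute.
    # rainfall : (n_months, n_stations) array of monthly precipitation.
    # normalize: if True, apply a Box-Cox transform to each series first
    #            (an assumed shift of 0.1 handles zero-rainfall months,
    #            since Box-Cox requires strictly positive data).
    data = rainfall.astype(float)
    if normalize:
        shifted = data + 0.1
        data = np.column_stack(
            [stats.boxcox(shifted[:, j])[0] for j in range(data.shape[1])]
        )
    remaining = list(range(data.shape[1]))
    selected, gains = [], []
    while remaining:
        # Pick the station that maximizes the joint entropy of the set so far.
        best = max(
            remaining,
            key=lambda j: joint_entropy(data[:, selected + [j]]),
        )
        gains.append(joint_entropy(data[:, selected + [best]]))
        selected.append(best)
        remaining.remove(best)
    return selected, gains

if __name__ == "__main__":
    # Synthetic, skewed (non-normal) rainfall standing in for the 24 gauges.
    rng = np.random.default_rng(0)
    rain = rng.gamma(shape=2.0, scale=20.0, size=(240, 24))
    order_raw, _ = rank_stations(rain, normalize=False)
    order_bc, _ = rank_stations(rain, normalize=True)
    print("priority (raw data):    ", order_raw[:5])
    print("priority (Box-Cox data):", order_bc[:5])

Comparing the two priority orders and the corresponding entropy gains mirrors the comparison reported in the abstract: the same greedy design is run once on the raw (non-normal) series and once on the Box-Cox-normalized series.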