Similar Literature
Found 17 similar records
1.
Hydrological Sciences Journal, 2013, 58(5): 917-935
Abstract

For urban drainage and urban flood modelling applications, fine spatial and temporal rainfall resolution is required, and simulation methods are developed to overcome the problem of data limitations. Although temporal resolution coarser than 10–20 minutes is not well suited for detailed rainfall-runoff modelling of urban drainage networks, in the absence of monitored data, longer time intervals can be used for master planning or similar purposes. A methodology is presented for temporal disaggregation and spatial distribution of hourly rainfall fields, tested on observations for a 10-year period at 16 raingauges in the urban catchment of Dalmuir (UK). Daily rainfall time series are simulated with a generalized linear model (GLM). Next, using a single-site disaggregation model, the daily data of the central gauge in the catchment are downscaled to an hourly time scale. This hourly pattern is then applied linearly in space to disaggregate the daily data into hourly rainfall at all sites. Finally, the spatial rainfall field is obtained using inverse distance weighting (IDW) to interpolate the data over the whole catchment. Results are satisfactory: at individual sites within the region the simulated data preserve properties that match the observed statistics to an acceptable level for practical purposes.
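The final step described above, IDW interpolation of the gauge values over the catchment, can be sketched as follows. The gauge coordinates, rainfall values, and power parameter are illustrative, not those of the Dalmuir study:

```python
import numpy as np

def idw_interpolate(xy_obs, values, xy_target, power=2.0, eps=1e-12):
    """Inverse distance weighting: estimate at each target point as a
    distance-weighted average of the observed gauge values."""
    xy_obs = np.asarray(xy_obs, float)
    values = np.asarray(values, float)
    out = np.empty(len(xy_target))
    for i, p in enumerate(np.asarray(xy_target, float)):
        d = np.hypot(*(xy_obs - p).T)      # distances to all gauges
        if d.min() < eps:                  # target coincides with a gauge
            out[i] = values[d.argmin()]
        else:
            w = 1.0 / d**power
            out[i] = np.sum(w * values) / np.sum(w)
    return out

# Hypothetical example: three gauges, two target points
gauges = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
rain   = [2.0, 4.0, 6.0]
est = idw_interpolate(gauges, rain, [(0.0, 0.0), (0.5, 0.5)])
```

IDW estimates are always bounded by the minimum and maximum observed values, which is one reason stochastic simulation (as in entry 5 below) is sometimes preferred when reproducing variability matters.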

2.
Simulation of quick runoff components, such as surface runoff and the associated soil erosion, requires rainfall intensities at high temporal resolution. However, these data are often not available because such measurements are costly and time consuming. Current rainfall disaggregation methods have shortcomings, especially in reproducing the distribution of storm events. The objective of this study was to improve point rainfall disaggregation using a new magnitude-category disaggregation approach, introduced through a coupled disaggregation scheme (Hyetos and cascade) for multisite rainfall disaggregation. The procedure was tested with ten long-term precipitation data sets from central Germany, treating summer and winter precipitation separately to capture seasonal variability. Results showed that dividing the rainfall amount into four daily rainfall magnitude categories (1–10, 11–25, 26–50, >50 mm) improves the simulation of high rainfall intensities (convective rainfall). The Hyetos category approach (HyetosCat) with seasonal variation reproduces observed hourly rainfall in each month more faithfully than the same model without categories. The mean absolute percentage accuracy of the standard deviation of hourly rainfall is 89.7% in winter and 95.6% in summer. The magnitude-category method applied with the coupled HyetosCat–cascade approach successfully reproduces the statistical behaviour of local 10-min rainfall intensities in terms of both intermittency and variability. The root mean square error for disaggregated 10-min rainfall depth ranges from 0.20 to 2.38 mm in summer and from 0.12 to 2.82 mm in winter across all categories. The coupled stochastic approach preserves statistical self-similarity and intermittency in each magnitude category at a relatively low computational burden.
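The cascade half of such a coupled daily-to-sub-daily scheme can be illustrated with a minimal microcanonical multiplicative cascade: each interval's depth is split into two halves with random weights that sum to one, so total mass is preserved at every level. The uniform weight law and branching depth here are assumptions for illustration, not the calibrated model of the study:

```python
import numpy as np

def cascade_disaggregate(total, levels, rng):
    """Microcanonical multiplicative cascade: split each interval's depth
    into two sub-intervals with weights W and 1-W, preserving total mass."""
    depths = np.array([total], float)
    for _ in range(levels):
        w = rng.uniform(0.2, 0.8, size=depths.size)  # assumed weight law
        depths = np.column_stack([depths * w, depths * (1 - w)]).ravel()
    return depths

rng = np.random.default_rng(42)
daily = 24.0  # mm; a hypothetical daily total
sub_daily = cascade_disaggregate(daily, levels=3, rng=rng)  # 8 three-hour depths
```

Because each split conserves mass exactly, the disaggregated depths always sum back to the daily total, which is the property that makes cascades attractive for coupling with a daily generator.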

3.
The purpose of this paper is to present a graphical method to characterise the nature of a distribution (exponential or algebraic). In the algebraic case, this statistical tool provides an estimation procedure for the parameter characterising the decrease of the survival function. Since the realizations of the random variable under study are available in the form of time series, the method is based on the relationship between the duration of exceeding an intensity threshold and the accumulation of the realizations of the random variable during this length of time. The behaviour of the duration-accumulation graphs (as the reference threshold increases indefinitely) results in a function whose limit depends only on the parameter characterising the algebraic decrease of the probability distribution. The estimate of this parameter is biased, but the bias can be corrected effectively by numerical methods. We applied this method to two rainfall series differing in their geographical origin (Dédougou in Burkina Faso and a station on the Island of La Réunion) and their time step (1 day and 76 seconds, respectively). For both, the behaviour of the tail distributions is shown to be algebraic, and the values of the parameter characterising the algebraic decrease of the two series are very close. This tends to justify the assumption of a multifractal nature for these series. This work was achieved as part of the National Programme of Research in Hydrology of the INSU (project 99 PNRH 27). The authors are grateful to A. Barcello for providing the data from the Island of La Réunion.
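The paper's duration-accumulation graphs are its own contribution, but the underlying task, estimating the exponent of an algebraic survival function P(X > t) ~ t^(-alpha), can be sketched with the standard Hill estimator on a synthetic heavy-tailed sample. The sample and its tail index are invented for illustration:

```python
import numpy as np

def hill_tail_exponent(x, k):
    """Hill estimator of the tail index alpha of an algebraic survival
    function, using the k largest order statistics of the sample."""
    x = np.sort(np.asarray(x, float))[::-1]          # descending order
    logs = np.log(x[:k]) - np.log(x[k])              # log-spacings above x_(k+1)
    return 1.0 / logs.mean()

rng = np.random.default_rng(0)
alpha_true = 2.5                                     # assumed tail index
sample = rng.pareto(alpha_true, size=50_000) + 1.0   # Pareto with x_min = 1
alpha_hat = hill_tail_exponent(sample, k=2_000)
```

Like the paper's estimator, the Hill estimator is biased when the tail is only asymptotically algebraic, which is why the choice of k (here, how far into the tail to look) matters in practice.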

4.
As the use of space-based sensors to observe soil moisture becomes more plausible, it is necessary to validate the remotely sensed soil moisture retrieval algorithms. In this paper, measurements from point gauges on the ground are analyzed as a possible ground-truth source for comparison with remotely sensed data. The design compares a sequence of measurements taken on the ground and from space. The authors review the mean square error of expected differences between the two systems by Ha and North (1994), which is applied to the Little Washita watershed using the soil moisture dynamics model developed by Entekhabi and Rodriguez-Iturbe (1994). The model parameters estimated by Yoo and Shin (1998) for the Washita '92 (relative) soil moisture data are used in this study. By considering about 20 pairs of ground- and space-based measurements (specifically, for the same setting as Washita '92, in which the space-based sensor visits the field of view once a day), the expected error could be reduced to approximately 10% of the standard deviation of the fluctuations of the system alone. This seems to be an acceptable level of tolerance for identifying biases in the retrieval algorithms.

5.
In geostatistics, stochastic simulation is often used either as an improved interpolation algorithm or as a measure of spatial uncertainty. Hence, it is crucial to assess how fast realization-based statistics converge towards model-based statistics (i.e. histogram, variogram), since in theory such a match is guaranteed only on average over a number of realizations. This convergence can be strongly affected by the random number generator being used. Moreover, the usual assumption of independence among simulated realizations of a random process may also be affected by the random number generator. Simulation results obtained using three different random number generators implemented in the Geostatistical Software Library (GSLib) are compared. Some practical aspects are pointed out and some suggestions are given to users of the unconditional LU simulation method.
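An unconditional LU simulation can be sketched as follows: the model covariance matrix is factored once (here with a Cholesky decomposition), and each realization is that factor applied to a vector of independent normal deviates, so the random number generator enters only through those deviates. The 1-D exponential covariance model and its parameters are illustrative, not GSLib's implementation:

```python
import numpy as np

def lu_simulate(coords, n_real, seed, sill=1.0, corr_range=10.0):
    """Unconditional LU (Cholesky) simulation of a 1-D Gaussian field
    with an exponential covariance model. Each column of the result is
    one realization; the RNG enters only through `seed`."""
    coords = np.asarray(coords, float)
    d = np.abs(coords[:, None] - coords[None, :])     # pairwise distances
    C = sill * np.exp(-3.0 * d / corr_range)          # exponential covariance
    L = np.linalg.cholesky(C + 1e-10 * np.eye(len(coords)))  # jitter for stability
    z = np.random.default_rng(seed).standard_normal((len(coords), n_real))
    return L @ z

x = np.arange(50.0)
reals = lu_simulate(x, n_real=500, seed=1)
ens_var = reals.var(axis=1)   # ensemble variance at each location
```

Any single realization deviates from the model statistics; only the ensemble variance across many realizations approaches the sill, which is exactly the convergence behaviour (and RNG sensitivity) the abstract is concerned with.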

6.
Logarithmic sensitivities and plausible relative errors are studied in a simple no-crossflow model of a transient flowmeter test (TFMT). This model is identical to the model of a constant-rate pumping test conducted on a fully penetrating well with wellbore storage, surrounded by a thick skin zone and situated in a homogeneous confined aquifer. The sensitivities of wellbore drawdown and wellface flowrate to aquifer and skin parameters are independent of the pumping rate. However, the plausible relative errors in the aquifer and skin parameters estimated from drawdown and wellface flowrate data can be proportionally decreased by increasing the pumping rate. The plausible relative errors vary by many orders of magnitude from the beginning of the TFMT. The practically important flowrate and drawdown measurements in this test, for which the plausible relative errors vary by less than one order of magnitude from their minima, can begin approximately when the dimensionless wellface flowrate exceeds q_D = q/Q ≈ 0.4. During most of this stage of the test, the plausible relative errors in aquifer hydraulic conductivity (K_a) are generally an order of magnitude smaller than those in aquifer specific storativity. The plausible relative errors in the skin hydraulic conductivity (K_s) are generally larger than those in the aquifer specific storativity when the thick skin is normal (K_s > K_a) and smaller when the thick skin is damaged (K_s < K_a). The estimate of the specific storativity of the skin zone would be so biased that one should not even attempt to obtain it from the TFMT. We acknowledge Wiebe H. van der Molen for recommending the De Hoog algorithm and sharing his code. This research was partially supported by the US Geological Survey, USGS Agreement #1434-HQ-96-GR-02689, and the North Carolina Water Resources Research Institute, WRRI Project #70165.

7.
Conventional nonparametric tests have been widely used in many fields for the residual analysis of a model fitted to observations. More recently, the BDS (Brock–Dechert–Scheinkman) statistic has been shown to be a powerful tool for residual analysis, especially for nonlinear systems. The purpose of this study is to compare the power of the nonparametric tests and the BDS statistic in residual analysis of fitted models. The study evaluates stochastic models for four monthly rainfall series in Korea through residual analysis using both the conventional nonparametric tests and the BDS statistic. SARIMA and AR error models are fitted to each rainfall series and the residuals are analysed with both sets of tests. The results indicate that the BDS statistic is more suitable than the conventional nonparametric tests for residual analysis, and that the AR error model may be more appropriate than the SARIMA model for modelling monthly rainfall. This work was supported by grant No. R01-2001-000-00474-0 from the Basic Research Program of the Korea Science & Engineering Foundation.

8.
An embedded renewal process as a model of precipitation occurrence is fairly well known. However, data permitting verification of the assumptions defining such a model, for example the independence and distribution of sojourn times, are rather rarely available, due to problems with the definition of a precipitation event. An operational definition is obtained by associating precipitation occurrence with an opening sojourn time of a precipitation collector. This paper describes identification of the renewal model using such a definition and applies the model to compare several different precipitation collectors.

9.
Many sites have contaminated groundwater because of inappropriate handling or disposal of hazardous materials or wastes. Health risk assessment is an important tool for evaluating the potential environmental and health impacts of these contaminated sites, and it is becoming an important basis for determining whether risk reduction is needed and what actions should be initiated. However, in research related to groundwater risk assessment and management, consideration of multimedia risk assessment, and the separation of uncertainty due to lack of knowledge from variability due to natural heterogeneity, are rare. This study presents a multimedia risk assessment framework integrating multimedia transfer and multi-pathway exposure of groundwater contaminants, and investigates whether multimedia risk assessment and the separation of uncertainty and variability can provide a better basis for risk management decisions. The results of the case study show that a decision based on multimedia risk assessment may differ from one based on risk from groundwater alone; in particular, the transfer from groundwater to air imposes a health threat to some degree. Using a methodology that combines Monte Carlo simulation, a rank correlation coefficient, and an explicit decision criterion to identify information important to the decision, the results obtained when uncertainty and variability are separated differ from those obtained without such separation. In particular, when higher percentiles of the uncertainty and variability distributions are considered, the method separating uncertainty and variability identifies TCE concentration as the single most important input parameter, while the method that does not distinguish the two identifies four input parameters as the important information that would influence a decision on risk reduction.
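The rank-correlation screening step mentioned above can be illustrated on a deliberately simple, hypothetical exposure model: Monte Carlo samples of the inputs are drawn, the output is computed, and inputs are ranked by the absolute Spearman correlation with the output. The dose model and input distributions below are invented for illustration, not taken from the study:

```python
import numpy as np

def rank_corr(a, b):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    ra = np.argsort(np.argsort(a))
    rb = np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]

rng = np.random.default_rng(7)
n = 5_000
# Hypothetical inputs to a toy dose = conc * intake / bodyweight model:
conc   = rng.lognormal(mean=0.0, sigma=0.8, size=n)  # uncertain contaminant conc.
intake = rng.lognormal(mean=0.0, sigma=0.3, size=n)  # variable water intake
bw     = rng.normal(70.0, 10.0, size=n)              # variable body weight
dose = conc * intake / bw
# Screen inputs by influence on the output:
influence = {name: abs(rank_corr(x, dose))
             for name, x in [("conc", conc), ("intake", intake), ("bw", bw)]}
```

With these assumed spreads, the concentration input dominates the ranking, mirroring the study's finding that one input (TCE concentration) can stand out once the analysis is framed appropriately. A full two-dimensional analysis would nest a variability loop inside an uncertainty loop rather than pooling them as done here.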

10.
Least squares (LS) techniques, like Kalman filtering, are widely used in environmental science and engineering. In this paper, a new general approach is introduced for the study of the generation, propagation and accumulation of quantization error in any algorithm. The methodology employs a number of fundamental propositions demonstrating how the four arithmetic operations (addition, multiplication, division and subtraction) influence quantization error generation and transmission. Using these, one can determine the exact number of erroneous digits with which every quantity of an algorithm is computed at each step. This methodology offers insight into the actual cause of the generation and propagation of finite-precision error in any computational scheme. Applying the approach to all Kalman-type LS algorithms shows that not all of their formulas are equivalent with respect to quantization error effects: a few of them generate the greater part of the quantization error. Finally, a stabilization procedure applicable to all Kalman-type algorithms is introduced that renders them very robust.
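The paper's proposition-based error tracking is specific to Kalman-type algorithms, but the basic phenomenon it studies, rounding error accumulating across repeated arithmetic operations, and one generic stabilization idea can be demonstrated with compensated (Kahan) summation. This is a textbook illustration, not the stabilization procedure of the paper:

```python
import numpy as np

def kahan_sum(values):
    """Compensated (Kahan) summation: carries a running correction term
    so rounding error does not accumulate across additions."""
    s = 0.0
    c = 0.0
    for v in values:
        y = v - c
        t = s + y
        c = (t - s) - y   # recovers the part of y lost in s + y
        s = t
    return s

# Many small increments accumulated in single precision drift noticeably:
vals = np.full(100_000, 0.0001, dtype=np.float32)
naive = np.float32(0.0)
for v in vals:
    naive += v            # plain float32 accumulation
compensated = kahan_sum(vals.tolist())
exact = 10.0
```

The compensated sum stays within the representation error of the inputs, while the naive single-precision loop drifts, a small-scale analogue of the formula-dependent error growth the abstract describes for Kalman-type recursions.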

11.
Although strict legislation regarding vehicle emissions in Europe (EURO 4, EURO 5) will lead to a remarkable reduction of emissions in the near future, traffic-related air pollution can still be problematic due to large increases of traffic in certain areas. Many dispersion models for line sources, in most cases based on the Gaussian equation, have been developed to assess the impact of traffic on air pollution levels near roads. Previous studies gave evidence that such models tend to overestimate concentrations in low wind speed conditions or when the wind direction is almost parallel to the street orientation. This is of particular interest, since such conditions generally lead to the highest observed concentrations in the vicinity of streets. As many air quality directives impose limits on high percentiles of concentrations, it is important to have good estimates of these quantities in environmental assessment studies. The objective of this study is to evaluate a methodology for computing precisely those high percentiles required by, for example, the EU daughter directive 99/30/EC (for instance the 99.8 percentile for NO2). The model used in this investigation is a Markov chain-Monte Carlo model for predicting pollutant concentrations, which, as shown here, performs well in low wind conditions. While usual Lagrangian models use deterministic time steps for the calculation of the turbulent velocities, the model presented here uses random time steps from a Monte Carlo simulation and a Markov chain simulation for the sequence of turbulent velocities. This results in a physically better approach to modelling dispersion in low wind speed conditions. When Lagrangian dispersion models are used for regulatory purposes, a meteorological pre-processor is necessary to obtain required input quantities, such as the Monin-Obukhov length and friction velocity, from routinely observed data.
The model and the meteorological pre-processor applied here were tested against field data taken near a major motorway south of Vienna. The methodology is based on input parameters that are also available in usual environmental assessment studies. Results reveal that the approach examined is useful and leads to reasonable concentration levels near motorways compared to observations. We wish to thank Andreas Schopper (Styrian Government) for providing air quality values, M. Kalina for providing the raw data of the air quality stations near the motorway, and J. Kukkonen for providing the road-site data set from the Finnish Meteorological Institute (FMI). The study was partly funded by the Austrian science fund under project P14075-TEC.

12.
Searching for a strange attractor in wastewater flow
Chaos presents a complex and irregular world, in contrast with the simple and regular nature of linear systems. Scientists and engineers have invoked low-dimensional chaos to understand the behaviour of real systems. In this study, the complex behavior of a daily wastewater flow series is investigated for evidence of deterministic nonlinear dynamics. The analysis involves both a metric approach, the correlation dimension, and a topological technique, the close returns plot. The procedure for estimating the delay time and the delay time window is reviewed using the C–C method for state space reconstruction, and both parameters are then used in estimating the correlation dimension. The daily wastewater flow shows no evidence of chaotic dynamics, which implies that stochastic models, rather than deterministic chaos, may be more appropriate for representing the investigated series.
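The correlation-dimension side of such an analysis rests on the Grassberger-Procaccia correlation sum over a delay embedding: the fraction of embedded state-space point pairs closer than a radius r, whose log-log slope versus r estimates the correlation dimension. A minimal sketch, with illustrative embedding parameters and a synthetic noise series rather than real flow data:

```python
import numpy as np

def correlation_sum(series, m, tau, r):
    """Grassberger-Procaccia correlation sum C(r) for a delay embedding
    of dimension m and delay tau: the fraction of embedded point pairs
    closer than r under the maximum norm."""
    x = np.asarray(series, float)
    n = len(x) - (m - 1) * tau
    # Delay-embedded trajectory: rows are m-dimensional state vectors
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(m)])
    d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
    iu = np.triu_indices(n, k=1)          # distinct pairs only
    return np.mean(d[iu] < r)

rng = np.random.default_rng(3)
noise = rng.standard_normal(1_000)        # stand-in for an observed series
c_small = correlation_sum(noise, m=3, tau=1, r=0.5)
c_large = correlation_sum(noise, m=3, tau=1, r=2.0)
```

For a stochastic series like this one, the estimated slope keeps growing as the embedding dimension m increases instead of saturating at a low value, which is the signature (absence of low-dimensional chaos) the abstract reports for the wastewater flow.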

13.
We illustrate a method of global sensitivity analysis and test it on a preliminary case study in the field of environmental assessment, to quantify uncertainty importance in poorly known model parameters and spatially referenced input data. The focus of the paper is to show how the methodology provides guidance to improve the quality of environmental assessment practices and decision support systems employed in environmental policy. Global sensitivity analysis, coupled with uncertainty analysis, is a tool to assess the robustness of decisions and to understand whether the current state of knowledge on input data and parametric uncertainties is sufficient to enable a decision to be taken. The methodology is applied to a preliminary case study based on a numerical model that employs GIS-based soil data and expert consultation to evaluate an index that joins environmental and economic aspects of land depletion. The index is used as a yardstick by decision-makers involved in the planning of highways to identify the route that minimises the overall impact.

14.
To preserve biodiversity over centuries, ecosystem management will need to be accepted and practiced by individuals from a broad spectrum of society's strata. Management decisions will also need to be based on reliable judgments of the cause-and-effect relationships that govern an ecosystem's dynamics. This article describes an extant, web-based ecosystem management system (EMS) that allows (a) wide participation in ecosystem assessment and policy impact predictions, (b) convenient construction of probabilistic models of ecosystem processes through an influence diagram, and (c) automatic creation of ecosystem assessment reports. For illustration, the system is used first to model the cheetah population in Kenya, and then to assess the impact of different management options on this population. The influence diagram used herein extends standard influence diagram theory to allow representation of variables governed by stochastic differential equations, birth-death processes, and other non-Gaussian, continuous probability distributions. For many ecosystems, data sets on ecosystem health indicators can be incomplete, small, and subject to unknown measurement errors. Some knowledge of an ecosystem's dynamics, however, may exist in the form of expert opinion derived from ecological theory. The proposed EMS uses a non-Bayesian parameter estimation method, called consistency analysis, which finds parameter estimates such that the fitted ecosystem model is as faithful as possible to both the available data and the collected body of expert opinion. For illustration, consistency analysis is used to estimate the cheetah viability influence diagram using all known cheetah surveys in Kenya plus current understanding of factors, such as habitat and prey availability, that affect cheetah population dynamics.

15.
The data analyzed in this paper are part of the results described in Bueno et al. (2000). Three cytogenetic endpoints were analyzed in three populations of a species of wild rodent, Akodon montensis, living in an industrial, an agricultural, and a preservation area in the Itajaí Valley, State of Santa Catarina, Brazil. The polychromatic/normochromatic ratio, the mitotic index, and the frequency of micronucleated polychromatic erythrocytes were used in an attempt to establish a genotoxic profile of each area. It was assumed that the three populations were in the same condition with respect to confounding factors such as animal age, health, nutritional status, presence of pathogens, and intra- and inter-populational genetic variability. Therefore, any differences found in the endpoints analyzed could be attributed to the external agents present in each area. The statistical models used in this paper are mixtures of negative-binomial and Poisson variables, with the Poisson variables used as approximations of binomials for rare events and beta densities as the mixing distributions. The statistical analyses are carried out from a Bayesian perspective, as opposed to the frequentist analyses often considered in the literature, for instance in Bueno et al. (2000).

16.
A desktop image processing and photogrammetric method was developed for digitizing black-and-white aerial photographs. The technique was applied to airborne optical images of Mt. Pelée, Martinique, a historically active volcano in the tropical Lesser Antilles island arc, to evaluate its utility for rapid geologic mapping and hazard assessment in vegetated areas. The digital approach provides several advantages over traditional air-photo interpretation by allowing for change detection in time-series images, morphologic characterization, development of digital elevation models from stereopairs, and geo-referencing with other digital data sets. A digital mosaic of Mt. Pelée was created from air photos acquired in 1951, which covered the region affected by the 1902 eruption. Severe mismatches occurred along the edges of adjacent photographs prior to correction, precluding quantitative morphologic analysis of the volcanic edifice. Geometric corrections and histogram equalization of the digitized air photos allowed creation of a continuous mosaic. Comparison of the mosaic, and of a map based on differences in gray scale and texture, with a volcanostratigraphic map revealed that not only were the various deposits produced during the 1902 event easily differentiated, but older eruptive products were also identified, suggesting that this approach may be used for rapid hazard evaluation of historically active tropical volcanoes. Received: 22 January 1996 / Accepted: 26 July 1996

17.