Similar Articles
20 similar articles retrieved.
1.
ABSTRACT: The design of monitoring programs often serves as one of the major sources of error or uncertainty in water quality data. Properly designed programs should minimize uncertainty or at least provide a means by which variability can be partitioned into recognizable components. While the design of sampling programs has received recent attention, commonly employed strategies for limnological sampling of lakes may not be completely appropriate for many reservoirs. Based on NES data, reservoirs are generally larger, deeper, and morphologically more complex than natural lakes. Reservoirs also receive a majority of their inflow from a single tributary located a considerable distance from the point of outflow. The result is the establishment of marked physical, biological, and chemical gradients from headwater to dam. The existence of horizontal as well as vertical gradients, and their importance in water quality sampling design, were the subject of intensive transect sampling efforts at DeGray Lake, a U.S. Army Corps of Engineers reservoir in southern Arkansas. Data collected were used to partition variance, identify areas of similarity, and demonstrate how an equitable sampling program might be designed.

2.
ABSTRACT: In the last 30 years, the Natural Resources Conservation Service's (NRCS) TR-55 and TR-20 models have seen a dramatic increase in use for stormwater management purposes. This paper reviews some of the data that were originally used to develop these models and tests how well the models estimate annual series peak runoff rates for the same watersheds using longer historical data record lengths. The paper also explores differences between TR-55 and TR-20 peak runoff rate estimates and time of concentration methods. For 25 of the 37 watersheds tested, the models over- or under-predicted the actual historical watershed runoff rates by more than 30 percent. The results of this study indicate that these NRCS models should not be used to model small wooded watersheds less than 20 acres, especially if the watershed consists of an area without a clearly defined outlet channel. This study also supports the need for regulators to allow educated hydrologists to alter pre-packaged model parameters or results more easily than is currently permitted.
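For orientation, a minimal sketch of the SCS curve number runoff-depth equation at the core of TR-55 (the curve number and storm depth here are hypothetical; the peak-rate charts and time-of-concentration procedures the paper examines are not reproduced):

import math

def scs_runoff_depth(P, CN):
    """SCS curve number runoff depth (inches) for storm rainfall P (inches)."""
    S = 1000.0 / CN - 10.0     # potential maximum retention after runoff begins
    Ia = 0.2 * S               # initial abstraction (TR-55 default ratio)
    if P <= Ia:
        return 0.0
    return (P - Ia) ** 2 / (P - Ia + S)

print(scs_runoff_depth(3.0, 75))   # ~0.96 in of runoff from a 3 in storm on CN 75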

3.
ABSTRACT: Various temporal sampling strategies are used to monitor water quality in small streams. To determine how various strategies influence the estimated water quality, frequently collected water quality data from eight small streams (14 to 110 km2) in Wisconsin were systematically subsampled to simulate typically used strategies. These subsets of data were then used to estimate mean, median, and maximum concentrations and, with continuous daily flows, to estimate annual loads (using the regression method) and volumetrically weighted mean concentrations. For each strategy, accuracy and precision in each summary statistic were evaluated by comparison with concentrations and loads of total phosphorus and suspended sediment estimated from all available data. The most effective sampling strategy depends on the statistic of interest and the study duration. For mean and median concentrations, the most frequent fixed-period sampling that is economically feasible is best. For maximum concentrations, any strategy with samples at or prior to peak flow is best. The best sampling strategy to estimate loads depends on the study duration. For one-year studies, fixed-period monthly sampling supplemented with storm chasing was best, even though loads were overestimated by 25 to 50 percent. For two- to three-year load studies and for estimating volumetrically weighted mean concentrations, fixed-period semimonthly sampling was best.
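The subsampling idea can be illustrated with a short sketch: synthetic daily data stand in for the Wisconsin records, and a fixed-period schedule is compared against the full record (the paper's regression load method is not reproduced here):

import numpy as np

rng = np.random.default_rng(0)
days = 3 * 365
flow = np.exp(rng.normal(3.0, 0.8, days))                 # synthetic daily flow
conc = 0.05 * flow ** 0.4 * rng.lognormal(0, 0.3, days)   # synthetic daily TP, mg/L

def subsample_stats(idx):
    """Summary statistics a monitoring program sampling on days 'idx' would report."""
    c = conc[idx]
    return {"mean": c.mean(), "median": np.median(c), "max": c.max()}

true_stats = subsample_stats(np.arange(days))             # "truth" from all available data
monthly = subsample_stats(np.arange(15, days, 30))        # fixed-period ~monthly sampling
print(true_stats, monthly)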

4.
ABSTRACT: A general framework is proposed for using precipitation estimates from NEXRAD weather radars in raingage network design. NEXRAD precipitation products are used to represent space-time rainfall fields, which can be sampled by hypothetical raingage networks. A stochastic model is used to simulate gage observations based on the areal average precipitation for radar grid cells. The stochastic model accounts for subgrid variability of precipitation within the cell and for gage measurement errors. The approach is ideally suited to raingage network design in regions with strong climatic variations in rainfall, where conventional methods are sometimes lacking. A case study involving the estimation of areal average precipitation for catchments in the Catskill Mountains illustrates the approach. The case study shows how the simulation approach can be used to quantify the effects of gage density, basin size, spatial variation of precipitation, and gage measurement error on network estimates of areal average precipitation. Although the quality of NEXRAD precipitation products imposes limitations on their use in network design, weather radars can provide valuable information for empirical assessment of raingage network estimation errors. Still, the biggest challenge in quantifying estimation errors is understanding subgrid spatial variability. The results from the case study show that the spatial correlation of precipitation at subgrid scales (4 km and less) is difficult to quantify, especially for short sampling durations. Network estimation errors for hourly precipitation are extremely sensitive to the uncertainty in subgrid spatial variability, although errors for storm total accumulation are much less sensitive.
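A minimal sketch of the stochastic gage-simulation step, assuming multiplicative lognormal factors for subgrid variability and gage measurement error (both coefficients of variation are illustrative assumptions, not the paper's fitted values):

import numpy as np

rng = np.random.default_rng(1)

def simulate_gages(cell_mean_mm, n_gages, cv_subgrid=0.3, cv_meas=0.05):
    """Simulate gage catches within one radar cell whose areal mean is known.
    Subgrid variability and gage error are unit-mean lognormal factors."""
    def ln_factor(cv, n):
        sigma2 = np.log(1 + cv ** 2)
        return rng.lognormal(-sigma2 / 2, np.sqrt(sigma2), n)   # mean-1 lognormal
    return cell_mean_mm * ln_factor(cv_subgrid, n_gages) * ln_factor(cv_meas, n_gages)

obs = simulate_gages(10.0, n_gages=5)
print(obs, obs.mean())   # network estimate of the 10 mm areal average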

5.
ABSTRACT: The ability to predict extreme floods is an important part of the planning process for any water project for which failure will be very costly. The length of a gage record available for use in estimating extreme flows is generally much shorter than the recurrence interval of the desired flows, resulting in estimates having a high degree of uncertainty. Maximum likelihood estimators of the parameters of the three-parameter lognormal (3PLN) distribution, which make use of historical data, are presented. A Monte Carlo study of extreme flows estimated from samples drawn from three hypothetical 3PLN populations showed that inclusion of historical flows with the gage record reduced the bias and variance of extreme flow estimates. Asymptotic theory approximations of parameter variances and covariances, calculated using the second and mixed partial derivatives of the log likelihood function, agreed well with Monte Carlo results. First-order approximations of the standard deviations of the extreme flow estimates did not agree with the Monte Carlo results. An alternative method for calculating those standard deviations, the “asymptotic simulation” method, is described. The standard deviations calculated by asymptotic simulation agree well with the Monte Carlo results.
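A sketch of the 3PLN likelihood fit for a systematic record only; the censored-likelihood terms the paper adds for historical floods are omitted, and the annual peak series is hypothetical:

import numpy as np
from scipy.optimize import minimize

def nll_3pln(params, x):
    """Negative log-likelihood of the three-parameter lognormal:
    X = tau + exp(Y), Y ~ N(mu, sigma^2)."""
    tau, mu, sigma = params
    if sigma <= 0 or np.any(x <= tau):
        return np.inf
    z = np.log(x - tau)
    return np.sum(np.log(x - tau) + np.log(sigma)
                  + 0.5 * ((z - mu) / sigma) ** 2 + 0.5 * np.log(2 * np.pi))

peaks = np.array([1200., 950., 2100., 1500., 800., 3200., 1100., 1750.])  # hypothetical peaks, cfs
fit = minimize(nll_3pln, x0=[0.0, np.log(peaks).mean(), np.log(peaks).std()],
               args=(peaks,), method="Nelder-Mead")
tau, mu, sigma = fit.x
print(tau, mu, sigma)   # a T-year quantile is then tau + exp(mu + z_p * sigma)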

6.
ABSTRACT: Forecasts of 1980 river basin water use presented in the reports of the 1960 Senate Select Committee on National Water Resources and in the Water Resources Council's First National Water Assessment of 1968 were compared to estimates of actual use in 1980 to assess the accuracy of efforts to forecast future water use. Results show that the majority of the forecasts were substantially in error. In general, the First National Assessment forecasts erred by a smaller margin, but tended to repeat the regional patterns of overestimation and underestimation exhibited in the Senate Select Committee forecasts. Moreover, the forecasts from the two efforts that came within 20 percent of actual 1980 withdrawals were, in general, accurate not because of superior prediction but because of offsetting errors in forecast components. This performance leads us to conclude that water use forecasts, regardless of the time frame or the forecast method employed, are likely always to be highly inaccurate. Accordingly, if such forecasting efforts are to be of value in contemporary water resources planning, forecasters should direct their attention toward methods that illuminate the determinants of the demand for water.

7.
ABSTRACT: Methods of calculating uncertainty in estimates of serial correlation coefficients, and of correcting for bias in short and medium length records (fewer than 50 data points), are presented. Uncertainty and bias in the estimation of serial correlation coefficients for ground water quality data are shown to be considerable and to result in inaccurate calculation of the sampling frequencies for monitoring purposes. The methods are applied to a ground water data set consisting of 87 monthly measurements of nitrate concentrations. The variation in serial correlation coefficients with variation of record length is examined. The optimum sampling frequencies for detection of changes in ground water nitrate concentrations are estimated.
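One common first-order bias correction for the lag-1 serial correlation of an AR(1) process (Kendall's approximation; the paper's exact formulas may differ) can be sketched as:

import numpy as np

def lag1_autocorr(x):
    """Sample lag-1 serial correlation coefficient."""
    x = np.asarray(x, dtype=float)
    d = x - x.mean()
    return np.sum(d[:-1] * d[1:]) / np.sum(d ** 2)

def bias_corrected_r1(x):
    """Invert Kendall's approximation E[r1] ~ rho - (1 + 3*rho)/n
    to get rho_hat = (n*r1 + 1)/(n - 3)."""
    n, r1 = len(x), lag1_autocorr(x)
    return (n * r1 + 1.0) / (n - 3.0)

rng = np.random.default_rng(2)
nitrate = 5 + 0.3 * np.cumsum(rng.normal(0, 0.2, 87)) + rng.normal(0, 0.5, 87)  # 87 synthetic monthly values
print(lag1_autocorr(nitrate), bias_corrected_r1(nitrate))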

8.
ABSTRACT: The Food Quality Protection Act of 1996 requires that human exposure to pesticides through drinking water be considered when establishing pesticide tolerances in food. Several systematic and seasonally weighted systematic sampling strategies for estimating pesticide concentrations in surface water were evaluated through Monte Carlo simulation, using intensive datasets from four sites in northwestern Ohio. The number of samples for the strategies ranged from 4 to 120 per year. Sampling strategies with a minimal sampling frequency outside the growing season can be used for estimating time weighted mean and percentile concentrations of pesticides with little loss of accuracy and precision, compared to strategies with the same sampling frequency year round. Less frequent sampling strategies can be used at large sites. A sampling frequency of 10 times monthly during the pesticide runoff period at a 90 km2 basin and four times monthly at a 16,400 km2 basin provided estimates of the time weighted mean, 90th, 95th, and 99th percentile concentrations that fell within 50 percent of the true value virtually all of the time. By taking into account basin size and the periodic nature of pesticide runoff, costs of obtaining estimates of time weighted mean and percentile pesticide concentrations can be minimized.
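A sketch of one building block, the time-weighted mean concentration from irregularly spaced samples; the half-interval weighting convention and the sample values are assumptions:

import numpy as np

def time_weighted_mean(t_days, conc):
    """Time-weighted mean concentration: each sample represents the interval
    reaching halfway to its neighbors (a common convention, assumed here)."""
    t = np.asarray(t_days, float)
    c = np.asarray(conc, float)
    mid = (t[:-1] + t[1:]) / 2
    edges = np.concatenate(([t[0]], mid, [t[-1]]))
    w = np.diff(edges)                        # days represented by each sample
    return np.sum(w * c) / np.sum(w)

t = [10, 40, 55, 70, 100, 160, 280]           # sample days, denser in the runoff season
c = [0.1, 0.4, 2.5, 1.8, 0.6, 0.2, 0.05]      # hypothetical atrazine, ug/L
print(time_weighted_mean(t, c))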

9.
ABSTRACT: High-capacity wells are used as a convenient and economical means of sampling groundwater quality. Although the inherent limitations of using these wells are generally recognized, little has been done to investigate how these wells actually sample groundwater. A semi-analytical particle tracking model is used to illustrate the influence of variable vertical contaminant distributions and aquifer heterogeneity on the composition of water samples from these wells during short pumping periods. The hypothetical pumping well used in the simulations is located in an unconfined, alluvial aquifer with a shallow water table and concentration gradients of nitrate-nitrogen contamination. This is a typical setting for many irrigated areas in the United States. The main conclusions are: (1) high-capacity wells underestimate the average amount of contamination within an aquifer; (2) shapes of concentration-time curves for high-capacity wells appear to be governed by the distribution of the contaminant and travel times to the well; (3) variables such as well construction, pumping rate, and hydrogeologic properties contribute to the magnitude of the concentration-time curves at individual high-capacity wells; and (4) a sampling strategy using concentration-time curves based on the behavioral characteristics of the well rather than individual samples will provide a much better framework for interpreting spatial contaminant distributions.

10.
ABSTRACT: This paper addresses two components of the problem of estimating the magnitude of step trends in surface water quality. The first is finding a robust estimator appropriate to the data characteristics expected in water-quality time series. The Hodges-Lehmann class of estimators is found to be robust in comparison to other nonparametric and moment-based estimators. A seasonal Hodges-Lehmann estimator is developed and shown to have desirable properties. Second, the effectiveness of various sampling strategies is examined using Monte Carlo simulation coupled with application of this estimator. The simulation is based on a large set of total phosphorus data from the Potomac River. To assure that the simulated records have realistic properties, the data are modeled in a multiplicative fashion incorporating flow, hysteresis, seasonal, and noise components. The results demonstrate the importance of balancing the length of the two sampling periods and balancing the number of data values between the two periods. The inefficiency of sampling at frequencies much in excess of 12 samples per year is demonstrated. Rotational sampling designs are discussed, and efficient designs, at least for this river and constituent, are shown to involve more than one year of active sampling at frequencies of about 12 per year.
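The Hodges-Lehmann step-trend estimator itself is compact: the median of all pairwise differences between the two periods. The seasonal variant below is one plausible construction, not necessarily the paper's:

import numpy as np

def hodges_lehmann_step(before, after):
    """Hodges-Lehmann estimate of a step trend: the median of all
    pairwise differences (after_j - before_i)."""
    diffs = np.subtract.outer(np.asarray(after, float), np.asarray(before, float))
    return np.median(diffs)

def seasonal_hodges_lehmann(before, after, season_b, season_a, n_seasons=12):
    """A seasonal variant (an assumption; the paper's exact construction may differ):
    pool pairwise differences within each season, then take the overall median."""
    before, after = np.asarray(before, float), np.asarray(after, float)
    season_b, season_a = np.asarray(season_b), np.asarray(season_a)
    diffs = []
    for s in range(n_seasons):
        b, a = before[season_b == s], after[season_a == s]
        if len(b) and len(a):
            diffs.append(np.subtract.outer(a, b).ravel())
    return np.median(np.concatenate(diffs))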

11.
ABSTRACT: Catch in standard (unshielded) rain gages exposed 3 feet above the land surface was compared with catch in pit (buried) gages exposed 1 inch above the land surface. These tests confirmed that catch in standard gages underestimates point rainfall in forest openings, as well as at conventional weather stations. Pit gages caught significantly (P=0.05) more rain than did standard gages at each of the four locations tested. Catch increases ranged from 2.3 to 3.4 percent.

12.
A spectral formalism was developed and applied to quantify the sampling errors due to spatial and/or temporal gaps in soil moisture measurements. A design filter was developed to compute the sampling errors for discrete measurements in space and time. This filter has the advantage of a general form applicable to various types of sampling design. The lack of temporal measurements of the two-dimensional soil moisture field made it difficult to compute the spectra directly from observed records; therefore, the wave number-frequency spectra of soil moisture data derived from stochastic models of rainfall and soil moisture were used. Parameters for both models were estimated using data from the Southern Great Plains Hydrology Experiment (SGP97) and the Oklahoma Mesonet. The sampling error of the spatial average soil moisture measurement by airborne L-band microwave remote sensing during the SGP97 hydrology experiment is estimated to be 2.4 percent. Under the same climate conditions and soil properties as the SGP97 experiment, equally spaced ground probe networks at intervals of 25 and 50 km are expected to have about 16 percent and 27 percent sampling error, respectively. Satellite designs with temporal gaps of two and three days are expected to have about 6 percent and 9 percent sampling errors, respectively.

13.
ABSTRACT: The precision of width and pool area measurements has rarely been considered in relation to downstream or at-a-station hydraulic geometry, fisheries studies, long-term or along-a-continuum research, or agency monitoring techniques. We assessed this precision and related it to other stream morphologic characteristics. Confidence limits (95 percent) around mean estimates with four transects (cross-sections perpendicular to the channel center-line) ranged from ±0.4 to 1.8 m on streams with a width of only 2.2 m. To avoid autocorrelation, transects should be spaced about three channel widths apart. To avoid stochastic inhomogeneity, reach length should be about 30 channel widths, or ten transects, to optimize sampling efficiency. Precision of width measurements decreased with decreased depth and increased with stream size; both observations reflect variability caused by features such as boulders or coarse woody debris. Pool area precision increased with pool area, reflecting greater precision on flat, wide streams with regular pool-riffle sequences. The least precision occurred on small, steep streams with random pools formed by boulders or coarse woody debris.
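A sketch of the confidence-limit calculation for mean width from a handful of transects (the widths are illustrative, chosen to match the 2.2 m example stream):

import numpy as np
from scipy import stats

def width_ci(widths, conf=0.95):
    """Confidence half-width on mean channel width from transect measurements."""
    w = np.asarray(widths, float)
    n = w.size
    half = stats.t.ppf(0.5 + conf / 2, n - 1) * w.std(ddof=1) / np.sqrt(n)
    return w.mean(), half

mean_w, half = width_ci([2.0, 2.6, 1.8, 2.4])   # four transects on a ~2.2 m stream
print(f"{mean_w:.2f} m +/- {half:.2f} m")
# Spacing rules of thumb from the study: transects ~3 channel widths apart,
# reach length ~30 widths (about ten transects).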

14.
ABSTRACT: Significant parameters for predicting thunderstorm runoff from small semiarid watersheds are determined using data from the Walnut Gulch watershed in southern Arizona. Based on these data, thunderstorm rainfall dominates watershed parameters for predicting runoff from multiple linear regression equations; in some cases antecedent moisture added significantly to the models. A technique is developed for estimating the precision of predicted values from multiple linear regression equations. The technique uses matrix methods to estimate the variance of mean predicted values from a regression equation; the estimated variance of the mean predicted value is then used to estimate the variance of an individual predicted value. A computer program is developed to implement these matrix methods and to form confidence limits on predicted values based on both a normality assumption and the Chebyshev inequality.
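The matrix method described here has a standard form: for a prediction at x0, Var(mean prediction) = s^2 * x0'(X'X)^-1 x0 and Var(individual prediction) = s^2 * (1 + x0'(X'X)^-1 x0). A sketch with synthetic data (the predictors and coefficients are hypothetical):

import numpy as np

def prediction_variances(X, y, x0):
    """Matrix-method variances for a multiple linear regression prediction at x0.
    X and x0 include a leading 1 for the intercept."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    s2 = resid @ resid / (len(y) - X.shape[1])   # residual variance
    var_mean = s2 * x0 @ XtX_inv @ x0            # variance of the mean prediction
    var_indiv = s2 * (1.0 + x0 @ XtX_inv @ x0)   # variance of an individual prediction
    return x0 @ beta, var_mean, var_indiv

rng = np.random.default_rng(3)
rain = rng.uniform(5, 60, 30)                    # storm rainfall, mm (hypothetical)
amc = rng.uniform(0, 1, 30)                      # antecedent moisture index (hypothetical)
X = np.column_stack([np.ones(30), rain, amc])
y = 0.3 * rain + 2.0 * amc + rng.normal(0, 1.5, 30)   # synthetic runoff, mm
print(prediction_variances(X, y, np.array([1.0, 25.0, 0.5])))
# Chebyshev limits hold for any distribution: P(|Y - yhat| >= k*sd) <= 1/k**2,
# so k = sqrt(1/0.05) ~ 4.47 gives at least 95 percent limits without normality.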

15.
ABSTRACT: Programs of monthly or annual stream water sampling will rarely observe the episodic extremes of acidification chemistry that occur during brief, unpredictable runoff events. When viewed in the context of data from several streams, however, baseflow measurements of variables such as acid neutralizing capacity, pH, and NO3- are likely to be highly correlated with the episodic extremes of those variables from the same stream and runoff season. We illustrate these correlations for a water chemistry record, nearly two years in length, obtained from intensive sampling of 13 small Northeastern U.S. streams studied during USEPA's Episodic Response Project. For these streams, simple regression models estimate episodic extremes of acid neutralizing capacity, pH, NO3-, Ca2+, SO42-, and total dissolved Al with good relative accuracy from statistics of monthly or annual index samples. Model performances remain generally stable when episodic extremes in the second year of sampling are predicted from first-year models. Monthly or annual sampling designs, in conjunction with simple empirical models calibrated and maintained through intensive sampling every few years, may estimate episodic extremes of acidification chemistry with economy and reasonable accuracy. Such designs would facilitate sampling a large number of streams, thereby yielding estimates of the prevalence of episodic acidification at regional scales.

16.
ABSTRACT: Local governmental agencies responsible for decisions in ground water quality management need not only data on ground water quality but also an understanding of the accuracies and risks associated with those data as related to the number of wells to sample. In this report we address this problem by combining the philosophical doctrines of probabilism and relativism with simple statistical procedures. This requires a reasonable estimate of the population variance in a quality parameter for a given management-unit area, and requires that the decisionmaker formulate constraints with an acceptable standard error of the sample mean and be willing to accept some level of probability of being wrong. The technique is illustrated using a 21-year data base of well water chemical data from a 653 km2 ground water quality study area in the San Joaquin Valley of California.
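Under a normal approximation, the well-count question reduces to the textbook sample-size formula n = (z*sigma/E)^2; the sketch below uses hypothetical numbers and may differ in detail from the report's procedure:

import math
from scipy import stats

def wells_needed(sigma, se_target, conf=0.90):
    """Number of wells so the margin of error on the mean meets se_target
    at the stated confidence, given an estimate sigma of population variability."""
    z = stats.norm.ppf(0.5 + conf / 2)
    return math.ceil((z * sigma / se_target) ** 2)

print(wells_needed(sigma=120.0, se_target=25.0))  # hypothetical TDS sd (mg/L) and tolerance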

17.
ABSTRACT: Knowledge of coliform transport and disappearance may provide information for project design and operation that minimizes potential water quality problems such as the violation of body contact recreation standards. Storm events were sampled in the Caddo River above DeGray Reservoir, Arkansas, and then tracked through the reservoir using the increased turbidity associated with the storm flows. Fecal coliforms were sampled both in the river and throughout the water column in the reservoir. In general, increased fecal coliform concentrations were closely associated with the increased turbidity resulting from the storm flows. This association existed for all three types of turbidity plume movement - overflow, interflow, and underflow. As the turbidity plume moved down the reservoir, fecal coliform concentrations decreased due to die-off, settling, and dilution. With several assumptions, it is possible to use this information to assist in locating recreational sites in a reservoir or to anticipate possible body contact standard violations at existing recreation sites.

18.
Summer lake survey measurements of total phosphorus (TP) and chlorophyll a (CHLa) from 188 reservoirs and natural lakes in the Midwest were analyzed to determine the magnitude of major sources of variability. Median variance among replicate samples collected at the same location and time was about 7-8 percent of the mean for both TP and CHLa. Median observed temporal variability within summers was 27 percent of the mean for TP and 45 percent of the mean for CHLa. Median values of year-to-year variance in average TP and CHLa were 22 percent and 31 percent of the mean, respectively. A range of approximately two orders of magnitude was observed among individual estimates of variance in each of these categories. The magnitude of observed temporal variability was affected only slightly by variance among replicate samples on individual days and was weakly correlated with the length of time during which samples were collected from individual lakes. Observed temporal variation was similar between reservoirs and natural lakes when variances were calculated with log-transformed data. The magnitude of temporal and year-to-year variance can severely limit the power of statistical comparisons of TP and CHLa means, but has less effect on establishing relative rankings of lake means. Sources and relative magnitudes of variability are important in the use of TP and CHLa data in regression models and in the planning of lake surveys and subsequent data analysis.
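Interpreting "variance ... percent of the mean" as a coefficient of variation (an assumption), the three variability components can be tabulated per lake as follows, with hypothetical TP values:

import numpy as np

def cv_percent(values):
    """Variability expressed as standard deviation in percent of the mean."""
    v = np.asarray(values, float)
    return 100.0 * v.std(ddof=1) / v.mean()

# Hypothetical TP (ug/L) for one lake
replicates = [31, 33, 29]                 # same station, same day
within_summer = [30, 44, 25, 38, 52]      # one summer's sampling dates
yearly_means = [35, 28, 46, 39]           # summer means across years
for label, v in [("replicate", replicates),
                 ("within-summer", within_summer),
                 ("year-to-year", yearly_means)]:
    print(label, round(cv_percent(v)), "percent of mean")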

19.
A chance-constrained linear programming model, which utilizes multiple linear decision rules and is useful for river basin planning, is used to evaluate the effects of risk and reliability on optimal reservoir design. Streamflow forecasts or predictions can be explicitly included in the linear program. The risk associated with the predictions is included in the model through the use of cumulative distribution functions (CDF) of streamflows which are conditioned on the predictions. A multiple-purpose reservoir on the Gunpowder River in Maryland is used to illustrate the effectiveness of the model. In order to provide the decision makers with complete and useful information, trade-off curves relating minimum reservoir capacity (a surrogate for dam costs), water supply and flood control targets, and the reliability of achieving the targets are developed. The trade-off curves may enhance the decision maker's ability to select the best dam capacity, considering technological and financial constraints as well as the trade-offs between targets, risks, and costs.
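The quantile-substitution idea behind chance constraints can be shown in a deliberately simplified single-period form (the paper uses multi-period linear decision rules; the targets, reliability, and inflow distribution below are hypothetical):

from scipy import stats

inflow = stats.lognorm(s=0.5, scale=80.0)   # inflow CDF conditioned on a forecast (Mm3)

T_supply = 60.0     # water supply target, Mm3
C_channel = 150.0   # safe downstream channel capacity, Mm3
alpha = 0.95        # required reliability for each target

# P(inflow + storage >= T_supply) >= alpha  ->  storage >= T_supply - F^-1(1 - alpha)
k_supply = max(0.0, T_supply - inflow.ppf(1 - alpha))
# P(inflow - flood space <= C_channel) >= alpha  ->  flood space >= F^-1(alpha) - C_channel
k_flood = max(0.0, inflow.ppf(alpha) - C_channel)

print("minimum capacity:", k_supply + k_flood)   # supply pool plus flood pool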

20.
ABSTRACT: A synthetic storm rainfall hyetograph for a one-year design frequency is derived from the one-year intensity-duration curve developed for Cincinnati, Ohio. Detailed rainfall data for a three-year period were collected from three raingages triangulating the Bloody Run Sewer Watershed, an urban drainage area of 2,380 acres in Cincinnati, Ohio. The advancement of the synthetic storm pattern is obtained from an analysis of the antecedent precipitation immediately preceding the maximum period of three selected durations. Only rains that produced excessive runoff for at least some duration were considered. The same approach can be used for other design frequencies. The purpose of this study is to provide synthetic storm hyetographs to be used as input in deterministic mathematical models simulating urban storm water runoff for the design, analysis, and possible surcharge prediction of sewer systems.
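The paper positions the storm peak using antecedent-rainfall analysis; the sketch below instead uses the standard alternating block method on an assumed intensity-duration curve i = a/(t + b), so it illustrates the general construction rather than the paper's exact procedure:

import numpy as np

def alternating_block_hyetograph(a, b, total_dur_min, dt_min):
    """Synthetic hyetograph from an intensity-duration curve i(t) = a/(t + b)
    (i in in/hr, t in minutes) via the alternating block method."""
    t = np.arange(dt_min, total_dur_min + dt_min, dt_min)
    depths = (a / (t + b)) * (t / 60.0)                   # cumulative depth (in) per duration
    blocks = np.sort(np.diff(depths, prepend=0.0))[::-1]  # incremental depths, largest first
    hyet = np.empty_like(blocks)
    mid = len(blocks) // 2
    hyet[mid] = blocks[0]                                 # largest block at mid-storm
    for k in range(1, len(blocks)):
        offset = (k + 1) // 2 * (-1 if k % 2 else 1)      # alternate left, right, left, ...
        hyet[mid + offset] = blocks[k]
    return hyet

print(alternating_block_hyetograph(a=120.0, b=20.0, total_dur_min=60, dt_min=10))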
