Similar Documents
Found 20 similar documents (search time: 15 ms)
1.
Empirical tsunami fragility curves are developed based on a Bayesian framework by accounting for uncertainty of input tsunami hazard data in a systematic and comprehensive manner. Three fragility modeling approaches, i.e. lognormal method, binomial logistic method, and multinomial logistic method, are considered, and are applied to extensive tsunami damage data for the 2011 Tohoku earthquake. A unique aspect of this study is that uncertainty of tsunami inundation data (i.e. input hazard data in fragility modeling) is quantified by comparing two tsunami inundation/run-up datasets (one by the Ministry of Land, Infrastructure, and Transportation of the Japanese Government and the other by the Tohoku Tsunami Joint Survey group) and is then propagated through Bayesian statistical methods to assess the effects on the tsunami fragility models. The systematic implementation of the data and methods facilitates the quantitative comparison of tsunami fragility models under different assumptions. Such comparison shows that the binomial logistic method with un-binned data is preferred among the considered models; nevertheless, further investigations related to multinomial logistic regression with un-binned data are required. Finally, the developed tsunami fragility functions are integrated with building damage-loss models to investigate the influences of different tsunami fragility curves on tsunami loss estimation. Numerical results indicate that the uncertainty of input tsunami data is not negligible (coefficient of variation of 0.25) and that neglecting the input data uncertainty leads to overestimation of the model uncertainty.
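The lognormal method mentioned above models damage probability as P(damage | h) = Φ((ln h − μ)/σ) and fits (μ, σ) by maximum likelihood on binary damage outcomes. A minimal sketch, with fully synthetic inundation depths and outcomes (none of the numbers come from the study):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Synthetic example: inundation depths (m) and binary collapse outcomes,
# generated from an assumed "true" fragility curve (median 2.0 m, beta 0.5).
h = rng.lognormal(mean=0.5, sigma=0.8, size=400)
true_p = norm.cdf((np.log(h) - np.log(2.0)) / 0.5)
y = rng.random(400) < true_p

def neg_log_lik(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)          # keep sigma positive
    p = norm.cdf((np.log(h) - mu) / sigma)
    p = np.clip(p, 1e-9, 1 - 1e-9)     # guard log(0)
    return -np.sum(y * np.log(p) + (~y) * np.log(1 - p))

res = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
median_depth = np.exp(res.x[0])        # depth at 50% damage probability
beta = np.exp(res.x[1])                # lognormal standard deviation
```

The binomial logistic alternative in the abstract would replace the probit link `norm.cdf` with a logistic function of ln h; the Bayesian treatment would place priors on (μ, σ) instead of maximizing.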

2.
International Journal of Sediment Research, 2022, 37(5): 601-618
Landslides are considered as one among many phenomena jeopardizing human beings as well as their constructions. To prevent this disastrous problem, researchers have used several approaches for landslide susceptibility modeling, for the purpose of preparing accurate maps marking landslide prone areas. Among the most frequently used approaches for landslide susceptibility mapping is the Artificial Neural Network (ANN) method. However, the effectiveness of ANN methods could be enhanced by using hybrid metaheuristic algorithms, which are scarcely applied in landslide mapping. In the current study, nine hybrid metaheuristic algorithms, genetic algorithm (GA)-ANN, evolutionary strategy (ES)-ANN, ant colony optimization (ACO)-ANN, particle swarm optimization (PSO)-ANN, biogeography based optimization (BBO)-ANN, gravitational search algorithm (GSA)-ANN, particle swarm optimization and gravitational search algorithm (PSOGSA)-ANN, grey wolves optimization (GWO)-ANN, and probability based incremental learning (PBIL)-ANN have been used to spatially predict landslide susceptibility in Algiers’ Sahel, Algeria. The modeling phase was done using a database of 78 landslides collected utilizing Google Earth images, field surveys, and six conditioning factors (lithology, elevation, slope, land cover, distance to stream, and distance to road). Initially, a gamma test was used to decrease the input variable numbers. Furthermore, the optimal inputs have been modeled by the mean of hybrid metaheuristic ANN techniques and their performance was assessed through seven statistical indicators. The comparative study proves the effectiveness of the co-evolutionary PSOGSA-ANN model, which yielded higher performance in predicting landslide susceptibility compared to the other models.
Sensitivity analysis using the step-by-step technique was done afterward, which revealed that the distance to the stream is the most influential factor on landslide susceptibility, followed by the slope factor which ranked second. Lithology and the distance to road have demonstrated a moderate effect on landslide susceptibility. Based on these findings, an accurate map has been designed to help land-use managers and decision-makers to mitigate landslide hazards.
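All nine hybrids above wrap a population-based metaheuristic around the training of ANN weights. A minimal, generic particle swarm optimization (PSO) loop, the core of the PSO and PSOGSA variants, is sketched below; it minimizes a stand-in quadratic objective rather than an ANN loss, and all coefficients and settings are common defaults, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

def objective(x):
    # Stand-in objective; in the study this would be the ANN's prediction error.
    return np.sum(x**2, axis=-1)

n_particles, dim, iters = 30, 5, 200
pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()                       # each particle's best position
pbest_val = objective(pos)
gbest = pbest[np.argmin(pbest_val)].copy()  # swarm-wide best position

w, c1, c2 = 0.72, 1.49, 1.49             # commonly used inertia/attraction weights
for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = objective(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

best_value = objective(gbest)
```

Replacing `objective` with the misclassification loss of an ANN over the six conditioning factors, with `dim` equal to the number of network weights, gives the PSO-ANN scheme in spirit.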

3.
The aim of this study was to apply, verify and compare a multiple logistic regression model for landslide susceptibility analysis in three Korean study areas using a geographic information system (GIS). Landslide locations were identified by interpreting aerial photographs, satellite images and a field survey. Maps of the topography, soil type, forest cover, lineaments and land cover were constructed from the spatial data sets. The 14 factors that influence landslide occurrence were extracted from the database and the logistic regression coefficient of each factor was computed. Landslide susceptibility maps were drawn for these three areas using logistic regression coefficients derived not only from the data for that area but also using those for each of the other two areas (nine maps in all) as a cross-check of method validity. For verification, the results of the analyses were compared with actual landslide locations. Among the nine cases, the Janghung exercise using the logistic formula and the coefficient for Janghung had the greatest accuracy (88.44%), whereas Janghung results, when considered by the logistic formula and the coefficient for Boeun, had the least accuracy (74.16%). Copyright © 2007 John Wiley & Sons, Ltd.
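The cross-area validation described above amounts to fitting logistic regression coefficients on one area's data and scoring the accuracy both on that area and on another. A small sketch with synthetic stand-in data (factor values, coefficients, and sample sizes are all invented, not the study's):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for two study areas: rows are map cells, columns are
# hypothetical conditioning factors (e.g. slope, soil type, land cover codes).
def make_area(n, w_true):
    X = rng.normal(size=(n, w_true.size - 1))
    Xb = np.hstack([np.ones((n, 1)), X])            # intercept column
    p = 1 / (1 + np.exp(-Xb @ w_true))
    return Xb, (rng.random(n) < p).astype(float)

w_true = np.array([-0.5, 2.0, -1.5, 1.0])           # assumed "true" coefficients
X_a, y_a = make_area(2000, w_true)
X_b, y_b = make_area(2000, w_true)

def fit_logistic(X, y, lr=0.1, steps=2000):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)            # gradient of the log-loss
    return w

w_a = fit_logistic(X_a, y_a)

def accuracy(X, y, w):
    return np.mean(((X @ w) > 0) == (y == 1))

acc_same = accuracy(X_a, y_a, w_a)    # coefficients applied to their own area
acc_cross = accuracy(X_b, y_b, w_a)   # cross-check on the other area
```

The paper's nine maps correspond to evaluating all area/coefficient pairings of three areas in exactly this way.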

4.
To rationally calculate bearing stiffness for bridges in mountainous regions, where pier heights differ, the rotational restraint imposed by the superstructure on the pier tops is taken into account: it is proposed that the pier top may be treated as free in the transverse direction and as a guided (rotation-restrained) support in the longitudinal direction. Design formulas for the longitudinal and transverse bearing stiffness are derived from the principles that, under seismic action, the shear forces and the bending moments at the base of each pier are respectively equal, and a design method for the bearings of each pier is given. To verify the correctness of the method, the equal-base-shear principle is taken as an example, using...

5.
6.
7.
8.
Recently, alternative models to estimate the age of diagenetically altered fossil reef corals have been presented based on either redistribution of U or its immediate daughters 234Th and 230Th. Here, we present three methods to estimate the uncertainty of ages derived using an amended version of our coral isochron method [Scholz et al., 2004. U-series dating of diagenetically altered fossil reef corals. Earth and Planetary Science Letters 218, 163–178], which is based on addition/loss of U. The obtained uncertainties are substantially larger than those previously published and should, in general, be more reliable. The isochron method yields larger uncertainties than alternative models based on Th redistribution due to α-recoil processes. However, comparison of model open-system ages based on such redistribution of U-series daughters for different sub-samples from an individual coral specimen shows that the smaller errors derived with these models cannot account for the observed variability. We recognise that none of the available models is applicable to all corals, probably reflecting different diagenetic processes even in different sub-samples from one coral specimen. To better understand the diagenetic processes and precisely constrain the uncertainties of the ages derived from diagenetically altered corals, the application of all available models is recommended.
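For context (this equation is standard U-series geochronology, not taken from the abstract), the closed-system 230Th/U age equation that all such open-system models amend is, in activity ratios:

```latex
\left[\frac{^{230}\mathrm{Th}}{^{238}\mathrm{U}}\right]
= 1 - e^{-\lambda_{230} t}
+ \frac{\delta^{234}\mathrm{U}_m}{1000}
  \cdot \frac{\lambda_{230}}{\lambda_{230}-\lambda_{234}}
  \left(1 - e^{-(\lambda_{230}-\lambda_{234})\,t}\right)
```

where t is the age, λ230 and λ234 are the decay constants of 230Th and 234U, and δ234U_m is the measured 234U excess. Diagenetic addition or loss of U violates the closed-system assumption behind this equation, which is what the isochron and recoil-redistribution models attempt to correct.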

9.
Abstract

Recent work pertaining to estimating error and accuracies in geomagnetic field modeling is reviewed from a unified viewpoint and illustrated with examples. The formulation of a finite dimensional approximation to the underlying infinite dimensional problem is developed. Central to the formulation is an inner product and norm in the solution space through which a priori information can be brought to bear on the problem. Such information is crucial to estimation of the effects of higher degree fields at the Core-Mantle boundary (CMB) because the behavior of higher degree fields is masked in our measurements by the presence of the field from the Earth's crust. Contributions to the errors in predicting geophysical quantities based on the approximate model are separated into three categories: (1) the usual error from the measurement noise; (2) the error from unmodeled fields, i.e. from sources in the crust, ionosphere, etc.; and (3) the error from truncating to a finite dimensioned solution and prediction space. The combination of the first two is termed low degree error while the third is referred to as truncation error.

The error analysis problem consists of “characterizing” the difference δz = z − ẑ, where z is some quantity depending on the magnetic field and ẑ is the estimate of z resulting from our model. Two approaches are discussed. The method of Confidence Set Inference (CSI) seeks to find an upper bound for |z − ẑ|. Statistical methods, i.e. Bayesian or Stochastic Estimation, seek to estimate E(δz²), where E is the expectation value. Estimation of both the truncation error and low degree error is discussed for both approaches. Expressions are found for an upper bound for |δz| and for E(δz²). Of particular interest is the computation of the radial field, Br, at the CMB for which error estimates are made as examples of the methods. Estimated accuracies of the Gauss coefficients are given for the various methods. In general, the lowest error estimates result when the greatest amount of a priori information is available and, indeed, the estimates for truncation error are completely dependent upon the nature of the a priori information assumed. For the most conservative approach, the error in computing point values of Br at the CMB is unbounded and one must be content with, e.g., averages over some large area. The various assumptions about a priori information are reviewed. Work is needed to extend and develop this information. In particular, information regarding the truncated fields is needed to determine if the pessimistic bounds presently available are realistic or if there is a real physical basis for lower error estimates. Characterization of crustal fields for degree greater than 50 is needed as is more rigorous characterization of the external fields.
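Under the statistical (Bayesian) approach, and assuming the three error sources listed in the abstract are mutually uncorrelated, the error budget can plausibly be written as (this decomposition is a reading of the abstract's three categories, not a formula quoted from the paper):

```latex
E(\delta z^2)
= \underbrace{E\!\left(\delta z_{\mathrm{noise}}^2\right)
+ E\!\left(\delta z_{\mathrm{unmod}}^2\right)}_{\text{low degree error}}
+ \underbrace{E\!\left(\delta z_{\mathrm{trunc}}^2\right)}_{\text{truncation error}}
```

where the first term comes from measurement noise, the second from unmodeled crustal and ionospheric sources, and the third from truncating to a finite-dimensional solution space.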

10.
This paper presents a framework to quantify the overall variability of the model estimations of Total Polychlorinated Biphenyls (Total PCBs) concentrations in the Niagara River on the basis of the uncertainty of a few model parameters and the natural variability embedded in some of the model input variables. The results of the uncertainty analysis are used to understand the importance of stochastic model components and their effect on the overall reliability of the model output, and to evaluate multiple sources of uncertainty that might need to be further studied. The uncertainty analysis is performed using a newly developed point estimate method, the Modified Rosenblueth method. The water quality along the Niagara River is simulated by coupling two numerical models: the Environmental Fluid Dynamics Code (EFDC) for the hydrodynamic portion of the study, and the Water Quality Analysis Simulation Program (WASP) for the fate and transport of contaminants. For the monitoring period from May 1995 to March 1997, the inflow Total PCBs concentration from Lake Erie is the stochastic component that most influences the variability of the modeling results for the simulated concentrations at the exit of the Niagara River. Other significant stochastic components in order are as follows: the suspended sediments concentration, the point source loadings and, to a minor degree, the atmospheric deposition, the flow and the non-point source loadings. Model results that include estimates of uncertainty provide more comprehensive information about the variability of contaminant concentrations, such as confidence intervals, and, in general, offer a better approach to compare model results with measured data.
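The classical Rosenblueth point estimate method, which the paper's Modified Rosenblueth method builds on, evaluates the model at all 2^n combinations of mean ± standard deviation of the uncertain inputs and averages the results. A minimal sketch for two uncorrelated, symmetric inputs, with a toy linear model and invented statistics (not the study's):

```python
import numpy as np
from itertools import product

# Hypothetical model output as a function of two uncertain inputs, e.g.
# inflow PCB concentration and suspended-sediment concentration.
def model(c_in, ss):
    return 0.8 * c_in + 0.05 * ss

means = np.array([1.0, 20.0])   # assumed input means
stds = np.array([0.3, 5.0])     # assumed input standard deviations

# Classic two-point estimate: evaluate the model at every (mean ± std)
# combination; equal weights apply for uncorrelated, symmetric inputs.
outputs = np.array([
    model(*(means + np.array(signs) * stds))
    for signs in product([-1, 1], repeat=2)
])
mean_out = outputs.mean()
var_out = outputs.var()
cv_out = np.sqrt(var_out) / mean_out   # coefficient of variation of the output
```

For a linear model this reproduces the analytic mean and variance exactly; the appeal over Monte Carlo is that only 2^n model runs are needed, which matters when each run is a full EFDC/WASP simulation.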

11.
12.
A Monte Carlo-based approach to assess uncertainty in recharge areas shows that incorporation of atmospheric tracer observations (in this case, tritium concentration) and prior information on model parameters leads to more precise predictions of recharge areas. Variance-covariance matrices, from model calibration and calculation of sensitivities, were used to generate parameter sets that account for parameter correlation and uncertainty. Constraining parameter sets to those that met acceptance criteria, which included a standard error criterion, did not appear to bias model results. Although the addition of atmospheric tracer observations and prior information produced similar changes in the extent of predicted recharge areas, prior information had the effect of increasing probabilities within the recharge area to a greater extent than atmospheric tracer observations. Uncertainty in the recharge area propagates into predictions that directly affect water quality, such as land cover in the recharge area associated with a well and the residence time associated with the well. Assessments of well vulnerability that depend on these factors should include an assessment of model parameter uncertainty. A formal simulation of parameter uncertainty can be used to delineate probabilistic recharge areas, and the results can be expressed in ways that can be useful to water-resource managers. Although no one model is the correct model, the results of multiple models can be evaluated in terms of the decision being made and the probability of a given outcome from each model.
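The core mechanics described above are: draw correlated parameter sets from the calibration variance-covariance matrix, screen them with an acceptance criterion, and express the outcome as a probability. A sketch with invented parameters and an invented screening statistic (only the pattern follows the abstract):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical calibrated parameters (e.g. log-hydraulic conductivities)
# with a variance-covariance matrix from model calibration.
theta_hat = np.array([2.0, -1.0])
cov = np.array([[0.04, 0.01],
                [0.01, 0.09]])

# Correlated Monte Carlo draws honouring the covariance structure.
draws = rng.multivariate_normal(theta_hat, cov, size=5000)

# Acceptance criterion analogous to the paper's standard-error screen:
# keep only parameter sets with an acceptable (stand-in) error statistic.
def error_stat(theta):
    return np.sum((theta - theta_hat) ** 2 / np.diag(cov), axis=1)

accepted = draws[error_stat(draws) < 6.0]   # chi-square-like cutoff

# Probabilistic prediction: fraction of accepted parameter sets under which a
# hypothetical "this cell lies in the recharge area" indicator holds.
prob_in_recharge = np.mean(accepted[:, 0] + accepted[:, 1] > 0.8)
```

Mapping `prob_in_recharge` over all grid cells yields exactly the kind of probabilistic recharge-area delineation the abstract recommends presenting to water-resource managers.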

13.
Considering complexity in groundwater modeling can aid in selecting an optimal model, and can avoid over-parameterization, model uncertainty, and misleading conclusions. This study was designed to determine the uncertainty arising from model complexity, and to identify how complexity affects model uncertainty. The Ajabshir aquifer, located in East Azerbaijan, Iran, was used for comprehensive hydrogeological studies and modeling. Six unique conceptual models with four different degrees of complexity measured by the number of calibrated model parameters (6, 10, 10, 13, 13 and 15 parameters) were compared and characterized with alternative geological interpretations, recharge estimates and boundary conditions. The models were developed with ModelMuse and calibrated using UCODE with the same set of observed data of hydraulic head. Different methods were used to calculate model probability and model weight to explore model complexity, including Bayesian model averaging, model selection criteria, and multicriteria decision-making (MCDM). With the model selection criteria of AIC, AICc and BIC, the simplest model received the highest model probability. The model selection criterion KIC and the MCDM method, in addition to considering the quality of model fit between observed and simulated data and the number of calibrated parameters, also consider uncertainty in parameter estimates with a Fisher information matrix. KIC and MCDM selected a model with moderate complexity (10 parameters) and the best parameter estimation (model 3) as the best models, over another model with the same degree of complexity (model 2). The results of these comparisons show that in choosing between models, priority should be given to quality of the data and parameter estimation rather than degree of complexity.
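AIC, AICc and BIC all trade goodness of fit against the number of calibrated parameters. A sketch of the usual least-squares forms, comparing a hypothetical 6-parameter model against a 15-parameter one that fits only slightly better (all numbers invented; KIC is omitted because it additionally requires the Fisher information matrix, as the abstract notes):

```python
import numpy as np

# Information criteria as commonly defined for least-squares calibration
# with n observations, k parameters, and sum of squared errors sse.
def criteria(n, k, sse):
    ll = -0.5 * n * (np.log(2 * np.pi * sse / n) + 1)   # maximized log-likelihood
    aic = -2 * ll + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)          # small-sample correction
    bic = -2 * ll + k * np.log(n)
    return aic, aicc, bic

n = 50  # hypothetical number of head observations
aic6, aicc6, bic6 = criteria(n, k=6, sse=4.0)
aic15, aicc15, bic15 = criteria(n, k=15, sse=3.6)

# Lower criterion value = preferred model.
simplest_preferred = bic6 < bic15
```

This illustrates why the abstract reports that AIC, AICc and BIC favoured the simplest model: a small reduction in misfit rarely pays for nine extra parameters.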

14.
The impact of uncertainty in ground elevation on the extent of areas that are inundated due to flooding is investigated. Land surface is represented through a Digital Surface Model (DSM). The effect of uncertainty in DSM is compared to that of the uncertainty due to rainfall. The Monte Carlo method is used to quantify the uncertainty. A typical photogrammetric procedure and conventional maps are used to obtain a reference DSM, later altered to provide DSMs of lower accuracy. Also, data from the Shuttle Radar Topography Mission are used. Floods are simulated in two stages. In the first stage, flood hydrographs for typical return periods are synthesized using generated storm hyetographs, the Soil Conservation Service–Curve Number method for effective rainfall, and the Soil Conservation Service synthetic unit hydrograph. In the second stage, hydrographs are routed via a one-dimensional hydraulic model. Uncertainty in DSM is considered only in the second stage. Data from two real-world basins in Greece are used. To characterize the inundated area, we employ the 90% quantile of the inundation extent and inundation topwidth for peak water level at specific river cross-sections. For topwidths, apart from point estimates, interval estimates are also acquired using the bootstrap method. Low uncertainty in the DSM is found to widen the inundated area, whereas the opposite occurs with high uncertainty. SRTM data proved unsuitable for our test basins and modelling context.
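The bootstrap interval estimate for the 90% quantile of topwidth works by resampling the Monte Carlo topwidth sample with replacement and recomputing the quantile. A sketch with synthetic topwidths standing in for the study's simulation output:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for simulated inundation topwidths (m) at one
# cross-section across the Monte Carlo DSM/rainfall realizations.
topwidths = rng.gamma(shape=4.0, scale=25.0, size=500)

point_estimate = np.quantile(topwidths, 0.90)   # 90% quantile, as in the study

# Nonparametric bootstrap: resample with replacement, recompute the
# quantile, and take percentiles of the bootstrap distribution.
boot = np.array([
    np.quantile(rng.choice(topwidths, size=topwidths.size, replace=True), 0.90)
    for _ in range(2000)
])
ci_low, ci_high = np.quantile(boot, [0.025, 0.975])
```

The width of (ci_low, ci_high) is what distinguishes a robust topwidth characterization from one dominated by sampling noise in the Monte Carlo runs.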

15.
Abstract

Small dams represent an important local-scale resource designed to increase water supply reliability in many parts of the world where hydrological variability is high. There is evidence that the number of farm dams has increased substantially over the last few decades. These developments can have a substantial impact on downstream flow volumes and patterns, water use and ecological functioning. The study reports on the application of a hydrological modelling approach to investigate the uncertainty associated with simulating the impacts of farm dams in several South African catchments. The focus of the study is on sensitivity analysis and the limitations of the data that would be typically available for water resources assessments. The uncertainty mainly arises from the methods and information that are available to estimate the dam properties and the water use from the dams. The impacts are not only related to the number and size of dams, but also the extent to which they are used for water supply as well as the nature of the climate and the natural hydrological regimes. The biggest source of uncertainty in South Africa appears to be associated with a lack of reliable information on volumes and patterns of water abstraction from the dams.

Citation Hughes, D. A. & Mantel, S. K. (2010) Estimating the uncertainty in simulating the impacts of small farm dams on streamflow regimes in South Africa. Hydrol. Sci. J. 55(4), 578–592.

16.
The result of tree-ring-based reconstruction of past landslide events is often the development of a single total chronology. This approach can be very effective for small homogeneous landslides. However, compiling chronological data from heterogeneous (often independent) zones of large complex landslide areas into one chronology can induce over- or underestimation of some events, resulting in lowered reliability of the reconstruction. The solution for elimination of this effect can lie in the diversification of complex landslide areas into homogeneous zones with separate analyses. The aim of this study was to quantify the effect of this separation on detected slope movement events and to define parameters whose investigation could distinguish events (sliding) from noise (creeping). For this purpose, 412 tree-ring series from 206 disturbed common spruce (Picea abies (L.) Karst.) occupying complex landslide areas were dendrogeomorphically analysed. The landslide area was divided into five homogeneous zones using geomorphic mapping, LiDAR-based DEM and geophysical sounding (ERT). Five events (verified in individual zones) were detected in the total chronology. Two extra events in the total chronology (28.6%) were considered noise. Moreover, two zonal events were detected but not recorded in the total chronology. This indicates that the noise in the total chronology of the complex landslide area could reach more than a quarter of dated events. Next, true slide events and noise (caused by creep) were differentiated in the structure of growth disturbances (reaction wood vs. abrupt growth suppression) and their proportion in event reconstruction, spatial patterns of trees containing slope movement signals, and the character of triggers.
Thus, for better filtering of noise from signals in tree-ring-based chronologies of landslides, not only observations of dendrogeomorphic index values but also the morphology of landslides and characteristics of dated processes must be considered.

17.
The hydrologic cycle is a complex system associated with both certain and uncertain constituents. The propagation of confidence bounds from different uncertainty sources to model output is of great significance for hydrologic modeling. In this paper, we applied the Integrated Bayesian Uncertainty Estimator (IBUNE) to quantify the effects of parameter, input and model structure uncertainty on hydrologic modeling progressively. Two hydrologic models (Xinanjiang model and TOPMODEL) were applied to a humid catchment under three scenarios. Case I: the Shuffled Complex Evolution Metropolis (SCEM-UA) algorithm was conducted to determine the posterior parameter distribution of hydrologic models and analyze the corresponding forecast uncertainty. Case II: input uncertainty was also considered by assuming rain depth bias follows a normal distribution, and integrated with SCEM-UA. Case III: simulations from the two models were combined by Bayesian model averaging to fully quantify multisource uncertainty effects. Results suggested that, from Case I to II, the containing ratio (percentage of observed streamflow enveloped by the 95% confidence interval) obviously increased, by an average magnitude of 10% for the study period 2000–2006. It was also found that the width of the 95% confidence interval became wider for the Xinanjiang model and narrower for TOPMODEL from Case I to II. This may indicate that the uncertainty of the TOPMODEL results was more remarkable than that of the Xinanjiang model in Case I. By combining results from the two models, model structure uncertainty was also considered in Case III. The accuracy of the uncertainty bounds further improved, with the containing ratio of the 95% confidence interval exceeding 95%. In addition, the optimized deterministic results from the uncertainty analysis showed that the average Nash–Sutcliffe coefficient increased continually from Case I to II and III (0.82, 0.84 and 0.90, respectively) for the study period.
The analysis demonstrated the improvement of modeling accuracy when extra uncertainty sources were also quantified, and this finding also supports the applicability of the IBUNE framework in hydrologic modeling.
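The two headline metrics above, the containing ratio and the Nash–Sutcliffe efficiency, are straightforward to compute from an observation series, a deterministic simulation, and an ensemble of uncertainty runs. A sketch with synthetic series standing in for the catchment data:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic stand-ins: observed streamflow, a deterministic simulation, and
# an ensemble of uncertainty realizations around that simulation.
obs = 10 + 5 * np.sin(np.linspace(0, 20, 300)) + rng.normal(0, 1, 300)
sim = obs + rng.normal(0, 1.5, 300)
ensemble = sim + rng.normal(0, 2.0, (200, 300))
lo, hi = np.percentile(ensemble, [2.5, 97.5], axis=0)   # 95% interval

# Containing ratio: share of observations inside the 95% interval.
containing_ratio = np.mean((obs >= lo) & (obs <= hi))

# Nash–Sutcliffe efficiency of the deterministic simulation.
nse = 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
```

In the study's terms, moving from Case I to III should raise `containing_ratio` toward (or above) 0.95 and `nse` toward 0.90 as extra uncertainty sources are folded in.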

18.
In 1972, V. Keilis-Borok and I. Gelfand introduced the phenomenological approach based on morphostructural zoning and pattern recognition for the identification of earthquake-prone areas. This methodology identifies seismogenic nodes capable of generating strong earthquakes on the basis of geological, morphological, and geophysical data, which do not contain information on past seismicity. In the period 1972–2018, a total of 26 seismic regions worldwide were studied, and maps showing the recognized earthquake-prone areas in each region were published. Since then, 11 of these regions have been hit by earthquakes of the relevant sizes. The goal of this work is to analyze the correlation of the post-publication events with the seismogenic nodes defined in these 11 regions. The test was performed using the NEIC earthquake catalog because it uniformly defines the locations and magnitudes of earthquakes over the globe. The ArcMap facilities were used to plot the post-publication events on the maps showing the recognized seismogenic nodes. We found that about 86% of such events fall within the recognized seismogenic nodes. The test demonstrates the validity of the methodology for identifying areas capable of strong earthquakes and confirms the idea that strong earthquakes nucleate at the nodes.

19.
In applications of the weight of evidence (WofE) method, the informational redundancy in similar evidential patterns causes a significant increase in the posterior probability. Consequently, to estimate the posterior probability, combinations that pass the established conditional independence (CI) tests are considered rather than the combination of the ‘best’ information layers. This study introduces two methodological approaches to extend the WofE using a correction factor that eliminates the informational redundancy that is contained in different evidential layers. The proposed approaches allow the use of associated data in the same model without having to address issues with the constraints of the CI. The basic WofE approach that is used to estimate the weights is not changed, and only the interactions of the parameter layers and the transformation of the weights into probability values are considered. The method is applied to a real dataset that is used in a landslide susceptibility analysis on Lombok Island, Indonesia.
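The basic WofE weight estimation that the abstract says is left unchanged computes, for each binary evidential layer, positive and negative weights from cell counts and combines them with the prior odds. A sketch for a single layer with hypothetical counts (the conditional-independence assumption shown at the end is exactly the constraint the paper's correction factor relaxes):

```python
import numpy as np

# Hypothetical cell tallies for one binary evidential layer B
# (e.g. "within 100 m of a stream") versus landslide occurrence D.
n_total = 10_000          # all map cells
n_D = 200                 # cells with landslides
n_B = 1_500               # cells where the evidence is present
n_B_and_D = 90            # cells with both

p_B_given_D = n_B_and_D / n_D
p_B_given_notD = (n_B - n_B_and_D) / (n_total - n_D)

W_plus = np.log(p_B_given_D / p_B_given_notD)               # weight where B present
W_minus = np.log((1 - p_B_given_D) / (1 - p_B_given_notD))  # weight where B absent

# Posterior probability where the evidence is present. Summing weights from
# several layers like this assumes conditional independence between layers.
prior_odds = n_D / (n_total - n_D)
post_odds = prior_odds * np.exp(W_plus)
post_prob_B = post_odds / (1 + post_odds)
```

For a single layer the posterior reduces exactly to the empirical rate n_B_and_D / n_B; redundancy between layers inflates the summed weights, which is the effect the proposed correction factor removes.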

20.
Benzene exposure is of particular concern because recent research indicates that it can result in chronic toxicity, with an elevated risk of carcinogenesis. Exposure to benzene from automobile exhaust can be an important occupational problem for urban populations. The present study was conducted to estimate the cancer risk of people living in nine heavily congested traffic areas due to contact with traffic-related benzene vapour during daily work. The reported lifetime unit risk factor ranges from 8.30E-8 to 1.58E-6. Of note, this range is high and can represent an important public health threat. Monitoring of environmental benzene can help identify groups at high risk. Annual check-ups and monitoring of benzene exposure among people living in urban areas should be instituted as primary prevention of benzene-related cancer.
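Lifetime risk figures of this kind typically come from multiplying an ambient concentration by an inhalation unit risk. A minimal sketch; the concentration is hypothetical and the unit risk is a WHO-style reference value, neither taken from the study:

```python
# Excess lifetime cancer risk (ELCR) for continuous lifetime inhalation
# exposure: ELCR = ambient concentration x inhalation unit risk.
benzene_conc_ugm3 = 15.0        # ambient benzene, ug/m^3 (hypothetical)
unit_risk_per_ugm3 = 6.0e-6     # WHO-style inhalation unit risk per ug/m^3

elcr = benzene_conc_ugm3 * unit_risk_per_ugm3
```

Occupational estimates like the study's would further scale the concentration by the fraction of a lifetime actually spent exposed (hours per day, days per year, years of work).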
