Similar Articles
1.
The predictive potential of air quality models, and thus their value in emergency management and public health support, is critically dependent on the quality of their meteorological inputs. The atmospheric flow is the primary cause of the dispersion of airborne substances. The scavenging of pollutants by cloud particles and precipitation is an important sink of atmospheric pollution and subsequently determines the spatial distribution of the deposition of pollutants. The long-standing problem of the spin-up of clouds and precipitation in numerical weather prediction models limits the accuracy of the prediction of short-range dispersion and deposition from local sources. The resulting errors in the atmospheric concentration of pollutants also affect the initial conditions for the calculation of the long-range transport of these pollutants. Customarily, the spin-up problem is avoided by only using NWP (Numerical Weather Prediction) forecasts with a lead time greater than the spin-up time of the model. Because uncertainty increases with forecast range, this reduces the quality of the associated forecasts of the atmospheric flow. In this article, recent improvements through diabatic initialization in the spin-up of large-scale precipitation in the Hirlam NWP model are discussed. In a synthetic example using a puff dispersion model, the effect of these improvements is demonstrated on the deposition and dispersion of pollutants with a high scavenging coefficient, such as sulphur, and with a low scavenging coefficient, such as cesium-137. The analysis presented in this article leads to the conclusion that, at least for situations where large-scale precipitation dominates, the improved model has a limited spin-up, so that its full forecast range can be used. The implication for dispersion modeling is that the improved model is particularly useful for short-range forecasts and the calculation of local deposition. The sensitivity of the hydrological processes to proper initialization implies that the spin-up problem may recur with changes in the model and increased model resolution. Spin-up should therefore be an ongoing concern for atmospheric modelers.
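To make the role of the scavenging coefficient concrete, the sketch below applies simple first-order wet-scavenging depletion, C(t) = C0·exp(−Λt), to two tracers during a rain period. It only illustrates the mechanism discussed above; the coefficient values, rain duration and time step are assumptions, not numbers from the article.

```python
import numpy as np

# Illustrative first-order wet-scavenging depletion: dC/dt = -lambda_scav * C
# during precipitation, so C(t) = C0 * exp(-lambda_scav * t).
# The coefficients below are assumed example values, not taken from the article.
lambda_high = 2e-4   # s^-1, "high" scavenging coefficient (readily scavenged species)
lambda_low  = 2e-5   # s^-1, "low" scavenging coefficient

t = np.arange(0.0, 6 * 3600 + 1, 600.0)        # six hours of rain, 10-min steps
airborne_high = np.exp(-lambda_high * t)       # fraction still airborne
airborne_low  = np.exp(-lambda_low * t)

# Everything removed from the air is deposited at the surface.
deposited_high = 1.0 - airborne_high
deposited_low  = 1.0 - airborne_low

print(f"after 6 h of rain: deposited fraction "
      f"(high lambda) = {deposited_high[-1]:.2f}, "
      f"(low lambda) = {deposited_low[-1]:.2f}")
```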

2.
A mesoscale atmospheric model, PSU/NCAR MM5, is used to provide operational weather forecasts for a nuclear emergency response decision support system on the southeast coast of India. In this study the performance of the MM5 model is examined with assimilation of conventional surface and upper-air observations along with satellite-derived 2-d surface wind data from QuikSCAT. Two numerical experiments with MM5 are conducted: one with static initialization using NCEP FNL data and the second with dynamic initialization by assimilation of observations using four-dimensional data assimilation (FDDA) analysis nudging for a pre-forecast period of 12 h. Dispersion simulations are conducted for a hypothetical source at the Kalpakkam location with the HYSPLIT Lagrangian particle model using the simulated wind fields from the above experiments. The present paper brings out the differences in the atmospheric model predictions and the corresponding differences in the dispersion model results between the control and assimilation runs. An improvement is noted in the atmospheric fields from the assimilation experiment, which leads to significant alteration in the trajectory positions, plume orientation and plume distribution pattern. Sensitivity tests using different PBL and surface parameterizations indicated that the simple first-order closure schemes (Blackadar, MRF) coupled with the simple soil model give better results for various atmospheric fields. The study illustrates the impact of the assimilation of scatterometer winds and automated weather station (AWS) observations on the meteorological model predictions and the dispersion results.
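Analysis nudging of the FDDA type adds a Newtonian relaxation term that pulls the model state toward analyses or observations during the pre-forecast period. The sketch below shows the bare idea, du/dt = F(u) + G·(u_an − u); it is not the MM5 FDDA code, and the tendency function, nudging coefficient and time step are assumptions.

```python
def model_tendency(u):
    return -1e-5 * u          # stand-in for the full model physics/dynamics


def nudged_step(u, u_analysis, g_nudge, dt):
    """One explicit time step with Newtonian relaxation (analysis nudging):
    du/dt = F(u) + G * (u_analysis - u). Conceptual sketch only."""
    return u + dt * (model_tendency(u) + g_nudge * (u_analysis - u))


# Toy example: a scalar wind component relaxing toward an analysed value
# during a 12 h pre-forecast period (all numbers are illustrative).
u, u_an = 2.0, 6.0            # model wind and analysis wind (m/s)
g, dt = 3e-4, 60.0            # nudging coefficient (s^-1) and time step (s)

for _ in range(int(12 * 3600 / dt)):
    u = nudged_step(u, u_an, g, dt)

print(f"wind after 12 h of nudging: {u:.2f} m/s (analysis: {u_an} m/s)")
```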

3.
In previous work [Kovalets, I., Andronopoulos, S., Bartzis, J.G., Gounaris, N., Kushchan, A., 2004. Introduction of data assimilation procedures in the meteorological pre-processor of atmospheric dispersion models used in emergency response systems. Atmospheric Environment 38, 457–467] the authors developed data assimilation (DA) procedures and implemented them within a diagnostic meteorological pre-processor (MPP) to enable the simultaneous use of meteorological measurements and numerical weather prediction (NWP) data. The DA techniques were directly validated, showing a clear improvement of the MPP output quality in comparison with meteorological measurement data. In the current paper it is demonstrated that the application of DA procedures in the MPP, to combine meteorological measurements with NWP data, has a noticeable positive effect on the performance of an atmospheric dispersion model (ADM) driven by the MPP output. This result is particularly important for emergency response systems used for accidental releases of pollutants, because it provides the possibility to combine meteorological measurements with NWP data in order to achieve more reliable dispersion predictions. It is also an indirect way to validate the DA procedures applied in the MPP. The above goal is achieved by applying the Lagrangian ADM DIPCOT, driven by meteorological data calculated by the MPP code both with and without the use of DA procedures, to simulate the first European Tracer Experiment (ETEX I). The performance of the ADM in each case was evaluated by comparing the predicted and the experimental concentrations with the use of statistical indices and concentration plots. The comparison of the resulting concentrations obtained with the different sets of meteorological data showed that the activation of DA in the MPP code clearly improves the performance of the dispersion calculations in terms of plume shape and dimensions, location of maximum concentrations, statistical indices and the time variation of concentration at the detector locations.
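As a rough illustration of what a meteorological pre-processor can do when combining NWP output with measurements, the sketch below applies a scalar, optimal-interpolation-style correction of an NWP first guess toward a nearby observation. It is a conceptual example only, not the Kovalets et al. scheme; the error variances and distance weighting are assumptions.

```python
import numpy as np

def blend_point(background, obs, dist_km, sigma_b=1.5, sigma_o=0.5, length_km=50.0):
    """Correct an NWP background value toward an observation.
    Weight = error-variance ratio times an assumed distance-based correlation.
    Purely illustrative; not the scheme of the paper."""
    corr = np.exp(-(dist_km / length_km) ** 2)           # assumed spatial correlation
    gain = corr * sigma_b**2 / (sigma_b**2 + sigma_o**2)
    return background + gain * (obs - background)

# NWP first-guess 10 m wind speed vs. a mast measurement 20 km away (made-up numbers).
print(blend_point(background=7.2, obs=5.4, dist_km=20.0))   # value is pulled toward 5.4
```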

4.
This paper presents results from a series of numerical experiments designed to evaluate operational long-range dispersion model simulations and to investigate the effect of different temporal and spatial resolutions of meteorological data from numerical weather prediction models on these simulations. Results of Lagrangian particle dispersion simulations of the first tracer release of the European Tracer Experiment (ETEX) are presented and compared with measured tracer concentrations. The use of higher-resolution analyzed data from the European Centre for Medium-Range Weather Forecasts (ECMWF) model produced significantly better agreement between the concentrations predicted with the dispersion model and the ETEX measurements than the use of lower-resolution Navy Operational Global Atmospheric Prediction System (NOGAPS) forecast data. Numerical experiments were performed in which the ECMWF model data were used with lower vertical resolution (4 instead of 7 levels below 500 mb), lower temporal resolution (12 h instead of 6 h intervals), and lower horizontal resolution (2.5° instead of 0.5°). Degrading the horizontal or temporal resolution of the ECMWF data resulted in decreased accuracy of the dispersion simulations. These results indicate that flow features resolved by the numerical weather prediction model data at approximately 45 km horizontal grid spacing and 6 h time intervals, but not resolved at 225 km spacing and 12 h intervals, made an important contribution to the long-range dispersion.
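One straightforward way to build such degraded data sets is to subsample the full-resolution fields in time and space, as sketched below. The paper does not state that plain subsampling was used, so treat this only as an illustration of the experimental design; the array dimensions are assumptions.

```python
import numpy as np

# Full-resolution analysed wind field (time, lat, lon): 72 h at 6-h steps on a
# 0.5-degree grid covering a 30 x 60 degree domain (random stand-in data).
rng = np.random.default_rng(0)
u_full = rng.normal(size=(13, 61, 121))

u_coarse_time  = u_full[::2]          # keep every other analysis -> 12-h intervals
u_coarse_space = u_full[:, ::5, ::5]  # keep every 5th grid point -> 2.5-degree spacing

print(u_full.shape, u_coarse_time.shape, u_coarse_space.shape)
```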

5.
Several techniques have been developed over the last decade for the ensemble treatment of atmospheric dispersion model predictions. Two of them have received most of the attention: multi-model ensembles and ensemble prediction system (EPS) based modeling. The multi-model approach relies on simulations produced by different atmospheric dispersion models using meteorological data from potentially different weather prediction systems. The EPS-based ensemble is generated by running a single atmospheric dispersion model with the ensemble weather prediction members. In this paper we compare the two approaches with the help of statistical indicators, using simulations performed for the ETEX-1 tracer experiment. Both ensembles are also evaluated against measurement data. Among the most relevant results is that the multi-model median and the mean of the EPS-based ensemble produced the best results; hence, we consider a combination of the multi-model and EPS-based approaches an interesting direction for further research.
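Both ensemble summaries reduce to member-wise statistics over the concentration fields once all members are interpolated to a common grid; a minimal sketch with random stand-in data follows.

```python
import numpy as np

# Stack of ensemble concentration fields on a common grid: these could be
# multi-model members or EPS-driven dispersion members (random data here).
rng = np.random.default_rng(0)
members = rng.lognormal(mean=0.0, sigma=1.0, size=(20, 60, 80))  # (member, y, x)

multi_model_median = np.median(members, axis=0)   # robust to outlier members
eps_mean = np.mean(members, axis=0)               # smooth average over members

print(multi_model_median.shape, eps_mean.shape)   # (60, 80) (60, 80)
```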

6.
Source term estimation algorithms compute unknown atmospheric transport and dispersion modeling variables from concentration observations made by sensors in the field. Insufficient spatial and temporal resolution in the meteorological data, as well as inherent uncertainty in the wind field data, make source term estimation and the prediction of subsequent transport and dispersion extremely difficult. This work addresses the question: how many sensors are necessary in order to successfully estimate the source term and the meteorological variables required for atmospheric transport and dispersion modeling? The source term estimation system presented here uses a robust optimization technique, a genetic algorithm (GA), to find the combination of source location, source height, source strength, surface wind direction, surface wind speed, and time of release that produces a concentration field that best matches the sensor observations. The approach is validated using the Gaussian puff as the dispersion model in identical twin numerical experiments. The limits of the system are tested by incorporating additive and multiplicative noise into the synthetic data. The minimum requirements for data quantity and quality are determined by an extensive grid sensitivity analysis. Finally, a metric is developed for quantifying the minimum number of sensors necessary to accurately estimate the source term and to obtain the relevant wind information.
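The sketch below illustrates the structure of such a system: a candidate vector of source and wind parameters is pushed through a simple forward dispersion model, scored against synthetic sensor readings (an identical-twin setup with multiplicative noise), and an optimizer searches for the best candidate. A plain Gaussian-plume surrogate and a random search stand in for the paper's Gaussian puff model and genetic algorithm; all parameter ranges and dispersion coefficients are assumptions.

```python
import numpy as np

def gaussian_plume(params, sensor_xy):
    """Crude steady-state Gaussian-plume surrogate used as the forward model.
    params = (x0, y0, q, wind_dir_deg, wind_speed); units are illustrative only."""
    x0, y0, q, wdir, u = params
    theta = np.deg2rad(270.0 - wdir)               # met. wind direction -> travel angle
    dx, dy = sensor_xy[:, 0] - x0, sensor_xy[:, 1] - y0
    s = dx * np.cos(theta) + dy * np.sin(theta)    # downwind distance
    c = -dx * np.sin(theta) + dy * np.cos(theta)   # crosswind distance
    sig = 0.08 * np.maximum(s, 1.0)                # assumed linear plume spread
    conc = q / (2.0 * np.pi * u * sig**2) * np.exp(-0.5 * (c / sig) ** 2)
    return np.where(s > 0.0, conc, 0.0)            # nothing upwind of the source

def cost(params, sensor_xy, observed):
    """Mean squared log error between modelled and observed concentrations."""
    pred = gaussian_plume(params, sensor_xy)
    return np.mean((np.log10(pred + 1e-12) - np.log10(observed + 1e-12)) ** 2)

# Identical-twin setup: synthesise observations from a "true" source, then search.
rng = np.random.default_rng(1)
sensors = rng.uniform(-5000, 5000, size=(30, 2))
truth = (-1200.0, 400.0, 5.0, 250.0, 4.0)
obs = gaussian_plume(truth, sensors) * rng.lognormal(0.0, 0.2, 30)  # multiplicative noise

# Plain random search as a stand-in for the genetic algorithm of the paper.
best, best_cost = None, np.inf
for _ in range(20000):
    cand = (rng.uniform(-5000, 5000), rng.uniform(-5000, 5000),
            rng.uniform(0.1, 20.0), rng.uniform(0, 360), rng.uniform(1, 10))
    c = cost(cand, sensors, obs)
    if c < best_cost:
        best, best_cost = cand, c

print("best candidate:", np.round(best, 1), "cost:", round(best_cost, 3))
```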

7.
In October 1957 a fire in Pile Number 1, a nuclear reactor at the Windscale Works, Sellafield, resulted in the accidental release of radionuclides to the atmosphere. Previous studies have described the atmospheric transport of the resultant radioactive plume from its release on the Cumbrian coast of Northwest England to its passage across mainland Europe. Those past studies have suffered from uncertainties concerning the quantity and timing of emissions and the meteorological conditions. Crabtree [1959. The travel and diffusion of the radioactive material emitted during the Windscale accident. Quarterly Journal of the Royal Meteorological Society 85, 362] initially produced estimates of plume transport based on weather observations and radiosonde profiles. Later, ApSimon et al. [1985. Long-range atmospheric dispersion of radioisotopes—I. The MESOS model. Atmospheric Environment 19(1), 99–111] based estimates of plume transport on trajectories calculated from weather charts. More recently, Nelson et al. [2006. A study of the movement of radioactive material discharged during the Windscale fire in October 1957. Atmospheric Environment 40, 58–75] used a full three-dimensional dispersion model with digitised weather data from similar charts. This study aims to further reduce uncertainty in the plume's behaviour by using the latest available numerical weather prediction reanalysis of meteorological data from the European Centre for Medium Range Weather Forecasts (ERA-40), coupled with current best estimates of the radioactive emissions profile. The results presented here generally support the findings of previous studies, though an improvement in model comparisons against observational measurements has been found. The opportunity was also taken to extend the time horizon, and hence the geographical coverage, of the modelled plume. It is considered that this paper presents the best estimate to date of the plume's behaviour.

8.
In this paper the meteorological processes responsible for transporting tracer during the second ETEX (European Tracer EXperiment) release are determined using the UK Met Office Unified Model (UM). The UM-predicted distribution of tracer is also compared with observations from the ETEX campaign. The dominant meteorological process is a warm conveyor belt which transports large amounts of tracer away from the surface up to a height of 4 km over a 36 h period. Convection is also an important process, transporting tracer to heights of up to 8 km. Potential sources of error when using an operational numerical weather prediction model to forecast air quality are also investigated, including model dynamics, model resolution and model physics. In the UM a semi-Lagrangian monotonic advection scheme is used with cubic polynomial interpolation. This can predict unrealistic negative values of tracer; these are subsequently set to zero, which results in an overprediction of tracer concentrations. In order to conserve mass in the UM tracer simulations it was necessary to include a flux-corrected transport method. Model resolution can also affect the accuracy of predicted tracer distributions. Low-resolution simulations (50 km grid length) were unable to resolve a change in wind direction observed during ETEX 2, which led to an error in the transport direction and hence an error in the tracer distribution. High-resolution simulations (12 km grid length) captured the change in wind direction and hence produced a tracer distribution that compared better with the observations. The representation of convective mixing was found to have a large effect on the vertical transport of tracer: turning off the convective mixing parameterisation in the UM significantly reduced the vertical transport of tracer. Finally, air quality forecasts were found to be sensitive to the timing of synoptic-scale features. Errors of only 1 h in the position of the cold front relative to the tracer release location resulted in changes in the predicted tracer concentrations that were of the same order of magnitude as the absolute tracer concentrations.
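A toy 1-D illustration of the over-prediction mechanism described above: an oscillatory advection scheme produces spurious negative tracer values, and clipping them to zero adds mass, whereas a flux-form scheme conserves it. The Lax-Wendroff/upwind pair below is a deliberately simple stand-in for the UM's semi-Lagrangian and flux-corrected transport schemes; the grid, Courant number and tracer shape are assumptions.

```python
import numpy as np

n, c = 200, 0.4                                   # grid points, Courant number (assumed)
x = np.arange(n)
tracer = np.where((x >= 40) & (x < 60), 1.0, 0.0) # top-hat tracer puff

def lax_wendroff(q, c):
    """Second-order advection; oscillatory near sharp gradients, so it produces
    unphysical negative tracer values (analogue of the cubic-interpolation problem)."""
    qm, qp = np.roll(q, 1), np.roll(q, -1)
    return q - 0.5 * c * (qp - qm) + 0.5 * c**2 * (qp - 2 * q + qm)

def upwind(q, c):
    """First-order flux-form upwind scheme: diffusive but sign- and mass-preserving."""
    return q - c * (q - np.roll(q, 1))

q_lw, q_up = tracer.copy(), tracer.copy()
for _ in range(200):
    q_lw = np.maximum(lax_wendroff(q_lw, c), 0.0)   # clip negatives to zero -> adds mass
    q_up = upwind(q_up, c)

print(f"initial mass {tracer.sum():.3f}  "
      f"clipped Lax-Wendroff {q_lw.sum():.3f}  upwind {q_up.sum():.3f}")
```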

9.
The Savannah River National Laboratory (SRNL) Weather Information and Display System was used to provide meteorological and atmospheric modeling/consequence assessment support to state and local agencies after the collision of two Norfolk Southern freight trains on the morning of January 6, 2005. This collision resulted in the release of several toxic chemicals to the environment, including chlorine. The dense and highly toxic cloud of chlorine gas that formed in the vicinity of the accident was responsible for 9 fatalities and caused injuries to more than 500 others. Transport model results depicting the forecast path of the ongoing release were made available to emergency managers in the county's Unified Command Center shortly after SRNL received a request for assistance, and support continued over the ensuing 2 days of the active response. The SRNL also provided weather briefings and transport/consequence assessment model results to responders from the South Carolina Department of Health and Environmental Control, the Savannah River Site (SRS) Emergency Operations Center, Department of Energy headquarters, and hazardous material teams dispatched from the SRS. Operational model-generated forecast winds used in consequence assessments conducted during the incident were provided at 2-km horizontal grid spacing during the accident response. A high-resolution Regional Atmospheric Modeling System (RAMS, version 4.3.0) simulation was later performed to examine potential influences of local topography on plume migration in greater detail; this simulation determined the meteorology using multiple nested grids with an innermost grid spacing of 125 m. Results from the two simulations generally agree with meteorological observations at the time, indicating that local topography did not significantly affect winds in the area. Use of a dense-gas dispersion model to simulate localized plume behavior with the higher-resolution winds showed agreement with the pattern of fatalities in the immediate area and with the visible damage to vegetation.

10.
In order to correctly incorporate large-scale or local-scale circulations in the model, a nudging term is introduced into the equations of motion. Nudging must be applied properly to reduce uncertainties and improve the simulated air flow field. To improve the meteorological fields, the nudging coefficient must exert an adequate influence over complex terrain; as a model initialization technique, its choice is related to data reliability and error suppression. Several numerical experiments were undertaken to evaluate the effects on air quality modeling by comparing the performance of the meteorological results obtained with different nudging coefficients. The experiments were run under different upper-wind conditions (synoptic or asynoptic), so it is important to examine the model response to the nudging of wind and mass information. The MM5–CMAQ modeling system was used to assess the ozone differences in each case during an episode day in Seoul, Korea, and large differences in ozone concentration were found between the runs. These results suggest that, for an appropriate simulation of large- or small-scale circulations, nudging with coefficients chosen according to synoptic or asynoptic conditions has a clear advantage over plain dynamic initialization, so an appropriate limitation of the nudging coefficient values according to the upper-wind conditions is necessary before making an assessment. The statistical verification showed that an adequate nudging coefficient for both wind and temperature data had a consistently positive impact on the simulated atmospheric and air quality fields. In cases dominated by large-scale circulation, a large nudging coefficient yields only a minor improvement in the atmospheric and air quality fields; however, when small-scale convection is present, a large nudging coefficient produces a consistent improvement.
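The verification statistics referred to above are typically the bias, RMSE and index of agreement computed against station observations; a minimal sketch with made-up numbers follows (it is not tied to the MM5–CMAQ runs of the study).

```python
import numpy as np

def verify(model, obs):
    """Common verification statistics for meteorological fields:
    bias, RMSE and Willmott's index of agreement (IOA)."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    bias = np.mean(model - obs)
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    ioa = 1.0 - np.sum((model - obs) ** 2) / np.sum(
        (np.abs(model - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return bias, rmse, ioa

# Made-up hourly 10 m wind speeds from two nudging-coefficient runs vs. observations.
obs        = [2.1, 2.8, 3.5, 4.0, 3.2, 2.6]
weak_run   = [1.5, 2.0, 2.7, 3.1, 2.5, 2.0]
strong_run = [2.0, 2.6, 3.4, 3.8, 3.0, 2.5]
for name, run in [("weak nudging", weak_run), ("strong nudging", strong_run)]:
    b, r, d = verify(run, obs)
    print(f"{name}: bias={b:+.2f}  rmse={r:.2f}  IOA={d:.2f}")
```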

11.
In order to realistically simulate both the chemistry and the transport of atmospheric organic pollutants, it is indispensable that the applied models explicitly include coupling between the different components of the global environment, such as the atmosphere, hydrosphere, cryosphere and soil system. A model with such properties is presented.

The atmospheric part of the model is based on the equations in a general contravariant form, which permits easy changes of the coordinate system by redefining the metric tensor of the specifically employed coordinate system. Considering the need to include terrain effects explicitly, a terrain-following spherical coordinate system is chosen from among the many possible coordinate systems. This particular system is a combination of the Gal-Chen coordinates commonly employed in mesoscale meteorological models and the spherical coordinates typical of global atmospheric models.
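For reference, a terrain-following vertical coordinate of the Gal-Chen type is commonly written as below (standard textbook form; the exact definition used in the model may differ).

```latex
% Gal-Chen / terrain-following vertical coordinate (standard form):
% z_s(\lambda,\varphi) is the surface height, z_T the model top.
\[
  \eta \;=\; z_T \,\frac{z - z_s(\lambda,\varphi)}{z_T - z_s(\lambda,\varphi)},
  \qquad 0 \le \eta \le z_T ,
\]
% so that \eta = 0 on the terrain and \eta = z_T at the flat model top.
```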

In addition to atmospheric transport, the model also simulates the exchange between air and different types of underlying surfaces such as water, soil, snow and ice. This approach permits a realistic representation of the absorption and delayed re-emission of pollutants from the surface to the atmosphere and, consequently, makes it possible to capture hysteresis-like effects in the exchange between the atmosphere and the other components of the system. In this model, the most comprehensive numerical representation of the exchange is that for soil. In particular, the model includes a realistic soil module which simulates both diffusion and convection of a tracer driven by evaporation from the soil, precipitation, and gravity.

The model is applied to a long-term simulation of the transport of pesticides (hexachlorocyclohexanes in particular). Emission fluxes from the soil are rigorously computed on the basis of realistic data on agricultural application. All four modelled systems, i.e. the atmosphere, soil, hydrosphere and cryosphere, are driven by objectively analysed meteorological data supplemented, when necessary, by climatological information; verification against observed data is therefore possible. The comparison of the model results with observations taken at remote stations in the Arctic indicates that the presented global modelling system is able to capture both trends and short-term components in the observed time series of concentrations and therefore provides a useful tool for the evaluation of source–receptor relationships.


12.
The Norwegian Meteorological Institute (DNMI) has developed and implemented for operational use a real-time dispersion model, the Severe Nuclear Accident Program (SNAP), capable of predicting concentrations and depositions of radioactive debris from large accidental releases. SNAP is closely linked to DNMI's operational numerical weather prediction (NWP) models. How good are these predictions? Participation in ETEX has partly answered this question. DNMI used SNAP with LAM50S providing the meteorological input for these real-time dispersion calculations; LAM50S (a Limited Area Model with 50 km grid squares) was DNMI's operational NWP model in 1994, when ETEX took place. In this article we report on how SNAP performed for the first of the ETEX releases in near-real-time mode using LAM50S, and in hindcast mode for ATMES II using the "ECMWF 1995: ETEX Data set (ATMES II)" as meteorological input data. These two input data sets came from NWP models with quite different characteristics but with similar resolution in time and space. The results from these dispersion simulations matched closely: deviations early in the simulation period shrank to insignificant differences later on. Since both input data sets were based on weather analyses and had similar resolution in space and time, SNAP described the dispersion of the released material very similarly in the two simulations.

13.
The Japan Atomic Energy Research Institute has developed an emergency response system, WSPEEDI, to forecast the long-range atmospheric dispersion of radionuclides discharged into the atmosphere. The latest version of WSPEEDI consists of the atmospheric dynamic model MM5 for calculating meteorological fields and a particle random-walk model for atmospheric dispersion. In this paper, the performance of WSPEEDI is evaluated against data obtained from a field tracer experiment over Europe (ETEX). The model validation addresses two points: (1) the dependence of model accuracy on the temporal and spatial resolutions of the meteorological fields, and (2) the superiority of an atmospheric dynamic model over a mass-consistent wind model. Regarding (1), it was shown that the calculation accuracy of the new version with high temporal resolution was improved, especially at the edge of the plume. Moreover, although an increase in the horizontal spatial resolution of the old version had no substantial effect on model performance, an increase in the horizontal resolution of the new version contributed to a significant improvement in calculation accuracy. These results showed that dynamically calculated meteorological fields with spatial resolution at the meso-β/γ scale greatly improved the calculation accuracy.

14.
Ozone prediction has become an important activity in many U.S. ozone nonattainment areas. In this study, we describe the ozone prediction program in the Atlanta metropolitan area and analyze the performance of this program during the 1999 ozone-forecasting season. From May to September, a team of 10 air quality regulators, meteorologists, and atmospheric scientists made a daily prediction of the next-day maximum 8-hr average ozone concentration. The daily forecast was aided by two linear regression models, a 3-dimensional air quality model, and the no-skill ozone persistence model. The team's performance is compared with the numerical models using several numerical indicators. Our analysis indicated that (1) the team correctly predicted next-day peak ozone concentrations 84% of the time, (2) the two linear regression models performed better than the 3-dimensional air quality model, (3) persistence was a strong predictor of ozone concentrations, with a performance of 78%, and (4) about half of the team's wrong predictions could have been prevented with improved meteorological predictions.
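The no-skill persistence benchmark mentioned above simply forecasts tomorrow's peak as today's observed peak; the sketch below shows that benchmark together with a categorical percent-correct score. The ozone values and exceedance threshold are made up for illustration.

```python
import numpy as np

peaks = np.array([62, 71, 88, 93, 84, 70, 65, 80, 95, 90])  # made-up daily peak 8-h O3 (ppb)
threshold = 85.0                                            # assumed exceedance level

persistence_forecast = peaks[:-1]        # forecast for day t+1 = observation on day t
observed_next_day = peaks[1:]

# Categorical skill: did forecast and observation fall on the same side of the threshold?
hits = (persistence_forecast >= threshold) == (observed_next_day >= threshold)
print(f"persistence percent correct: {100 * hits.mean():.0f}%")
```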

15.
As part of the European Tracer Experiment (ETEX), two successful atmospheric experiments were carried out in October and November 1994. Perfluorocarbon (PFC) tracers were released into the atmosphere at Monterfil, Brittany, and air samples were taken at 168 stations in 17 European countries for 72 h after the release. Upper-air tracer measurements were made from three aircraft. During the first experiment a westerly air flow transported the tracer plume north-eastwards across Europe; during the second release the flow was eastwards. The results from the ground sampling network allowed the determination of the cloud evolution as far as Sweden, Poland and Bulgaria. This demonstrated that the PFT technique can be successfully applied in long-range tracer experiments up to 2000 km. Typical background concentrations of the tracer used are around 5–7 fl l-1 in ambient air; concentrations in the plume ranged from 10 to above 200 fl l-1. The tracer release characteristics, the tracer concentrations at the ground and in the upper air, the routine and additional meteorological observations at ground level and in the upper air, trajectories derived from constant-level balloons, and the meteorological input fields for long-range transport models are assembled in the ETEX database, which is accessible via the Internet. Here, an overview is given of the design of the experiment, the methods used and the data obtained.

16.
The Eulerian atmospheric tracer transport model MATCH (Multiscale Atmospheric Transport and Chemistry model) has been extended with a Lagrangian particle model treating the initial dispersion of pollutants from point sources. The model has been implemented at the Swedish Meteorological and Hydrological Institute in an emergency response system for nuclear accidents and can be activated on short notice to provide forecast concentration and deposition fields. The model has been used to simulate the transport of the inert tracer released during the ETEX experiment and the transport and deposition of 137Cs from the Chernobyl accident. Visual inspection of the results as well as statistical analysis shows that the extent, time of arrival and duration of the tracer cloud are in good agreement with the observations for both cases, with a tendency towards over-prediction for the first ETEX release. For the Chernobyl case, the simulated deposition pattern over Scandinavia and over Europe as a whole agrees with observations when observed precipitation is used in the simulation. When model-calculated precipitation is used, the quality of the simulation is reduced significantly and the model fails to predict major features of the observed deposition field.

17.
In this paper, an attempt is made at the 24-hr prediction of photochemical pollutant levels using a neural network model. For this purpose, a model is developed that relates peak pollutant concentrations to meteorological and emission variables and indices. The analysis is based on measurements of O3 and NO2 from the city of Athens. The meteorological variables are selected to cover the atmospheric processes that determine the fate of the airborne pollutants, while special care is taken to ensure the availability of the required input data from routine observations or forecasts. The comparison between model predictions and actual observations shows good agreement. In addition, a series of sensitivity tests is performed in order to evaluate the sensitivity of the model to uncertainty in the meteorological variables. Model forecasts are generally rather insensitive to small perturbations in most of the input meteorological data, while they are relatively more sensitive to changes in wind speed and direction.
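A minimal sketch of the kind of sensitivity test described: perturb one standardised meteorological input of the network and record the change in the predicted peak concentration. The tiny, untrained feed-forward network and the predictor list are placeholders, not the model of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder one-hidden-layer network mapping standardised meteorological predictors
# [temperature, wind speed, wind direction (sin), mixing height] to a peak concentration.
# Untrained random weights stand in for a trained model.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def predict(x):
    h = np.tanh(x @ W1 + b1)
    return (h @ W2 + b2).item()

x0 = np.array([0.3, -0.1, 0.5, 0.2])           # standardised inputs (made up)

# Sensitivity: response to a small perturbation of each predictor, one at a time.
for i, name in enumerate(["temperature", "wind speed", "wind dir", "mixing height"]):
    xp = x0.copy()
    xp[i] += 0.1                               # +0.1 standard deviations
    print(f"{name:13s}: d(prediction) = {predict(xp) - predict(x0):+.3f}")
```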

18.
Events of high ground-level ozone concentration are a matter of major concern in large urban areas in terms of air quality and public health. In the Sao Paulo Metropolitan Area (SPMA), air quality data generated by a network of air quality measuring stations have been used in a number of studies correlating ozone formation with different variables. A study was carried out on the application of neural network models to the identification of typical scenarios leading to high ground-level ozone concentrations in the SPMA. The results were then applied in the selection of variables and in the definition of neural network-based models for estimating ozone levels from meteorological variables. When combined with existing weather prediction tools, the models can be applied to the prediction of ozone levels in the SPMA.

19.
The management of tropospheric ozone (O3) is particularly difficult. The formulation of emission control strategies requires considerable information, including: (1) emission inventories, (2) available control technologies, (3) meteorological data for critical design episodes, and (4) computer models that simulate atmospheric transport and chemistry. The simultaneous consideration of this information during control strategy design can be exceedingly difficult for a decision-maker, and traditional management approaches do not explicitly address cost minimization. This study presents a new approach for designing air quality management strategies in which a simple air quality model is used conjunctively with a complex air quality model to obtain low-cost management strategies. The simple air quality model is used to identify potentially good solutions, and two heuristic methods are used to identify cost-effective control strategies using only a small number of simple air quality model simulations. Subsequently, the resulting strategies are verified and refined using a complex air quality model. The use of this approach may greatly reduce the number of complex air quality model runs that are required. An important component of this heuristic design framework is the use of the simple air quality model as a screening and exploratory tool. To achieve similar results with the simple and complex air …
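A minimal sketch of the screening idea described above: many candidate control strategies are evaluated with a cheap surrogate model, and only a short list of low-cost, feasible candidates is passed to the expensive model for verification. The surrogate coefficients, costs and target ozone level below are all assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def simple_model(reductions):
    """Cheap screening surrogate: linear response of peak O3 to (NOx, VOC) cuts.
    Coefficients are illustrative, not from any inventory."""
    return 120.0 - reductions @ np.array([25.0, 40.0])     # ppb

def complex_model(reductions):
    """Placeholder for an expensive photochemical grid model run."""
    return simple_model(reductions) + 5.0 * np.sin(10 * reductions).sum()

def cost(reductions):
    return reductions @ np.array([80.0, 120.0])             # assumed cost per unit cut

# Screen many candidate (NOx, VOC) reduction strategies with the cheap model...
candidates = rng.uniform(0.0, 1.0, size=(5000, 2))
feasible = candidates[np.array([simple_model(c) for c in candidates]) <= 84.0]
shortlist = feasible[np.argsort([cost(c) for c in feasible])[:5]]

# ...then verify only the short list with the expensive model.
for c in shortlist:
    print(np.round(c, 2), "cost:", round(cost(c), 1),
          "verified O3:", round(complex_model(c), 1))
```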

