Similar Documents
20 similar documents found (search time: 718 ms)
1.
Diffusion Monte Carlo simulations of bosonic hard rods trapped in one-dimensional optical lattices are presented. The existence of Mott insulator phases for integer fillings is deduced from the behavior of the energy per particle as a function of the density. The analysis of the pair distribution functions and structure factors suggests the existence of fluid-like phases for shallow optical lattices that evolve to solid-like arrangements when the confinement is deep enough.

2.
The effectiveness of Monte Carlo simulation relative to intelligent search strategies for solving block layout problems is investigated. For testing purposes, 810 block layout problems are constructed to span a wide range of problem sizes, material-flow variation levels, work-centre space-requirement distributions, and work-centre shape distributions. Contrary to preliminary results reported in earlier studies, greedy search and simulated annealing consistently outperform Monte Carlo simulation across the full range of test problems and sample sizes. This divergence is explained through a comparison, based on probabilistic derivations, between the proportion of good solutions sampled by the Monte Carlo method and the proportion found by the heuristic search methods, and conditions for the superiority of either method are identified. The current study therefore complements earlier studies by providing analytical arguments and additional experimental evidence on the effectiveness of the simple Monte Carlo method and of intelligent search heuristics for solving layout problems.
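As a rough illustration of the comparison being made, the sketch below pits pure Monte Carlo sampling of random layouts against a greedy pairwise-swap search on a small quadratic-assignment-style surrogate of a block layout problem. The flow matrix, site coordinates, problem size, and sample budget are made-up placeholders, not the 810 test problems of the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10                                   # number of work centres (hypothetical size)
flow = rng.integers(0, 20, size=(n, n))  # random material-flow matrix
flow = (flow + flow.T) // 2
sites = rng.random((n, 2))               # fixed candidate locations on the floor
dist = np.linalg.norm(sites[:, None] - sites[None, :], axis=-1)

def cost(perm):
    """Flow-weighted distance of a layout (work centre i placed at site perm[i])."""
    return float(np.sum(flow * dist[np.ix_(perm, perm)]))

def monte_carlo(samples):
    """Pure Monte Carlo: sample random layouts and keep the best one."""
    best = None
    for _ in range(samples):
        c = cost(rng.permutation(n))
        if best is None or c < best:
            best = c
    return best

def greedy(start):
    """Greedy search: repeatedly accept the best improving pairwise swap."""
    p, c = start.copy(), cost(start)
    improved = True
    while improved:
        improved = False
        for i in range(n):
            for j in range(i + 1, n):
                q = p.copy()
                q[i], q[j] = q[j], q[i]
                cq = cost(q)
                if cq < c:
                    p, c, improved = q, cq, True
    return c

print("Monte Carlo (2000 samples):", monte_carlo(2000))
print("Greedy from random start  :", greedy(rng.permutation(n)))
```

On surrogates like this, the greedy search typically reaches a lower cost than the same budget of random samples, which mirrors the qualitative finding of the study.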

3.
Two test statistics are suggested for discriminating between the exponential model and the more general Weibull or gamma models, and these are compared with some previously used test statistics by Monte Carlo methods. The results of estimating reliability under an exponential assumption when the true model is Weibull are also investigated. These results, as well as the tests mentioned above, indicate that the exponential model is often not adequate when the more general models hold. In contrast to this result, it was found that the Weibull model was quite robust relative to the generalized gamma distribution with regard to reliability estimation. Some general pivotal-function properties are presented for the maximum likelihood estimator of reliability for the generalized gamma distribution, and similar results also hold for the Weibull procedure under a generalized gamma assumption. These results made a Monte Carlo study of this problem feasible. Since the maximum likelihood estimators are apparently ill-behaved for smaller sample sizes and since the Weibull model is robust, it appears that little is gained by using the generalized gamma distribution for samples of size less than 200 to 400.
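The kind of robustness check described, estimating reliability under an exponential assumption when the data really follow a Weibull model, can be sketched as follows; the shape, scale, mission time, and sample size are arbitrary illustrative values, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
shape, scale, t0 = 2.0, 1.0, 0.5      # assumed "true" Weibull model and mission time
true_rel = np.exp(-(t0 / scale) ** shape)

n, reps = 50, 5000
bias = []
for _ in range(reps):
    x = scale * rng.weibull(shape, size=n)   # data actually come from a Weibull
    theta_hat = x.mean()                     # exponential MLE of the mean life
    bias.append(np.exp(-t0 / theta_hat) - true_rel)   # exponential-based R(t0) estimate

print(f"true R(t0) = {true_rel:.3f}")
print(f"mean bias of the exponential-based estimate: {np.mean(bias):+.3f}")
```

With a Weibull shape well away from 1, the exponential-based reliability estimate carries a large systematic bias, which is the kind of inadequacy the abstract refers to.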

4.
In shielding calculations, deterministic methods have some advantages and some disadvantages relative to other kinds of codes, such as Monte Carlo. The main advantage is the short computer time needed to find solutions, while the disadvantages are related to the often-used build-up factor, which is extrapolated from high to low energies or applied under unknown geometrical conditions and can therefore lead to significant errors in shielding results. The aim of this work is to investigate how well some deterministic methods perform for low-energy shielding calculations that use attenuation coefficients and build-up factor corrections. The commercial software MicroShield 5.05 has been used as the deterministic code, while MCNP has been used as the Monte Carlo code. Point and cylindrical sources with a slab shield have been defined, allowing comparison of the capability of both Monte Carlo and deterministic methods in day-to-day shielding calculations through sensitivity analysis of significant parameters, such as energy and geometrical conditions.
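For context, a deterministic point-kernel calculation of the type MicroShield performs reduces to attenuating the source flux through the shield and applying a build-up factor. The sketch below shows only that basic formula; the source strength, attenuation coefficient, geometry, and the linear build-up form are placeholder assumptions, not values or correlations used in the paper.

```python
import numpy as np

S = 1.0e7          # source strength, photons/s (placeholder)
mu = 0.06          # linear attenuation coefficient of the shield, 1/cm (placeholder)
x = 10.0           # slab thickness along the source-detector axis, cm
r = 100.0          # source-to-detector distance, cm

mfp = mu * x                      # shield thickness in mean free paths
B = 1.0 + 1.2 * mfp               # hypothetical (toy) buildup factor B(mu*x)

phi_uncollided = S * np.exp(-mfp) / (4.0 * np.pi * r**2)   # point-kernel attenuation
phi_buildup = B * phi_uncollided                           # buildup-corrected flux

print(f"shield thickness      : {mfp:.2f} mean free paths")
print(f"uncollided flux       : {phi_uncollided:.3e} photons/cm^2/s")
print(f"buildup-corrected flux: {phi_buildup:.3e} photons/cm^2/s")
```

The sensitivity of such results to the chosen build-up correlation at low energies is precisely what the comparison against MCNP is meant to probe.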

5.
Advanced Powder Technology, 2020, 31(4): 1457-1469
Breakage, i.e., the formation of smaller fragments from larger initial particles, is an important phenomenon – wanted or unwanted – in many particulate processes. In this work, the volume- and time-dependent selection function for a pure breakage process is modeled using the population balance modeling approach. It is found that the selection function is directly correlated with the stressing frequency, the probability of successful events, and some integral properties of the number density function. A discrete population balance equation is applied to compute the total number of particles and the particle size distribution numerically. Moreover, an event-driven constant-number Monte Carlo simulation algorithm is presented, and the simulation results are used as an alternative to experimental results. The volume dependency of the selection function is incorporated in the Monte Carlo simulation when selecting particles for a stressing event. Important properties of the particulate process, such as the total number of particles and the particle size distribution, are validated successfully against the Monte Carlo results. This offers new insights into the estimation and interpretation of breakage kinetics.
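An event-driven, constant-number Monte Carlo scheme of the kind mentioned can be sketched in a few lines. In the sketch below, the linear selection function S(v) = v, the uniform binary fragment split, and all numerical parameters are placeholder assumptions, not the model identified in the paper; the constant-number step discards one particle uniformly at random after every breakage event and tracks the growth of the real particle number through a scaling factor.

```python
import numpy as np

rng = np.random.default_rng(2)

N = 2000                              # constant number of simulation particles
v = np.ones(N)                        # initial volumes (monodisperse, arbitrary units)
S = lambda vol: 1.0 * vol             # hypothetical volume-dependent selection function
t, t_end = 0.0, 2.0
number_ratio = 1.0                    # real particle number relative to the initial one

while t < t_end:
    rates = S(v)
    total = rates.sum()
    t += rng.exponential(1.0 / total)         # waiting time to the next breakage event
    i = rng.choice(N, p=rates / total)        # parent chosen with probability ~ S(v)
    f = rng.uniform()
    child1, child2 = f * v[i], (1.0 - f) * v[i]   # binary breakage, random volume split
    k = rng.integers(N + 1)                   # discard one particle uniformly at random:
    if k == N:                                # ... the second fragment
        v[i] = child1
    elif k == i:                              # ... the first fragment
        v[i] = child2
    else:                                     # ... an unrelated particle
        v[i], v[k] = child1, child2
    number_ratio *= (N + 1) / N               # the real population grew by one particle

print(f"time reached              : {t:.2f}")
print(f"total-number growth factor: {number_ratio:.2f}")
print(f"mean particle volume      : {v.mean():.3f}")
```

The weighted parent selection is where a volume-dependent selection function enters the simulation, and the growth factor is what would be compared with the population balance solution for the total number of particles.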

6.
Probability distributions of the size of ion clusters created in 'nanometric' cylindrical volumes of nitrogen by single 4.6 MeV alpha particles were measured and compared with those calculated by Monte Carlo simulation. The diameter of the sensitive volume, expressed as mass per area, was between 0.015 and 1.3 µg cm-2 which, for a material at unit density, corresponds to a diameter of between 0.15 nm and 13 nm. These nanometre sizes were simulated experimentally in a device called the Jet Counter. The measured and calculated cluster-size probabilities confirmed that the formation of ionisation clusters along a 'nanometre' track can be characterised by a Poisson distribution only for very small targets. The present ionisation cluster probabilities produced in 'nanometric' volumes, 2 to 10 nm in diameter, are the first ever determined experimentally and confirmed by Monte Carlo simulation.

7.
8.
The mathematical scheme of the biased direct Monte Carlo method is presented in detail and compared with the corresponding biased analogue Monte Carlo method. With reference to inhomogeneous Poisson processes, an example from the reliability and safety engineering field is given. It concerns sampling from a three-mode Weibull distribution by forcing each of the three modes to be exponential with decay constants 10 times larger than the natural ones. Since the natural distribution under analysis shows rather uniform behaviour over most of the range of interest, biasing towards a uniform distribution has also been investigated. Both examples are shown to lead to a simplification of the sampling procedure and an improvement in computational efficiency.
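A biased (forced) sampling scheme of the kind described can be sketched with likelihood-ratio weights: samples are drawn from the exponential-forced mixture and weighted by the ratio of the natural three-mode Weibull density to the biased density. The shapes, scales, mode weights, and the estimated early-failure probability below are illustrative placeholders, not the values used in the paper.

```python
import numpy as np
from scipy.stats import weibull_min, expon

rng = np.random.default_rng(3)

# Natural model: three-mode Weibull mixture (illustrative parameters).
shapes  = np.array([0.8, 1.5, 3.0])
scales  = np.array([100.0, 500.0, 2000.0])
weights = np.array([0.2, 0.5, 0.3])

def f_nat(t):                                   # natural (unbiased) mixture density
    return sum(w * weibull_min.pdf(t, c, scale=s)
               for w, c, s in zip(weights, shapes, scales))

# Biased model: each mode forced to an exponential whose decay constant is
# 10x the natural characteristic rate 1/scale of that mode.
rates = 10.0 / scales

def g_bias(t):                                  # biased mixture density
    return sum(w * expon.pdf(t, scale=1.0 / r)
               for w, r in zip(weights, rates))

# Estimate an early-failure probability P(T < t_m) from the biased samples.
t_m, n = 20.0, 100_000
mode = rng.choice(3, size=n, p=weights)
t = rng.exponential(1.0 / rates[mode])          # sample from the biased mixture
w = f_nat(t) / g_bias(t)                        # likelihood-ratio weights
est = np.mean(w * (t < t_m))

print(f"biased estimate of P(T < {t_m}): {est:.4e}")
```

The weights restore unbiasedness with respect to the natural distribution while the forcing concentrates the samples in the early-failure region of interest.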

9.
The dielectric properties of nanograin ferroelectric lead titanate crystals are presented. The PbTiO3 samples were prepared by pressing nanopowders into plates and were studied experimentally by dielectric permittivity measurements over a wide frequency and temperature range. The size dependence of TC obtained showed a critical change of behavior with increasing mean nanoparticle size in the 9-nm region. Theoretical calculations based on Monte Carlo simulation were performed to describe the behavior of this material. It was shown that, when the distribution of nanoparticle sizes in the sample is taken into account in the Monte Carlo method, the dielectric properties of PbTiO3 nanocrystals are described quite well.

10.
A 3-D Direct Simulation Monte Carlo (DSMC) code has been developed to study the vapour characteristics of atomic species, and the model needs to be verified against experimental results. Experiments have been conducted with a free jet of copper generated by an electron beam source, and flux distribution data have been obtained under different experimental conditions. For better statistics, the DSMC code was parallelised to run on a parallel system. The experimental results from a circular and a slit source (Kn varying from 0.47 to 0.083) have been compared with the DSMC results and show excellent agreement. From the trend of the flux distribution curves, it is clear that a more uniform coating can be expected at lower Kn.

11.
Theoretical and experimental investigations were combined to characterize the influence of surface casting defects (shrinkages) on high-cycle fatigue (HCF) reliability. On the fracture surfaces of the fatigue samples, the defect is located at the surface, and the shape used for the calculation is a spherical void with variable radius. Finite-element simulations were then performed to determine the stress distribution around defects for different sizes and loadings. Correlated expressions for the maximum hydrostatic stress and the amplitude of the shear stress were obtained using the response-surface technique. The loading representative point in the HCF criterion was then transformed into a scattered surface, obtained by random sampling of the defect sizes, and the HCF reliability was computed by the Monte Carlo simulation method. Tension and torsion fatigue tests were conducted on nodular cast iron with quantification of the defect size on the fracture surface. The S–N curves show a large scatter in fatigue life; shrinkages are at the origin of the fatal crack leading to the final failure. The computed HCF reliability agrees well with the experimental results. The capability of the proposed model to take into account the range of defect sizes and the type of their statistical distribution has been demonstrated. It is shown that the stress distribution at the fatigue limit is log-normal, which can be explained by the log-normal defect distribution in the nodular cast iron tested.
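A heavily simplified version of the final reliability step might look like the sketch below: defect sizes are drawn from a lognormal distribution and compared with a Murakami-type fatigue-limit law. Both the size-distribution parameters and the strength law are hypothetical stand-ins, not the response-surface model or data of the study.

```python
import numpy as np

rng = np.random.default_rng(4)

n = 200_000
# Hypothetical lognormal distribution of surface defect sizes (micrometres).
defect = rng.lognormal(mean=np.log(300.0), sigma=0.4, size=n)

def fatigue_limit(size_um):
    """Hypothetical Murakami-type law: allowable amplitude falls as size^(1/6)."""
    return 400.0 * (100.0 / size_um) ** (1.0 / 6.0)             # MPa

sigma_a = 320.0                               # applied stress amplitude, MPa (placeholder)
reliability = np.mean(fatigue_limit(defect) > sigma_a)
print(f"estimated HCF reliability at {sigma_a} MPa: {reliability:.3f}")
```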

12.
Modeling of static recrystallization in deformed copper specimens with different initial grain sizes is carried out based on a previous dislocation–grain size interaction model and a Monte Carlo simulation. From the dislocation–grain size interaction model, the stored energy of the deformed copper is calculated considering the interaction of the dislocations for the different initial grain sizes. Then, utilizing the stored energy and the Monte Carlo simulation, the kinetics of recrystallization and the recrystallized grain sizes are obtained. The JMAK plots of the modeling results show that, under the conditions of 2D modeling and site-saturated nucleation, the Avrami exponent is 2 ± 0.1. The time for 50% recrystallization and the recrystallized grain size increase with increasing initial grain size at a specific strain and are consistent with the experimental data.
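The JMAK analysis mentioned here amounts to fitting ln(-ln(1-X)) against ln t; the slope of that line is the Avrami exponent. A minimal sketch with synthetic kinetics (arbitrary rate constant and exponent, not the simulated copper data) is:

```python
import numpy as np

rng = np.random.default_rng(5)

k_true, n_true = 0.02, 2.0
t = np.linspace(1.0, 15.0, 30)
X = 1.0 - np.exp(-k_true * t**n_true)          # JMAK kinetics, X(t) = 1 - exp(-k t^n)
X = np.clip(X + rng.normal(0.0, 0.005, t.size), 1e-4, 1 - 1e-4)  # add a little scatter

y = np.log(-np.log(1.0 - X))                   # JMAK linearization
n_fit, lnk_fit = np.polyfit(np.log(t), y, 1)   # slope = Avrami exponent, intercept = ln k

t50 = (np.log(2.0) / np.exp(lnk_fit)) ** (1.0 / n_fit)
print(f"fitted Avrami exponent n = {n_fit:.2f} (true {n_true})")
print(f"time for 50% recrystallization ~ {t50:.1f}")
```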

13.
An optimal experimental set-up maximizes the value of data for statistical inferences. The efficiency of strategies for finding optimal experimental set-ups is particularly important for experiments that are time-consuming or expensive to perform. In the situation when the experiments are modeled by partial differential equations (PDEs), multilevel methods have been proven to reduce the computational complexity of their single-level counterparts when estimating expected values. For a setting where PDEs can model experiments, we propose two multilevel methods for estimating a popular criterion known as the expected information gain (EIG) in Bayesian optimal experimental design. We propose a multilevel double loop Monte Carlo, which is a multilevel strategy with double loop Monte Carlo, and a multilevel double loop stochastic collocation, which performs a high-dimensional integration on sparse grids. For both methods, the Laplace approximation is used for importance sampling that significantly reduces the computational work of estimating inner expectations. The values of the method parameters are determined by minimizing the computational work, subject to satisfying the desired error tolerance. The efficiencies of the methods are demonstrated by estimating EIG for inference of the fiber orientation in composite laminate materials from an electrical impedance tomography experiment.  相似文献   
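The plain double loop Monte Carlo estimator of the EIG, which the multilevel methods accelerate, can be illustrated on a toy linear-Gaussian experiment where the answer is known in closed form. Everything below (the model, the design parameter, the sample sizes) is an illustrative assumption; the paper's PDE models, multilevel structure, and Laplace-based importance sampling are not reproduced.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)

# Toy experiment: y = xi * theta + noise, with a Gaussian prior on theta.
sigma_p, sigma_e, xi = 1.0, 0.5, 2.0      # prior sd, noise sd, design parameter
N, M = 2000, 2000                          # outer and inner sample sizes

theta = rng.normal(0.0, sigma_p, size=N)           # outer prior draws
y = xi * theta + rng.normal(0.0, sigma_e, size=N)  # synthetic observations

log_lik = norm.logpdf(y, loc=xi * theta, scale=sigma_e)

theta_in = rng.normal(0.0, sigma_p, size=(N, M))   # inner prior draws
lik_in = norm.pdf(y[:, None], loc=xi * theta_in, scale=sigma_e)
log_evidence = np.log(lik_in.mean(axis=1))         # inner Monte Carlo estimate of p(y)

eig_dlmc = np.mean(log_lik - log_evidence)         # double loop MC estimator of the EIG
eig_exact = 0.5 * np.log(1.0 + (xi * sigma_p / sigma_e) ** 2)
print(f"DLMC EIG estimate: {eig_dlmc:.3f}   analytic: {eig_exact:.3f}")
```

The nested (inner) evidence estimate is exactly the expensive part that Laplace-based importance sampling and multilevel sampling are designed to cheapen.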

14.
An efficient method is proposed for the evaluation of the absorption and the transport scattering coefficients from a time-resolved reflectance or transmittance distribution. The procedure is based on a library of Monte Carlo simulations and is fast enough to be used in a nonlinear fitting algorithm. Tests performed against both Monte Carlo simulations and experimental measurements on tissue phantoms show that the results are significantly better than those obtained by fitting the data with the diffusion approximation, especially for low values of the scattering coefficient. The method requires an a priori assumption on the value of the anisotropy factor g. Nonetheless, the transport scattering coefficient is rather independent of the exact knowledge of the g value within the range 0.7 < g < 0.9.

15.
Monte Carlo methods provide a powerful technique for estimating the average radiation flux in a volume (or across a surface) in cases where analytical solutions may not be possible. Unfortunately, Monte Carlo simulations typically provide only integral results and do not offer any further details about the distribution of the flux with respect to space, angle, time or energy. In the functional expansion tally (FET), a Monte Carlo simulation is used to estimate the functional expansion coefficients of flux distributions with respect to an orthogonal set of basis functions. The expansion coefficients are then used in post-processing to reconstruct a series approximation to the true distribution. Discrete-event FET estimators are derived and their application to estimating radiation flux or current distributions is demonstrated. Sources of uncertainty in the FET are quantified, and estimators for the statistical and truncation errors are derived. Numerical results are presented to support the theoretical development.
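The core FET idea, tallying moments of orthogonal basis functions and reconstructing the distribution from them, can be sketched in one dimension with Legendre polynomials. The sampled "flux" below is an arbitrary stand-in distribution, not a transport calculation, and no statistical or truncation error estimators are included.

```python
import numpy as np
from numpy.polynomial import legendre
from scipy.stats import beta

rng = np.random.default_rng(7)

# Stand-in for Monte Carlo collision sites on [-1, 1]: a transformed Beta(2, 3).
x = rng.beta(2.0, 3.0, size=200_000) * 2.0 - 1.0
K = 8                                                # truncation order of the expansion

coeffs = np.empty(K + 1)
for k in range(K + 1):
    Pk = legendre.legval(x, np.eye(K + 1)[k])        # evaluate P_k at the sample points
    coeffs[k] = (2 * k + 1) / 2.0 * Pk.mean()        # FET estimate of the k-th coefficient

# Reconstruct the series approximation and compare with the exact density.
grid = np.linspace(-1.0, 1.0, 5)
recon = legendre.legval(grid, coeffs)
exact = 0.5 * beta.pdf((grid + 1.0) / 2.0, 2.0, 3.0)  # change of variables to [-1, 1]
for g, r, e in zip(grid, recon, exact):
    print(f"x = {g:+.2f}   FET: {r:.3f}   exact: {e:.3f}")
```

Each coefficient is just a sample mean of a basis function, so it can be tallied event by event during the random walk, which is what makes the method a natural extension of ordinary Monte Carlo tallies.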

16.
Quality control for small-batch production is studied. For a Bayesian mean control model with known variance, the procedure by which the Monte Carlo method determines the posterior distribution of the quality-characteristic parameter is derived and compared with the theoretical method based on a conjugate prior distribution. A worked example shows that determining the statistics of the quality-characteristic parameter by the Monte Carlo method achieves the same control performance as the theoretical conjugate-prior method, and that the Monte Carlo method does not require assuming a prior distribution for the quality-characteristic parameter, which makes it broadly applicable in actual production.
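A minimal sketch of the comparison described, assuming a normal prior so that the conjugate result is available as a reference, is given below; the prior parameters, known standard deviation, and the small-batch data are made up for illustration.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)

sigma = 2.0                                     # known process standard deviation
mu0, tau0 = 10.0, 3.0                           # normal prior on the process mean
data = np.array([11.2, 9.8, 10.9, 12.1, 10.4])  # small-batch measurements (made up)

# Monte Carlo (sampling-importance-resampling) posterior for the mean:
cand = rng.normal(mu0, tau0, size=200_000)                         # prior draws
logw = np.sum(norm.logpdf(data[:, None], loc=cand, scale=sigma), axis=0)
w = np.exp(logw - logw.max())
w /= w.sum()
post = rng.choice(cand, size=50_000, p=w)                          # resampled posterior

# Conjugate normal-normal posterior for comparison:
n = data.size
tau_n2 = 1.0 / (1.0 / tau0**2 + n / sigma**2)
mu_n = tau_n2 * (mu0 / tau0**2 + data.sum() / sigma**2)

print(f"Monte Carlo posterior : mean {post.mean():.3f}, sd {post.std():.3f}")
print(f"conjugate posterior   : mean {mu_n:.3f}, sd {np.sqrt(tau_n2):.3f}")
```

The two posteriors agree to Monte Carlo accuracy, while the sampling-based route would work unchanged for a non-conjugate or empirically specified prior.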

17.
The physical sources of randomness in quasibrittle fracture described by the cohesive crack model are discussed, and theoretical arguments for the basic form of the probability distribution are presented. The probability distribution of the size effect on the nominal strength of structures made of heterogeneous quasibrittle materials is derived, under certain simplifying assumptions, from the nonlocal generalization of Weibull theory. Attention is limited to structures of positive geometry failing at the initiation of macroscopic crack growth from a zone of distributed cracking. It is shown that, for small structures which do not dwarf the fracture process zone (FPZ), the mean size effect is deterministic, agreeing with the energetic size effect theory, which describes the size effect due to stress redistribution and the associated energy release caused by the finite size of the FPZ formed before failure. Material randomness governs the statistical distribution of the nominal strength of the structure and, for very large structure sizes, also its mean. The large-size and small-size asymptotic properties of the size effect are determined, and the reasons for the existence of intermediate asymptotics are pointed out. Asymptotic matching is then used to obtain an approximate closed-form analytical expression for the probability distribution of the failure load for any structure size. For large sizes, the probability distribution converges to the Weibull distribution of the weakest-link model, and for small sizes it converges to the Gaussian distribution justified by Daniels' fiber bundle model. Comparisons with experimental data on the size dependence of the modulus of rupture of concrete and laminates are shown. Monte Carlo simulations with finite elements are the subject of ongoing studies by Pang at Northwestern University, to be reported later.

18.
A generic version of the Lifshitz–Slyozov–Wagner (LSW) distribution is introduced. Using this generic form, a maximum likelihood (ML) estimator for the population average has been developed. The statistical properties of the estimates obtained by the ML method, as well as by the conventional sample-average method, have been assessed by Monte Carlo simulations for seven sample sizes ranging from 10 to 1000. The results show that (i) both estimators yield practically unbiased results, (ii) the standard deviation of the estimates obtained by the ML method is significantly smaller than that of the sample averages, and (iii) the distribution of the estimates is neither normal, lognormal, nor two-parameter Weibull. Percentage points of the distribution of estimates for both methods have been developed, and their use for calculating confidence limits for the population average of the LSW distribution is demonstrated by examples in this article.

19.
Nanocluster films are modeled as a network of tunnel junctions into which random voids have been introduced. The effects of network size and void distribution on electron transport are studied using Monte Carlo simulations of Coulomb-blockade-mediated transport. The random void distributions of the model networks are systematically varied by randomly deleting junctions along each row and between the different rows. The nonlinearity and voltage threshold of the I-V curves are calculated for different void topologies and network sizes. Both the threshold voltage and the nonlinearity are sensitive to lateral fluctuations of the conduction paths caused by the random voids. The nonlinearity is found to be maximized for network aspect ratios of unity or larger and for particular network topologies, and both the nonlinearity and the voltage threshold scale with network size. The behavior seen in simulation corresponds well to I-V measurements on Au nanoclusters.

20.
A novel probabilistic method for the optimization of robust design problems is presented. The approach is based on an efficient variation of the Monte Carlo simulation method. By shifting most of the computational burden outside the optimization loop, optimum designs can be achieved efficiently and accurately. Furthermore, by reweighting an initial set of samples, the objective function and constraints become smooth functions of changes in the probability distribution of the parameters, rather than the stochastic functions obtained with a standard Monte Carlo method. The approach is demonstrated on a beam truss example, and the optimum designs are verified with regular Monte Carlo simulation.
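The reweighting idea can be sketched as follows: a single set of samples is drawn once from a nominal parameter distribution, and for any candidate design the objective is re-evaluated with likelihood-ratio weights instead of fresh sampling, so it varies smoothly with the design parameter. The performance function and distribution parameters below are illustrative assumptions, not the beam truss example of the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(9)

s = 0.1                                        # fixed standard deviation of the variable
theta0 = 1.0                                   # nominal design (sampling) mean
x0 = rng.normal(theta0, s, size=50_000)        # samples drawn once, outside the loop

def g(x):                                      # hypothetical performance function
    return (x - 1.2) ** 2 + 0.05 * np.sin(25.0 * x)

def objective_reweighted(theta):
    """Reuse the fixed samples with likelihood-ratio weights for a new design mean."""
    w = norm.pdf(x0, theta, s) / norm.pdf(x0, theta0, s)
    return np.sum(w * g(x0)) / np.sum(w)

def objective_resampled(theta):
    """Standard Monte Carlo: fresh samples on every call (noisy in theta)."""
    return g(rng.normal(theta, s, size=50_000)).mean()

for theta in (0.95, 1.00, 1.05):
    print(f"theta={theta:.2f}  reweighted={objective_reweighted(theta):.5f}"
          f"  resampled={objective_resampled(theta):.5f}")
```

Because the same sample set underlies every evaluation, the reweighted objective changes deterministically with the design variable, which is what makes it usable inside a gradient-based optimizer.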
