Similar Literature
 20 similar documents found (search time: 65 ms)
1.
For real engineering systems, it is sometimes difficult to obtain sufficient data to estimate the precise values of some parameters in reliability analysis. This kind of uncertainty is called epistemic uncertainty. Because of epistemic uncertainty, the traditional universal generating function (UGF) technique is not appropriate for analyzing the reliability of systems with a performance sharing mechanism. This paper proposes a belief UGF (BUGF) based method to evaluate the reliability of multi-state series systems with a performance sharing mechanism under epistemic uncertainty. The proposed BUGF-based reliability analysis method is validated by an illustrative example and compared with interval UGF (IUGF) based methods using interval arithmetic or affine arithmetic. The example shows that the proposed BUGF-based method is more efficient than the IUGF-based methods for the reliability analysis of multi-state systems (MSSs) with a performance sharing mechanism under epistemic uncertainty.
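As background for the extension above, the classical (crisp) series-system UGF composition that the belief UGF generalizes can be sketched in a few lines. The dictionary-based representation and the demand threshold below are illustrative assumptions, not the paper's notation:

```python
from itertools import product

def series_ugf(components):
    """Classical UGF composition for a series multi-state system.

    components: list of {performance: probability} dicts, one per component.
    The system performance is the min of the component performances,
    and probabilities of independent component states multiply.
    """
    system = {}
    for states in product(*(c.items() for c in components)):
        perf = min(g for g, _ in states)
        prob = 1.0
        for _, p in states:
            prob *= p
        system[perf] = system.get(perf, 0.0) + prob
    return system

def reliability(system, demand):
    """P(system performance >= demand)."""
    return sum(p for g, p in system.items() if g >= demand)
```

A BUGF would replace the point probabilities with basic belief masses over sets of states; the composition pattern stays the same.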

2.
3.
4.
The purpose of this article is to develop an effective method for evaluating the reliability of structures with epistemic uncertainty, so as to improve the applicability of evidence theory to practical engineering problems. The main contribution is an approximate semianalytic algorithm that replaces the process of solving for the extreme values of the performance function and greatly improves the efficiency of computing the belief and plausibility measures. First, the performance function is decomposed as a combination of a series of univariate functions. Second, each univariate function is approximated as a univariate quadratic function by a second-order Taylor expansion. Finally, based on the properties of the quadratic, the maximum and minimum of each univariate function are found, and the maximum and minimum of the performance function follow from the monotonic relationship between each univariate function and their combination. Once the first- and second-order partial derivatives of the performance function with respect to each input variable are available, the belief and plausibility measures of the structure can be estimated effectively without any additional computational cost. Two numerical examples and one engineering application demonstrate the accuracy and efficiency of the proposed method.
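The overall scheme can be sketched numerically. The helper below is a hypothetical reading of the procedure: central finite differences stand in for the analytic derivatives the article assumes, and cross terms are neglected, as in a univariate decomposition:

```python
def quad_extremes(a, b, c, lo, hi):
    """Extremes of q(x) = a*x^2 + b*x + c on [lo, hi].

    Candidates are the endpoints, plus the stationary point -b/(2a)
    when it lies inside the interval.
    """
    xs = [lo, hi]
    if a != 0.0:
        xv = -b / (2.0 * a)
        if lo <= xv <= hi:
            xs.append(xv)
    vals = [a * x * x + b * x + c for x in xs]
    return min(vals), max(vals)

def bounds_by_univariate_decomposition(g, x0, intervals, h=1e-4):
    """Bound g over a box by a second-order Taylor expansion in each
    variable about x0 (cross terms neglected). Illustrative sketch only."""
    g0 = g(x0)
    lo_sum, hi_sum = g0, g0
    for i, (lo, hi) in enumerate(intervals):
        xp = x0.copy(); xp[i] += h
        xm = x0.copy(); xm[i] -= h
        d1 = (g(xp) - g(xm)) / (2 * h)          # first partial derivative
        d2 = (g(xp) - 2 * g0 + g(xm)) / h ** 2  # second partial derivative
        # univariate quadratic in (x_i - x0_i); extremes add up because
        # g is monotonic in each univariate contribution
        qmin, qmax = quad_extremes(0.5 * d2, d1, 0.0, lo - x0[i], hi - x0[i])
        lo_sum += qmin
        hi_sum += qmax
    return lo_sum, hi_sum
```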

5.
In this paper, we propose a new interval reliability analysis model for fracture reliability analysis. It is based on the non-probabilistic stress intensity factor interference model: the ratio of the volume of the safe region to the total volume of the region spanned by the variation of the standardized interval variables is taken as the measure of structural non-probabilistic reliability. We use this theory to calculate the reliability of a structure under a fracture criterion. The model requires less information about the uncertainty, so it places fewer restrictions on the analysis of an uncertain structure or system. Practical examples are given to illustrate the simplicity and practicability of the model by comparing the interval reliability analysis model with the probabilistic reliability analysis model.
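For two interval variables, the safe-volume ratio can be estimated by uniform sampling over the box. This is only an illustrative sketch of the measure, assuming a simple limit state M = K - S for the stress intensity factor interference model:

```python
import random

def interval_reliability(k_int, s_int, n=100_000, seed=0):
    """Non-probabilistic interval reliability sketch: the fraction of
    the box [K] x [S] in which the limit state M = K - S is positive,
    estimated by uniform sampling (no probability content is implied;
    the ratio is purely a volume measure)."""
    rng = random.Random(seed)
    safe = 0
    for _ in range(n):
        k = rng.uniform(*k_int)
        s = rng.uniform(*s_int)
        if k - s > 0:
            safe += 1
    return safe / n
```

For well-separated intervals the measure reaches 1, reflecting guaranteed safety; overlapping intervals yield a value strictly between 0 and 1.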

6.
To support effective decision making, engineers should comprehend and manage various uncertainties throughout the design process. Unfortunately, in today's modern systems, uncertainty analysis can become cumbersome and computationally intractable for one individual or group to manage. This is particularly true for systems composed of a large number of components. In many cases, these components may be developed by different groups and even run on different computational platforms. This paper proposes an approach for decomposing the uncertainty analysis task among the various components comprising a feed-forward system and synthesizing the local uncertainty analyses into a system uncertainty analysis. Our proposed decomposition-based multicomponent uncertainty analysis approach is shown to be provably convergent in distribution under certain conditions. The proposed method is illustrated on uncertainty quantification for a multidisciplinary gas turbine system and is compared to a traditional system-level Monte Carlo uncertainty analysis approach. Copyright © 2014 John Wiley & Sons, Ltd.

7.
Processing of Precision Measurement Results Based on Uncertainty
杨浪萍 (Yang Langping), 张峰 (Zhang Feng). 《工业计量》 (Industrial Measurement), 2002, 12(4): 46-48
Starting from the correct treatment of precision measurement results, and using practical precision measurement examples, this paper analyses the method of processing precision measurement results with measurement uncertainty and discusses the significance of doing so.
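A minimal sketch of the standard (GUM-style) calculations involved, assuming uncorrelated input quantities; the function names are illustrative:

```python
import math
import statistics

def type_a_uncertainty(readings):
    """Type A standard uncertainty: the experimental standard
    deviation of the mean of repeated readings."""
    s = statistics.stdev(readings)
    return s / math.sqrt(len(readings))

def combined_uncertainty(sensitivities, uncertainties):
    """GUM-style combined standard uncertainty for uncorrelated inputs:
    u_c = sqrt(sum_i (c_i * u_i)^2), where c_i are the sensitivity
    coefficients (partial derivatives of the measurement model)."""
    return math.sqrt(sum((c * u) ** 2
                         for c, u in zip(sensitivities, uncertainties)))
```

An expanded uncertainty would multiply u_c by a coverage factor (commonly k = 2 for roughly 95% coverage).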

8.
In this article, the authors present a general methodology for age-dependent reliability analysis of degrading or ageing components, structures and systems. The methodology is based on Bayesian inference and its ability to incorporate prior information, together with the idea that ageing can be viewed as an age-dependent change of beliefs about reliability parameters (mainly the failure rate): beliefs change not only because new failure data or other information become available over time, but also continuously, due to the flow of time itself. The main objective of this article is to show clearly how practitioners can apply Bayesian methods to risk and reliability analysis in the presence of ageing phenomena. The methodology describes step-by-step failure rate analysis of ageing components, from Bayesian model building to its verification and generalization with Bayesian model averaging, which, as the authors suggest, could serve both as an alternative to various goodness-of-fit assessment tools and as a universal tool for coping with various sources of uncertainty. The proposed methodology can deal with sparse and rare failure events, as is the case for electrical components, piping systems and various other highly reliable systems. In a case study of electrical instrumentation and control components, the methodology was applied to analyse age-dependent failure rates together with the treatment of uncertainty due to age-dependent model selection. Copyright © 2013 John Wiley & Sons, Ltd.
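The simplest concrete instance of such a change of beliefs is the conjugate Gamma-Poisson update of a constant failure rate; this is a minimal sketch, not the article's full age-dependent model:

```python
def update_failure_rate(alpha, beta, failures, exposure_time):
    """Conjugate Bayesian update for a constant failure rate.

    Prior: lambda ~ Gamma(alpha, beta); likelihood: number of failures
    in `exposure_time` hours is Poisson(lambda * exposure_time).
    Posterior: Gamma(alpha + failures, beta + exposure_time).
    Returns the posterior parameters and the posterior mean rate.
    """
    a = alpha + failures
    b = beta + exposure_time
    return a, b, a / b
```

Repeating the update as new operating experience arrives is exactly the "change of beliefs over time" the article describes; age dependence would additionally let the prior drift with component age.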

9.
In optimization under uncertainty for engineering design, the behavior of the system outputs due to uncertain inputs needs to be quantified at each optimization iteration, but this can be computationally expensive. Multifidelity techniques can significantly reduce the computational cost of Monte Carlo sampling methods for quantifying the effect of uncertain inputs, but existing multifidelity techniques in this context apply only to Monte Carlo estimators that can be expressed as a sample average, such as estimators of statistical moments. Information reuse is a particular multifidelity method that treats previous optimization iterations as lower fidelity models. This work generalizes information reuse to be applicable to quantities whose estimators are not sample averages. The extension makes use of bootstrapping to estimate the error of estimators and the covariance between estimators at different fidelities. Specifically, the horsetail matching metric and quantile function are considered as quantities whose estimators are not sample averages. In an optimization under uncertainty for an acoustic horn design problem, generalized information reuse demonstrated computational savings of over 60% compared with regular Monte Carlo sampling.
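The bootstrap step for a non-sample-average estimator such as a quantile can be sketched as follows (illustrative names; the paper additionally bootstraps covariances between estimators at different fidelities):

```python
import random

def empirical_quantile(xs, q):
    """Simple empirical quantile: the order statistic at index q*n."""
    s = sorted(xs)
    idx = min(int(q * len(s)), len(s) - 1)
    return s[idx]

def bootstrap_se(samples, estimator, n_boot=500, seed=0):
    """Bootstrap standard error of an arbitrary estimator:
    resample with replacement, re-evaluate the estimator,
    and take the spread of the replicates."""
    rng = random.Random(seed)
    n = len(samples)
    reps = []
    for _ in range(n_boot):
        boot = [samples[rng.randrange(n)] for _ in range(n)]
        reps.append(estimator(boot))
    mean = sum(reps) / n_boot
    var = sum((r - mean) ** 2 for r in reps) / (n_boot - 1)
    return var ** 0.5
```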

10.
The first-order reliability method (FORM) has mostly been used for solving reliability-based design optimization (RBDO) problems efficiently. However, the second-order reliability method (SORM) is required to estimate the probability of failure accurately for highly nonlinear performance functions. Despite its accuracy, applying SORM to RBDO is quite challenging because of the unaffordable numerical burden of calculating the Hessian. To reduce this effort, a quasi-Newton approach that approximates the Hessian is introduced in this study instead of calculating the true Hessian. The proposed SORM with the approximated Hessian requires only the computations already used in FORM, leading to very efficient and accurate reliability analysis. It also employs a generalized chi-squared distribution to achieve better accuracy. Furthermore, a SORM-based inverse reliability method is proposed: an accurate reliability index corresponding to a target probability of failure is updated using the proposed SORM, and two approaches for finding an accurate most probable point from the updated reliability index are presented. The SORM-based inverse analysis is then extended to RBDO to obtain a reliability-based optimum design that satisfies probabilistic constraints more accurately, even for highly nonlinear systems. Numerical results show that the proposed reliability analysis and RBDO achieve the efficiency of FORM and the accuracy of SORM at the same time. Copyright © 2014 John Wiley & Sons, Ltd.
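As a baseline, the FORM quantities that the proposed SORM reuses come from the HL-RF iteration in standard normal space. A minimal sketch with numerical gradients (the variable names are illustrative):

```python
import math

def form_beta(g, n, tol=1e-8, h=1e-6, iters=100):
    """HL-RF iteration in standard normal space: find the most
    probable point u* of the limit state g(u) = 0 and return the
    reliability index beta = ||u*||."""
    u = [0.0] * n
    for _ in range(iters):
        g0 = g(u)
        grad = []
        for i in range(n):
            up = u.copy()
            up[i] += h
            grad.append((g(up) - g0) / h)  # forward-difference gradient
        norm2 = sum(d * d for d in grad)
        # HL-RF update: project onto the linearized limit state
        coef = (sum(d * ui for d, ui in zip(grad, u)) - g0) / norm2
        u_new = [coef * d for d in grad]
        diff = sum((a - b) ** 2 for a, b in zip(u_new, u))
        u = u_new
        if diff < tol:
            break
    return math.sqrt(sum(ui * ui for ui in u))
```

The quasi-Newton SORM of the paper would build a Hessian approximation from the successive gradients this iteration already computes.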

11.
An uncertain finite element model updating method based on interval analysis is proposed. Building on the eigenvalue analysis theory for structures with interval parameters and on deterministic finite element model updating, and assuming that both the uncertainty and the error of the initial finite element model are small, a sensitivity-based iterative scheme is derived for the midpoint values and the uncertain intervals of the parameters to be updated. A three-degree-of-freedom spring-mass system and a composite plate are taken as examples, with Latin hypercube sampling used to construct simulated test samples of modal parameters for the simulation study. The results show that when the simulated test samples accurately reflect the interval characteristics of the structural modal parameters, the method converges with high accuracy and efficiency, and the updated computed modal parameters accurately reflect the interval characteristics of the test data. The proposed method is suitable for finite element model updating problems in which test samples are scarce and only intervals of the test modal parameters can be obtained.

12.
Epistemic and aleatory uncertain variables always exist simultaneously in multidisciplinary systems and can be modeled by probability theory and evidence theory, respectively. The propagation of uncertainty through coupled subsystems and the strong nonlinearity of a multidisciplinary system make reliability analysis difficult and computationally expensive. In this paper, a novel reliability analysis procedure is proposed for multidisciplinary systems with epistemic and aleatory uncertain variables. First, the probability density function of the aleatory variables is assumed to be piecewise uniform based on the Bayes method, and an approximate most probable point is found by an equivalent normalization method. Then, importance sampling is used to calculate the failure probability together with its variance and coefficient of variation. The effectiveness of the procedure is demonstrated by two numerical examples. Copyright © 2013 John Wiley & Sons, Ltd.
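The importance sampling step can be sketched as sampling around a most probable point (here assumed already known) in standard normal space and reweighting by the density ratio. This is an illustrative sketch, not the paper's exact procedure:

```python
import math
import random

def importance_sampling_pf(g, mpp, n=50_000, seed=0):
    """Importance sampling for a failure probability P(g(U) < 0),
    U standard normal: sample from a unit normal centered at `mpp`
    and reweight each failure sample by phi(u) / phi(u - mpp)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        u = [m + rng.gauss(0.0, 1.0) for m in mpp]
        if g(u) < 0.0:  # failure domain
            # log density ratio for independent unit normals
            log_w = sum(-0.5 * ui * ui + 0.5 * (ui - mi) ** 2
                        for ui, mi in zip(u, mpp))
            total += math.exp(log_w)
    return total / n
```

Centering the proposal at the MPP concentrates samples in the failure domain, which is what makes the variance (and hence the coefficient of variation) small for rare events.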

13.
A path-following non-linear elastic analysis for structures composed of assemblages of flat slender elastic panels is presented. The proposed path-following method employs FEM technology and a kinematical model to analyse these structures using a Koiter asymptotic approach, which makes it possible to verify the accuracy achieved by the asymptotic method. The proposed mixed path-following formulation is both efficient and robust with regard to the locking extrapolation phenomenon that strongly affects compatible formulations. The use of an HC finite element avoids the problem of finite rotations in space while maintaining a high degree of continuity, keeping the numerical formulation simple and efficient. Copyright © 2002 John Wiley & Sons, Ltd.

14.
Failure mode and effects analysis (FMEA) is an effective tool for assessing the risk of a system or process in an uncertain environment. However, how to handle the uncertainty in subjective assessments remains an open issue. In this paper, a novel method for dealing with the uncertainty in the subjective assessments of FMEA experts is proposed within the framework of Dempster–Shafer evidence theory. First, the degree of uncertainty of an assessment is measured by the ambiguity measure. Then, the uncertainty is transformed into the reliability of each FMEA expert and the relative importance of each risk factor. After that, the assessments from the FMEA team are fused with a discounting-based combination rule to address potential conflict. Moreover, to avoid different failure modes receiving the same ranking under the classical risk priority number method, the gray relational projection method (GRPM) is adopted for ranking the risk priorities of failure modes. Finally, an application of the improved FMEA model to a sheet steel production process verifies the reliability and validity of the proposed method.
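The evidence-fusion core of such a method is Dempster's rule of combination; a minimal sketch over frozenset focal elements (the paper applies a discounting-based variant before combining):

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    m1, m2: dicts mapping frozenset focal elements to masses.
    Mass assigned to empty intersections is treated as conflict
    and renormalized away.
    """
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    k = 1.0 - conflict  # normalization constant
    return {s: v / k for s, v in combined.items()}
```

Discounting each expert's masses by a reliability factor before calling this rule is what keeps a single conflicting (or unreliable) assessment from dominating the fused result.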

15.
When dealing with practical problems of stress–strength reliability, one can work with fatigue life data and make use of the well-known relation between stress and cycles until failure. For some materials, this kind of data can involve extremely large values. In this context, this paper discusses the problem of estimating the reliability index R = P(Y < X) for stress–strength reliability, where stress Y and strength X are independent q-exponential random variables. This choice is based on the q-exponential distribution's capability to model data with extremely large values. We develop the maximum likelihood estimator for the index R and analyze its behavior by means of simulated experiments. Moreover, confidence intervals are developed based on parametric and nonparametric bootstrap. The proposed approach is applied to two case studies involving experimental data: the first is related to the analysis of high-cycle fatigue of ductile cast iron, whereas the second evaluates specimen size effects on the gigacycle fatigue properties of high-strength steel. The adequacy of the q-exponential distribution for both case studies, together with the point and interval estimates based on the maximum likelihood estimator of the index R, is provided. A comparison between the q-exponential and both the Weibull and exponential distributions shows that the q-exponential distribution gives better results for fitting both stress and strength experimental data, as well as for the estimated R index. Copyright © 2016 John Wiley & Sons, Ltd.
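Independently of the chosen distribution, the index R = P(Y < X) has a simple nonparametric estimate from paired samples; a minimal sketch (the paper instead fits q-exponential distributions by maximum likelihood):

```python
def stress_strength_R(strength, stress):
    """Nonparametric estimate of R = P(Y < X): the fraction of
    (strength, stress) pairs in which the stress sample falls
    below the strength sample (the Mann-Whitney statistic)."""
    wins = sum(1 for x in strength for y in stress if y < x)
    return wins / (len(strength) * len(stress))
```

A parametric estimator plugs fitted distribution parameters into the closed-form expression for P(Y < X) instead, which is what the MLE-based approach of the paper does for the q-exponential case.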

16.
The formulation of a probability-stress-life (P-S-N) curve is a necessary step beyond the basic S-N relation when dealing with reliability. This paper presents a model, relevant to materials that exhibit a fatigue limit, which treats the number of cycles to failure and the occurrence of failure itself as statistically independent events, described by different distributions and/or different degrees of scatter. Combining the two as a parallel system leads to the proposed model. For the case where the S-N relation is Basquin's law, the formulations of the probability density function, cumulative distribution function, quantiles, and parameter and quantile confidence intervals are presented in a procedure that accommodates practically any testing strategy. The result is a flexible model combined with tools that deliver a wide range of the information needed in the design phase. Finally, an extension to include static strength, and the applicability to fatigue crack growth and defect-based fatigue approaches, are presented.

17.
An analysis has been made of the mathematical relationship between two alternative models for reliability and risk estimation under the assumption of mutual independence. In cases where the reliability formulation is expressible as a compound union event, the resulting reliability expressions are analogous to the Bernoulli and Poisson trials processes. Nonparametric inequality relationships are developed showing that a Bayesian-Bernoulli model always predicts event probabilities that are less than the Bernoulli probabilities, which in turn are less than or equal to the probabilities predicted by the finer-grained Poisson trials model. An analysis of the maximum relative prediction error indicates that when the individual probabilities are less than 0.1, the relative error between the Bernoulli and Poisson models is always less than 5 percent. The results are shown to be useful in system reliability, engineered design lifetime risk analysis, and simulation applications in which the model is based on independent trials.
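The small-probability regime is easy to probe numerically. The sketch below compares the independent-trials union probability with a Poisson approximation at rate sum(p_i); the paper's exact model definitions may differ, but the sub-5% relative error for individual probabilities below 0.1 is straightforward to reproduce:

```python
import math

def union_bernoulli(ps):
    """Probability that at least one of n independent events occurs:
    1 - prod(1 - p_i)."""
    prod = 1.0
    for p in ps:
        prod *= (1.0 - p)
    return 1.0 - prod

def union_poisson(ps):
    """Poisson-process approximation with total rate sum(p_i):
    1 - exp(-sum(p_i))."""
    return 1.0 - math.exp(-sum(ps))

def relative_error(ps):
    """Relative discrepancy between the two union models."""
    b, q = union_bernoulli(ps), union_poisson(ps)
    return abs(b - q) / max(b, q)
```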

18.
The main objective of this work is to save computer processing time in the reliability analysis of laminated composite structures by using artificial neural networks. This is particularly important when the reliability index is a constraint in the optimization of structural performance, because the search for an optimum structural design also demands very long processing times. Reliability methods such as Standard Monte Carlo (SMC), Monte Carlo with Importance Sampling (MC–IS), the First Order Reliability Method (FORM) and FORM with Multiple Check Points (FORM–MCPs) are used to compare the solution and the processing time when the finite element method (FEM) is employed and when the finite element analysis (FEA) is replaced by trained artificial neural networks (ANNs). Two ANNs are used here: the Multilayer Perceptron Network (MPN) and the Radial Basis Network (RBN). Several examples are presented, including a shell with geometrically non-linear behavior, showing the advantages of using this methodology.

19.
Reliability analysis with both aleatory and epistemic uncertainties is investigated in this paper. The aleatory uncertainties are described with random variables, and the epistemic uncertainties are handled with evidence theory. Several methods have been proposed to estimate the bounds of the failure probability, but the existing ones suffer from the dimensionality challenge posed by the epistemic variables. To overcome this challenge, a so-called random-set based Monte Carlo simulation (RS-MCS) method derived from the theory of random sets is offered. Nevertheless, RS-MCS is computationally expensive, so an active learning Kriging (ALK) model that only needs to correctly predict the sign of the performance function is introduced and closely integrated with RS-MCS. The proposed method, termed ALK-RS-MCS, accurately predicts the bounds of the failure probability with as few function calls as possible. Moreover, an optimization method based on the Karush–Kuhn–Tucker conditions is proposed to make the estimation of the failure probability interval more efficient with the Kriging model. The efficiency and accuracy of the proposed approach are demonstrated with four examples. Copyright © 2016 John Wiley & Sons, Ltd.

20.
This article introduces a method that combines the collaborative optimization framework with an inverse reliability strategy to assess the uncertainty encountered in the multidisciplinary design process. The method conducts the subsystem analysis and optimization concurrently and then improves the process of searching for the most probable point (MPP), which significantly reduces the load on the system-level optimizer. This advantage is especially prominent for large-scale engineering system design. Meanwhile, because the disciplinary analyses are treated as equality constraints in the disciplinary optimization, the computational load can be reduced further. Examples are used to illustrate the accuracy and efficiency of the proposed method.
