Similar Documents
20 similar documents found.
1.
Reliability optimization is an important and challenging topic in both engineering and industrial settings, as its objective is to design a highly reliable system that operates safely and efficiently under constraints. The redundancy allocation problem (RAP), one of the best-known problems in reliability optimization, has been the subject of many studies over the past few decades. RAP aims to find the best structure and the optimal redundancy level for each subsystem so as to maximize overall system reliability under a set of constraints. In all previous RAP studies, the reliability of the components is treated as constant over the system's mission time. However, reliability is time-dependent and needs to be considered and monitored throughout the system's lifetime. In this paper, the reliability of components is modeled as a function of time, and the RAP is reformulated by introducing a new criterion called 'mission design life', defined as the integral of the system reliability function over the mission time. We propose an efficient algorithm for this problem and demonstrate its performance on two examples. Furthermore, we demonstrate the importance of the new approach using a benchmark RAP problem. Copyright © 2017 John Wiley & Sons, Ltd.
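
The 'mission design life' criterion is just the integral of the time-dependent system reliability over the mission window, so it can be evaluated numerically for any candidate redundancy configuration. Below is a minimal sketch for a series arrangement of parallel subsystems with exponential components; all failure rates, redundancy levels, and the mission time are illustrative placeholders, not values from the paper.

```python
import numpy as np

def subsystem_reliability(t, lam, n):
    """Parallel subsystem of n identical components, each with
    exponential reliability exp(-lam * t)."""
    return 1.0 - (1.0 - np.exp(-lam * t)) ** n

def mission_design_life(lams, ns, T, steps=10_000):
    """Mission design life = integral of system reliability over [0, T],
    computed with the trapezoidal rule for a series system."""
    t = np.linspace(0.0, T, steps)
    r_sys = np.ones_like(t)
    for lam, n in zip(lams, ns):
        r_sys *= subsystem_reliability(t, lam, n)
    return float(np.sum(0.5 * (r_sys[1:] + r_sys[:-1]) * np.diff(t)))

# Compare two candidate redundancy allocations for a 3-subsystem design
print(mission_design_life([0.002, 0.001, 0.003], ns=[2, 1, 3], T=1000.0))
print(mission_design_life([0.002, 0.001, 0.003], ns=[3, 1, 2], T=1000.0))
```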

2.
A new Bayesian method for estimating the reliability indices of electronic devices is proposed. A prior distribution is first derived from sample test data of previous production batches; posterior information is then obtained from sample test data of the current batch; finally, the prior distribution and the current data are combined through Bayes' formula to produce the required estimates. The method is applied to obtain point and interval estimates of the failure rate λ, the mean time between failures θ, and the reliability R(t) of electronic devices, and is illustrated with a numerical example. By enlarging the effective sample size, the method addresses the problem that electronic-device tests yield few samples and hence too little data for reliability index estimation.
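
As a sketch of the scheme described above, assume exponential lifetimes so that a Gamma(a, b) prior on the failure rate λ is conjugate: after observing k failures in total time on test T for the current batch, the posterior is Gamma(a + k, b + T), from which point and interval estimates of λ, θ = 1/λ, and R(t) follow. The prior parameters standing in for the "previous batches" are hypothetical.

```python
import math
from scipy import stats

def bayes_exponential(a_prior, b_prior, k, T, t, alpha=0.10):
    """Conjugate Bayesian update for an exponential failure rate.
    a_prior, b_prior: Gamma prior fitted from earlier batches;
    k, T: failures and total time on test in the current batch."""
    a_post, b_post = a_prior + k, b_prior + T
    lam_hat = a_post / b_post                    # posterior mean of lambda
    theta_hat = 1.0 / lam_hat                    # MTBF point estimate
    r_hat = math.exp(-lam_hat * t)               # reliability R(t)
    lam_lo, lam_hi = stats.gamma.ppf([alpha / 2, 1 - alpha / 2],
                                     a_post, scale=1.0 / b_post)
    return lam_hat, theta_hat, r_hat, (lam_lo, lam_hi)

# Hypothetical numbers: prior Gamma(2, 4e4); 1 failure in 5e4 device-hours
print(bayes_exponential(2.0, 4.0e4, 1, 5.0e4, t=1000.0))
```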

3.
For costly and dangerous experiments, growing attention has been paid to the reliability analysis of zero-failure data, with many new findings worldwide, especially in China. Existing reliability theory relies on a known lifetime distribution, such as the Weibull or gamma distribution, and is therefore ineffective when the lifetime probability distribution is unknown. To this end, this article proposes the grey bootstrap method from information-poor theory for the reliability analysis of zero-failure data when the lifetime probability distribution is either known or unknown. The grey bootstrap method generates many simulated zero-failure data sets from a few zero-failure observations and estimates the lifetime probability distribution by means of an empirical failure probability function defined in this article. The experimental investigation shows that the grey bootstrap method is effective for reliability analysis with only a few zero-failure data points and without any prior information about the lifetime probability distribution. Copyright © 2011 John Wiley & Sons, Ltd.
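
The "grey" half of such methods typically rests on grey prediction models like GM(1,1), which extrapolate a very short data series; paired with bootstrap resampling of the few observations, this yields many simulated data sets. The sketch below shows only the GM(1,1) building block plus a crude bootstrap wrapper (the paper's empirical failure probability function is not reproduced here); the input series is an arbitrary placeholder.

```python
import numpy as np

def gm11_forecast(x0, horizon=3):
    """Fit a GM(1,1) grey model to a short positive series and forecast.
    Returns the fitted/forecast series on the original scale."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                          # accumulated series (1-AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])               # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.diff(x1_hat, prepend=0.0)         # inverse AGO

# Crude bootstrap wrapper: resample the short series (re-sorted to keep a
# monotone trend), fit GM(1,1) to each resample, and average the forecasts
rng = np.random.default_rng(0)
series = np.array([2.1, 2.4, 2.8, 3.3])         # illustrative placeholder data
fits = [gm11_forecast(np.sort(rng.choice(series, size=series.size)))
        for _ in range(1000)]
print(np.mean(fits, axis=0))
```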

4.
In this study, we introduce reliability models for a device subject to two dependent failure processes: soft failure due to degradation and hard failure due to random shocks, with a hard-failure threshold that declines as degradation accumulates. Owing to the nature of degradation in complex devices such as microelectromechanical systems, a degraded system is more vulnerable to force and stress during operation. We address two scenarios for the changing hard-failure threshold. In Case 1, the initial hard-failure threshold drops to a lower level as soon as the overall degradation reaches a critical value. In Case 2, the hard-failure threshold decreases gradually, by an amount proportional to the change in degradation. A condition-based maintenance model derived from a failure limit policy is presented to ensure that the device operates below a certain level of degradation. Finally, numerical examples are presented to illustrate the developed reliability and maintenance models, along with a sensitivity analysis. Copyright © 2016 John Wiley & Sons, Ltd.
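
Models of this kind are straightforward to evaluate by Monte Carlo simulation. The sketch below implements Case 1 (the threshold drops once degradation passes a critical level) for a linear degradation path with additive shock damage; every numerical parameter is an illustrative placeholder rather than a value from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def reliability_case1(t, n_sim=50_000):
    """Monte Carlo P(no soft or hard failure by time t) under Case 1."""
    H_SOFT, D0, D1, X_CRIT = 1.25e-3, 1.6, 1.4, 1.0e-3  # placeholder thresholds
    survive = 0
    for _ in range(n_sim):
        beta = rng.normal(8.5e-7, 6.0e-8)               # degradation rate
        n = rng.poisson(2.5e-3 * t)                     # shocks by time t
        times = np.sort(rng.uniform(0.0, t, n))
        loads = rng.normal(1.2, 0.2, n)                 # shock stresses W_j
        jumps = np.cumsum(rng.normal(1e-4, 2e-5, n))    # cumulative shock damage
        # degradation level just before each shock arrives
        x_at = beta * times + np.concatenate(([0.0], jumps[:-1]))
        thresh = np.where(x_at > X_CRIT, D1, D0)        # threshold drops past X_CRIT
        hard = np.any(loads > thresh)
        soft = beta * t + (jumps[-1] if n else 0.0) > H_SOFT
        survive += not (hard or soft)
    return survive / n_sim

print(reliability_case1(300.0))
```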

5.
Product reliability is a very important issue for the competitive strategy of industries. To estimate a product's reliability, parametric inferential methods are required to evaluate survival test data, which is a fairly expensive data source. Such costly information usually imposes additional compromises in product development and new challenges to be overcome throughout the product's life cycle. However, manufacturers also keep field failure data for warranty and maintenance purposes, which can be a low-cost data source for reliability estimation. Field failure data are very difficult to evaluate with parametric inferential methods because of their small, highly censored samples, which quite often mix failure modes. In this paper a method for reliability estimation using field failure data is proposed. The proposal is based on non-parametric inferential methods combined with resampling techniques to derive confidence intervals for the reliability estimates. Test results show the adequacy of the proposed method for calculating reliability estimates and their confidence intervals for different populations, including cases with highly right-censored failure data. The method is shown to be particularly useful when the sampling distribution is not known, which is the case in a large number of practical reliability evaluations. Copyright © 2006 John Wiley & Sons, Ltd.
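
A common non-parametric pairing of this kind is a Kaplan-Meier survival estimate with a percentile bootstrap for the confidence interval; the paper's exact estimator may differ, so treat this as a sketch under that assumption. Ties between failures and censorings are handled naively here, and the warranty records are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def km_reliability(t_query, times, events):
    """Kaplan-Meier estimate of R(t_query); events: 1 = failure, 0 = censored."""
    order = np.argsort(times)
    t, e = np.asarray(times, float)[order], np.asarray(events)[order]
    at_risk = len(t) - np.arange(len(t))
    surv = np.cumprod(np.where(e == 1, 1.0 - 1.0 / at_risk, 1.0))
    idx = np.searchsorted(t, t_query, side="right") - 1
    return surv[idx] if idx >= 0 else 1.0

def bootstrap_ci(times, events, t_query, n_boot=2000, alpha=0.10):
    """Percentile bootstrap interval for R(t_query) from censored field data."""
    times, events = np.asarray(times, float), np.asarray(events)
    est = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, len(times), len(times))
        est[b] = km_reliability(t_query, times[idx], events[idx])
    return np.quantile(est, [alpha / 2, 1 - alpha / 2])

# Hypothetical warranty records: mostly right-censored observations (event=0)
times  = [120, 340, 560, 610, 700, 700, 800, 950, 990, 1000]
events = [1,   0,   1,   0,   0,   1,   0,   0,   0,   0]
print(km_reliability(600, times, events), bootstrap_ci(times, events, 600))
```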

6.
For high-quality, long-life products, only zero-failure data may be obtainable over a mission period. In zero-failure reliability assessment, current models cannot deliver point estimates and confidence interval estimates of the distribution parameters simultaneously, and the credibility of the results may suffer if both are forced from the same model. A new model addressing this consistency problem is proposed in this paper. In the proposed model, point estimates of reliability are obtained from the lifetime probability distribution derived by the matching distribution curve method, while confidence interval estimates of reliability are obtained from new samples generated from that lifetime distribution using the parametric bootstrap method. Analysis of zero-failure data from torque motors in real operation shows that the new model not only meets the requirements of reliability assessment but also improves the accuracy of the reliability interval estimation.
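
The parametric-bootstrap half of the procedure is generic: given point estimates of the lifetime distribution (a Weibull is assumed below purely for illustration; the paper obtains its estimates via the matching distribution curve method), resample complete datasets of the original size, refit each one, and take percentile bounds on R(t). All numbers are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

def parametric_bootstrap_ci(shape, scale, n, t, n_boot=1000, alpha=0.10):
    """Percentile CI for R(t): resample size-n datasets from the fitted
    Weibull(shape, scale) and refit each resample."""
    r_boot = np.empty(n_boot)
    for b in range(n_boot):
        sample = stats.weibull_min.rvs(shape, scale=scale, size=n,
                                       random_state=rng)
        c_hat, _, s_hat = stats.weibull_min.fit(sample, floc=0)
        r_boot[b] = stats.weibull_min.sf(t, c_hat, scale=s_hat)
    return np.quantile(r_boot, [alpha / 2, 1 - alpha / 2])

# Hypothetical point estimates produced by the zero-failure analysis step
print(parametric_bootstrap_ci(shape=2.2, scale=9000.0, n=12, t=3000.0))
```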

7.
When dealing with practical problems of stress-strength reliability, one can work with fatigue life data and make use of the well-known relation between stress and cycles until failure. For some materials, this kind of data can involve extremely large values. In this context, this paper discusses the problem of estimating the reliability index R = P(Y < X) for stress-strength reliability, where stress Y and strength X are independent q-exponential random variables. This choice is based on the q-exponential distribution's capability to model data with extremely large values. We develop the maximum likelihood estimator for the index R and analyze its behavior by means of simulated experiments. Moreover, confidence intervals are developed based on parametric and non-parametric bootstrap methods. The proposed approach is applied to two case studies involving experimental data: the first concerns high-cycle fatigue of ductile cast iron, whereas the second evaluates specimen size effects on the gigacycle fatigue properties of high-strength steel. The adequacy of the q-exponential distribution for both case studies and the point and interval estimates based on the maximum likelihood estimator of the index R are provided. A comparison between the q-exponential and both the Weibull and exponential distributions shows that the q-exponential distribution gives better results for fitting both stress and strength experimental data as well as for the estimated R index. Copyright © 2016 John Wiley & Sons, Ltd.
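
Once the q-exponential parameters for stress and strength have been estimated, R = P(Y < X) is easy to approximate by Monte Carlo using inverse-transform sampling (for 1 < q < 2 the quantile function has the closed form used below). The parameter values are placeholders, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(3)

def q_exponential_sample(q, lam, size):
    """Inverse-transform sampling from a q-exponential with pdf
    (2-q)*lam*[1-(1-q)*lam*x]^(1/(1-q)), valid here for 1 < q < 2."""
    u = rng.uniform(size=size)
    return (1.0 - (1.0 - u) ** ((1.0 - q) / (2.0 - q))) / ((1.0 - q) * lam)

def stress_strength_r(q_x, lam_x, q_y, lam_y, n=1_000_000):
    """Monte Carlo estimate of R = P(Y < X): strength X exceeds stress Y."""
    x = q_exponential_sample(q_x, lam_x, n)   # strength
    y = q_exponential_sample(q_y, lam_y, n)   # stress
    return float(np.mean(y < x))

print(stress_strength_r(q_x=1.2, lam_x=0.5e-3, q_y=1.3, lam_y=2.0e-3))
```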

8.
The traditional probabilistic reliability analysis method requires the probability distributions of all uncertain parameters. In practical applications, however, the distributions of some parameters may not be precisely known owing to a lack of sufficient sample data. Probability theory cannot directly measure the reliability of structures with epistemic uncertainty, i.e., subjective randomness and fuzziness. Hence, a hybrid reliability analysis (HRA) problem arises when aleatory and epistemic uncertainties coexist in a structure. In this paper, by combining probability theory and uncertainty theory into chance theory, a probability-uncertainty hybrid model is established, and a new quantification method for structural reliability based on uncertain random variables is presented that simultaneously satisfies the duality of random variables and the subadditivity of uncertain variables; a reliability index is then defined from the chance-theoretic expected value and variance. In addition, formulas for chance-theory-based reliability and the reliability index are derived to uniformly assess the reliability of structures under hybrid aleatory and epistemic uncertainties. Numerical experiments illustrate the validity of the proposed method, whose results provide a more accurate assessment of a structural system under mixed uncertainties than those obtained separately from probability theory or uncertainty theory.
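
One plausible numerical reading of an expectation-and-variance-based index: for a limit state g = strength - load with independent terms, take the moments of the epistemic (uncertain) term from its inverse uncertainty distribution and combine them with the moments of the aleatory term. This is a simplified stand-in for the paper's chance-theoretic derivation, and the distributions below are hypothetical.

```python
import numpy as np

def uncertain_moments(inv_dist, n_grid=100_001):
    """Moments of an uncertain variable from its inverse uncertainty
    distribution Phi^{-1}(alpha), via E = integral_0^1 Phi^{-1}(a) da."""
    alpha = np.linspace(0.0, 1.0, n_grid)[1:-1]   # open interval (0, 1)
    vals = inv_dist(alpha)
    e = vals.mean()
    return e, ((vals - e) ** 2).mean()

# g = strength - load: strength epistemic with a linear uncertainty
# distribution on [520, 600] MPa (hypothetical), load ~ N(450, 30^2) aleatory
e_s, v_s = uncertain_moments(lambda a: 520.0 + a * (600.0 - 520.0))
e_l, v_l = 450.0, 30.0 ** 2
beta = (e_s - e_l) / np.sqrt(v_s + v_l)           # hybrid reliability index
print(beta)
```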

9.
Electromechanical products fail during service because of part failures, and repair by reassembly (part replacement) alters the original system reliability model, so the reliability of such equipment must be re-examined and re-evaluated. Based on the failure-time distribution density functions of the parts in an electromechanical system, this paper studies how the age structure of parts evolves during service under reassembly maintenance and develops a mathematical reliability model for the system. Through simulation, the evolution of the age structure, reliability, and failure rate during service is explored, and the influence of the parameters of the failure-time density function on system reliability is studied quantitatively. The results are significant for assessing the reliability and whole-life-cycle failure rate of mechanical systems and for formulating sound maintenance strategies.
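
The age structure under replacement-on-failure maintenance can be reproduced with a renewal simulation: run each part "socket" forward, replacing a part whenever it fails, and record the age of the in-service part at the observation time. Weibull lifetimes and all parameters below are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(5)

def in_service_ages(t_obs, n_sockets=20_000, shape=2.0, scale=1000.0):
    """Simulate replacement-on-failure renewals with Weibull(shape, scale)
    lifetimes; return each socket's current part age at time t_obs."""
    ages = np.empty(n_sockets)
    for i in range(n_sockets):
        t = 0.0
        while True:
            life = scale * rng.weibull(shape)    # next part's lifetime
            if t + life > t_obs:                 # part still alive at t_obs
                ages[i] = t_obs - t
                break
            t += life                            # failed -> replaced at t
    return ages

ages = in_service_ages(t_obs=5000.0)
hist, edges = np.histogram(ages, bins=20, density=True)  # age structure
print(edges[:5], hist[:5])
```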

10.
The low elastic modulus and time-consuming formation process are the major challenges impeding the penetration of nanoparticle superstructures into daily-life applications. As observed in molecular or atomic crystals, more effective interactions between adjacent nanoparticles would introduce beneficial features into assemblies, enabling optimized mechanical properties. Here, a straightforward synthetic strategy is presented that allows fast and scalable fabrication of 2D Ag-mercaptoalkyl acid superclusters of either hexagonal or lamellar topology. Remarkably, these ordered superstructures exhibit a structure-dependent elastic modulus governed by the tether length of the straight-chain mercaptoalkyl acids and by the ratio between silver and tether molecules. These superclusters are plastic and moldable against arbitrarily shaped masters of macroscopic dimensions, opening a wealth of possibilities for developing nanocrystal assemblies with practically useful nanoscopic properties.

11.
In reliability allocation, certain reliability values are assigned to subsystems and components to achieve the required system reliability. One big challenge in solving such reliability-based design problems is how to handle the uncertain preferences of a decision maker on multiple attributes of interest. In this paper, we propose a new ordered weighted averaging (OWA) method based on an analytic hierarchy process to address the decision maker's uncertain preferences in reliability allocation. In the proposed OWA operator, a bi-objective mathematical programming model considering both maximal entropy and minimal variance is transformed into a single-objective mathematical programming model using an ideal-point method. The maximum entropy minimal variance OWA operator takes full advantage of available information and avoids overestimating the decision maker's preferences. A detailed computational procedure is presented to facilitate the implementation of the proposed method in practice. An illustrative example about the powertrain of fuel cell vehicles is provided to demonstrate the effectiveness of this method in handling multiple attributes with uncertain preferences in reliability allocation. Copyright © 2015 John Wiley & Sons, Ltd.
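
The OWA aggregation itself is simple: the weights attach to rank positions of the sorted attribute scores rather than to the attributes themselves, and the weight vector's "orness" summarizes how optimistic the aggregation is. The weight derivation is omitted here (the paper obtains it from a maximum-entropy minimal-variance program); weights and scores below are hypothetical.

```python
import numpy as np

def owa(values, weights):
    """Ordered weighted averaging: weights apply to the sorted (descending)
    arguments, not to particular attributes."""
    v = np.sort(np.asarray(values, float))[::-1]
    w = np.asarray(weights, float)
    assert np.isclose(w.sum(), 1.0) and np.all(w >= 0)
    return float(v @ w)

def orness(weights):
    """Degree of 'or-ness' (optimism) encoded by an OWA weight vector."""
    n = len(weights)
    return float(sum((n - 1 - i) * w for i, w in enumerate(weights)) / (n - 1))

scores = [0.82, 0.64, 0.91]          # attribute ratings for one subsystem
w = [0.5, 0.3, 0.2]                  # hypothetical weights (orness = 0.65)
print(owa(scores, w), orness(w))
```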

12.
For highly reliable products, production cost is usually high and lifetimes are long, so failures may not be observable within a limited time. In this paper, an accelerated experiment is employed in which the lifetime follows an exponential distribution whose failure rate depends exponentially on the acceleration factor. The underlying parameters are assumed to have exponential prior distributions. A Bayesian zero-failure reliability demonstration test is designed beforehand to determine the minimum sample size and testing length subject to a specified reliability criterion. The probability of passing the test design, as well as the predictive probability for additional experiments, is also derived. Sensitivity of the design is investigated through a simulation study. Copyright © 2009 John Wiley & Sons, Ltd.
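
For intuition, the classical (non-Bayesian) counterpart of such a design is compact: with exponential lifetimes, passing a zero-failure test of n units for τ hours each demonstrates R(t0) ≥ R* at confidence C whenever the total time on test n·τ exceeds t0·ln(1−C)/ln(R*). A sketch with illustrative numbers:

```python
import math

def zero_failure_units(r_target, t_mission, confidence, tau):
    """Minimum number of units for a zero-failure demonstration that
    R(t_mission) >= r_target at the given confidence, each unit on test
    for tau hours (exponential lifetimes, classical bound)."""
    total_time = t_mission * math.log(1.0 - confidence) / math.log(r_target)
    return math.ceil(total_time / tau)

# Demonstrate R(1000 h) >= 0.99 at 90% confidence, 5000 h of test per unit
n = zero_failure_units(0.99, 1000.0, 0.90, 5000.0)
print(n, "units; pass probability at a true rate of 1e-6/h:",
      math.exp(-n * 1e-6 * 5000.0))       # P(pass) = exp(-n*lambda*tau)
```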

13.
14.
15.
The cytotoxicity of nanomaterials to living systems is known to be affected by their size, shape, surface chemistry, and other physicochemical properties. Exposure to a well-characterized subpopulation of specific nanomaterials is therefore desirable to reveal more detailed mechanisms. This study develops scalable density gradient ultracentrifugation sorting of highly dispersed single-walled carbon nanotubes (SWNTs) into four distinct bands based on diameter, aggregation, and structural integrity, with greatly improved efficiency, yield, and reproducibility. With high yield and stability of the four SWNT fractions guaranteed, it is possible, for the first time, to investigate their structure-dependent bioeffects. Among these, singly dispersed integral SWNTs show no significant effects on mitochondrial function or hypoxia. Aggregated integral SWNTs show more significant effects on mitochondrial dysfunction and hypoxia than aggregated SWNTs with poor structural integrity. It is further found that the aggregated integral SWNTs induce irregular mitochondrial respiration and activation of pro-apoptotic proteins, whereas aggregated SWNTs with poor structural integrity greatly enhance reactive oxygen species (ROS) levels. This work supports the view that control over the distinct structural characteristics of SWNTs helps establish clearer structure-bioeffect correlations and health risk assessment. It is also hoped that these results can aid the design of nanomaterials with higher efficiency and accuracy in subcellular translocation.

16.
Reliability prediction and allocation techniques provide a quantitative evaluation of whole-train reliability for EMU (electric multiple unit) trainsets by predicting and allocating the reliability of their subsystems and components. They help capture user requirements accurately in the early design stage and support the selection of optimal design and maintenance schemes.
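
A minimal sketch of one standard allocation rule (ARINC-style weighting for a series system; the article does not commit to a specific rule): the failure-rate budget implied by the target system reliability is split among subsystems in proportion to weights such as their predicted failure rates. The weights below are hypothetical.

```python
import math

def allocate_series(r_system_target, weights):
    """Split a series-system reliability target among subsystems in
    proportion to normalized weights (e.g., predicted failure rates);
    returns each subsystem's reliability target."""
    lam_budget = -math.log(r_system_target)     # total failure-rate budget
    total = sum(weights)
    return [math.exp(-lam_budget * w / total) for w in weights]

# Target whole-train reliability 0.95 split over four subsystems
targets = allocate_series(0.95, [3.0, 1.0, 2.0, 4.0])
print(targets, math.prod(targets))              # product recovers 0.95
```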

17.
Recent observations of facet-dependent electrical conductivity and photocatalytic activity in various semiconductor crystals are presented. The discovery of facet-dependent surface plasmon resonance absorption in metal–Cu2O core–shell nanocrystals with tunable sizes and shapes is then discussed. The Cu2O shells also exhibit a facet-specific optical absorption feature. The facet-dependent electrical conductivity, photocatalytic activity, and optical properties are related phenomena, resulting from the presence of an ultrathin surface layer with different band structures, and thus varying degrees of band bending, for the {100}, {110}, and {111} faces of Cu2O, which absorb light of somewhat different wavelengths. Recently, it has been shown that the light absorption and photoluminescence properties of pure Cu2O cubes, octahedra, and rhombic dodecahedra also display size and facet effects because of their tunable band gaps. A modified band diagram of Cu2O can be constructed to incorporate these optical effects. The literature also provides examples of facet-dependent optical behavior of semiconductor nanostructures, indicating that the optical properties of nanoscale semiconductor materials are intrinsically facet-dependent. Some applications of semiconductor optical size and facet effects are considered.

18.
This paper analyzes the competing and dependent failure processes of multi-state systems subjected to four typical kinds of random shock. Reliability analysis for discrete degradation is conducted by explicitly modeling the state-transition characteristics. A semi-Markov model is employed to explore how system vulnerability and the potential transition gap affect the state residence time. The failure dependence is specified as follows: random shocks can not only lead to different abrupt failures but also cause sudden changes in the state-transition probabilities, making it easier for the system to remain in degraded states. Reliability functions for all the exposed failure processes are presented based on the corresponding mechanisms. Interactions between different failure processes are also taken into account to evaluate the actual reliability levels in the context of degradation and distinct random shocks. An illustrative example of a multi-state air-conditioning system demonstrates how the proposed method can be applied in engineering practice.
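
A simplified discrete-time Markov stand-in (the paper uses a semi-Markov formulation with general residence times) illustrates the dependence mechanism: in a step hit by a shock, the chain switches to a more pessimistic transition matrix. All matrices and probabilities below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(11)

# States 0 (as-good-as-new) .. 3 (failed); hypothetical per-step matrices
P_BASE = np.array([[0.97, 0.02, 0.01, 0.00],
                   [0.00, 0.96, 0.03, 0.01],
                   [0.00, 0.00, 0.95, 0.05],
                   [0.00, 0.00, 0.00, 1.00]])
P_SHOCK = np.array([[0.85, 0.08, 0.04, 0.03],   # used in shocked steps
                    [0.00, 0.82, 0.10, 0.08],
                    [0.00, 0.00, 0.80, 0.20],
                    [0.00, 0.00, 0.00, 1.00]])

def reliability(horizon, p_shock=0.02, n_sim=20_000):
    """Monte Carlo P(state != failed by `horizon` steps) when each step is
    shocked with probability p_shock, which degrades the transition matrix."""
    alive = 0
    for _ in range(n_sim):
        s = 0
        for _ in range(horizon):
            P = P_SHOCK if rng.random() < p_shock else P_BASE
            s = rng.choice(4, p=P[s])
            if s == 3:
                break
        alive += (s != 3)
    return alive / n_sim

print(reliability(horizon=50))
```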

19.
Retro-inverso bradykinin (RI-BK) has better metabolic stability and higher affinity for the BK type 2 (B2) receptor than bradykinin. At low doses, RI-BK can selectively enhance the permeability of the blood–brain tumor barrier (BBTB) without harming normal brain tissue. In this study, gold nanoparticles (GNPs) with sizes ranging from 5 to 90 nm are synthesized to determine the nanocarrier size that achieves maximum brain accumulation after treatment with RI-BK. The ability of the GNPs to cross the BBTB is tested in a rat C6 glioma tumor model. The results of inductively coupled plasma mass spectrometry and transmission electron microscopy indicate that GNPs 70 nm in size achieve maximum permeability into the glioma. The present study supports the conclusion that RI-BK can enhance the permeability of the BBTB and provides fundamental information for the further development of nanomedicines and nanoprobes for glioma therapy.

20.
The theory of network reliability has been applied to many complicated network structures, such as computer and communication networks, piping systems, electricity networks, and traffic networks. The theory is used to evaluate the operational performance of networks that can be modeled as probabilistic graphs. Although evaluating network reliability is an NP-hard problem, numerous solutions have been proposed. Most of them, however, are based on sequential computing, which under-utilizes the benefits of multi-core processor architectures. This paper addresses this limitation by proposing an efficient strategy for calculating the two-terminal (terminal-pair) reliability of a binary-state network using parallel computing. Existing methods are analyzed; an efficient method for calculating terminal-pair reliability based on logical-probabilistic calculus is then proposed, and a parallel version of the algorithm is developed. This is the first study to implement an algorithm for estimating terminal-pair reliability in parallel on multi-core processor architectures. The experimental results show that the proposed algorithm and its parallel version outperform an existing sequential algorithm in terms of execution time. Copyright © 2015 John Wiley & Sons, Ltd.
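
For reference, brute-force enumeration makes the problem statement concrete: sum the probabilities of all edge-state vectors under which source and sink remain connected. This is exponential in the number of edges and is exactly what the paper's logical-probabilistic and parallel techniques are designed to beat; the bridge network below is a classic toy example.

```python
import numpy as np
from itertools import product

def terminal_pair_reliability(edges, probs, s, t, n_nodes):
    """Exact two-terminal reliability by enumerating all 2^m edge states
    (tractable only for small networks)."""
    rel = 0.0
    for state in product([0, 1], repeat=len(edges)):
        p = 1.0
        for up, q in zip(state, probs):
            p *= q if up else (1.0 - q)
        # union-find connectivity over the working edges
        parent = list(range(n_nodes))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        for up, (u, v) in zip(state, edges):
            if up:
                parent[find(u)] = find(v)
        if find(s) == find(t):
            rel += p
    return rel

# Bridge network: nodes 0..3, source 0, sink 3, each edge up with prob 0.9;
# the known exact answer is 0.97848
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
print(terminal_pair_reliability(edges, [0.9] * 5, 0, 3, 4))
```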
