Similar Documents
20 similar documents were retrieved.
1.
The mathematical theory of evidence was introduced by Glenn Shafer in 1976 as a new approach to the representation of uncertainty. This theory can be represented in several distinct but more or less equivalent forms. Probabilistic interpretations of evidence theory have their roots in Arthur Dempster's multivalued mappings of probability spaces. This leads to random set and, more generally, to random filter models of evidence. In this probabilistic view, evidence is seen as more or less probable arguments for certain hypotheses, which can be used to support those hypotheses to certain degrees. These degrees of support are in fact the reliabilities with which the hypotheses can be derived from the evidence. Alternatively, the mathematical theory of evidence can be founded axiomatically on the notion of belief functions or on the allocation of belief masses to subsets of a frame of discernment. These approaches aim to present evidence theory as an extension of probability theory. Evidence theory has been used to represent uncertainty in expert systems, especially in the domain of diagnostics. It can be applied to decision analysis and gives a new perspective for statistical analysis. Among its further applications are image processing, project planning and scheduling, and risk analysis. The computational problems of evidence theory are well understood, and even though the problem is complex, efficient methods are available. Research partly supported by Grants No. 21-30186.90 and 21-32660.91 of the Swiss National Foundation for Scientific Research.
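As an illustration of the mass-allocation view mentioned in this abstract (not code from the paper itself), the following minimal Python sketch computes belief and plausibility from a basic mass assignment over a small frame of discernment; the frame and the numeric masses are invented for the example.

```python
# Hypothetical frame of discernment and basic mass assignment (example values only).
frame = frozenset({"flu", "cold", "allergy"})
mass = {
    frozenset({"flu"}): 0.4,
    frozenset({"flu", "cold"}): 0.3,
    frame: 0.3,                      # mass left on the whole frame represents ignorance
}

def belief(hypothesis, mass):
    """Bel(A): total mass of focal sets entirely contained in A."""
    return sum(m for focal, m in mass.items() if focal <= hypothesis)

def plausibility(hypothesis, mass):
    """Pl(A): total mass of focal sets compatible with A (non-empty intersection)."""
    return sum(m for focal, m in mass.items() if focal & hypothesis)

A = frozenset({"flu", "cold"})
print(belief(A, mass), plausibility(A, mass))   # 0.7 and 1.0 for these example masses
```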

2.
This paper surveys the use of fuzzy methods in commercial applications of expert systems technology and in commercial products that aim to support such applications. It attempts to evaluate the utility of this approach to uncertainty management compared with other well-known methods of handling uncertainty in expert systems. Starting from this base, it attempts an evaluation of the prospects for fuzzy expert systems in the medium term. As a survey, it attempts to list applications and commercial products as comprehensively as is practical and includes an extensive bibliography on the topic.

3.
Partially consonant belief functions (pcb), studied by Walley, are the only class of Dempster-Shafer belief functions that are consistent with the likelihood principle of statistics. Structurally, the set of foci of a pcb is partitioned into non-overlapping groups, and within each group the foci are nested. The pcb class includes both probability functions and Zadeh’s possibility functions as special cases. This paper studies decision making under uncertainty described by pcb. We prove a representation theorem for preference relations over pcb lotteries satisfying an axiomatic system similar in spirit to von Neumann and Morgenstern’s axioms of linear utility theory. The closed-form expression for the utility of a pcb lottery combines linear utility for the probabilistic lottery with two-component (binary) utility for the possibilistic lottery. In our model, the uncertainty information, risk attitude and ambiguity attitude are represented separately. A tractable technique to extract ambiguity attitude from a decision maker's behavior is also discussed.

4.
Belief networks provide an important bridge between statistical modeling and expert systems. This article presents methods for visualizing probabilistic “evidence flows” in belief networks, thereby enabling belief networks to explain their behavior. Building on earlier research on explanation in expert systems, we present a hierarchy of explanations, ranging from simple colorings to detailed displays. Our approach complements parallel work on textual explanations in belief networks. Graphical-Belief, Mathsoft Inc.'s belief network software, implements the methods.

5.
This paper addresses the problem of exchanging uncertainty assessments in multi-agent systems. Since it is assumed that each agent might completely ignore the internal representation of its partners, a common interchange format is needed. We analyze the case of an interchange format defined by means of imprecise probabilities, pointing out the reasons for this choice. A core problem with the interchange format concerns transformations from imprecise probabilities into other formalisms (in particular, precise probabilities, possibilities, belief functions). We discuss this so-far little-investigated question, analyzing how previous proposals, mostly regarding special instances of imprecise probabilities, would fit into this problem. We then propose some general transformation procedures, which also take into account the fact that information can be partial, i.e. may concern an arbitrary (finite) set of events.
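The paper's own transformation procedures are not reproduced here. Purely as a generic illustration of the kind of step involved, the sketch below turns lower/upper probability intervals on elementary events into one precise distribution by normalising interval midpoints, a simple heuristic that is not necessarily the authors' proposal; the event names and interval values are invented.

```python
# Hypothetical lower/upper probability intervals on elementary events (example numbers).
intervals = {"a": (0.1, 0.4), "b": (0.2, 0.5), "c": (0.2, 0.3)}

def midpoint_transform(intervals):
    """Map interval assessments to one precise probability by normalising interval midpoints."""
    mids = {e: (lo + hi) / 2.0 for e, (lo, hi) in intervals.items()}
    total = sum(mids.values())
    return {e: m / total for e, m in mids.items()}

p = midpoint_transform(intervals)
print(p, sum(p.values()))   # a precise distribution that sums to 1
```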

6.
An integrated approach to truth-gaps and epistemic uncertainty is described, based on probability distributions defined over a set of three-valued truth models. This combines the explicit representation of borderline cases with both semantic and stochastic uncertainty, in order to define measures of subjective belief in vague propositions. Within this framework we investigate bridges between probability theory and fuzziness in a propositional logic setting. In particular, when the underlying truth model is from Kleene's three-valued logic, we provide a complete characterisation of compositional min–max fuzzy truth degrees. For classical and supervaluationist truth models we find partial bridges, with min and max combination rules only recoverable on a fragment of the language. Across all of these types of truth valuation, min–max operators are obtained in those cases in which there is only uncertainty about the relative sharpness or vagueness of the interpretation of the language.
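To make the setting concrete (this is an illustration with invented numbers, not the paper's construction), the sketch below evaluates propositions under Kleene's three-valued logic, encoding false/borderline/true as 0, 0.5 and 1, and then sums the probability of valuations under which a formula is definitely true to obtain a degree of belief.

```python
# Kleene's (strong) three-valued connectives with truth values 0 (false), 0.5 (borderline), 1 (true).
def k_and(x, y): return min(x, y)
def k_or(x, y):  return max(x, y)
def k_not(x):    return 1 - x

# Hypothetical probability distribution over three-valued valuations of atoms p, q (example weights).
valuations = [
    ({"p": 1,   "q": 0.5}, 0.5),
    ({"p": 0.5, "q": 1},   0.3),
    ({"p": 0,   "q": 1},   0.2),
]

def belief_true(formula, valuations):
    """Probability that the formula is definitely true (truth value 1) under a random valuation."""
    return sum(w for v, w in valuations if formula(v) == 1)

phi = lambda v: k_or(v["p"], k_not(v["q"]))     # p OR NOT q
print(belief_true(phi, valuations))             # 0.5 with these example weights
```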

7.
A method for combining two types of judgments about an analyzed object, elicited from experts, is considered in this paper. It is assumed that the probability distribution of a random variable is known, but its parameters may be determined by experts. The method is based on the use of imprecise probability theory and allows us to take into account the quality of expert judgments as well as the heterogeneity and imprecision of the information supplied by experts. An approach for computing “cautious” expert beliefs under the condition that the experts are unknown is also studied. Numerical examples illustrate the proposed method.
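The abstract does not state the paper's exact imprecise-probability combination rule. Purely as an illustration of what a "cautious" pooling step can look like, the sketch below takes interval judgments for a distribution parameter from several experts and returns both the envelope (disjunctive, cautious) interval and the intersection (conjunctive) interval; all expert names and numbers are invented.

```python
# Hypothetical expert judgments on a parameter (say, a failure rate), given as intervals.
judgments = {"expert_1": (0.02, 0.05), "expert_2": (0.03, 0.08), "expert_3": (0.04, 0.06)}

def envelope(intervals):
    """Disjunctive (cautious) pooling: keep every value that at least one expert allows."""
    lows, highs = zip(*intervals)
    return (min(lows), max(highs))

def intersection(intervals):
    """Conjunctive pooling: keep only values all experts allow (may be empty if they conflict)."""
    lows, highs = zip(*intervals)
    lo, hi = max(lows), min(highs)
    return (lo, hi) if lo <= hi else None

vals = list(judgments.values())
print(envelope(vals), intersection(vals))   # (0.02, 0.08) and (0.04, 0.05)
```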

8.
In economics and management science, quantitative research methods that rely solely on data support and seek reasonable decision alternatives through modeling, simulation and optimization have revealed serious shortcomings when confronted with socio-economic problems that are unstructured or semi-structured and involve a large number of complex uncertain factors. Solving such problems requires the knowledge, experience and wisdom of experts in the relevant fields. Developing knowledge engineering and decision support systems with knowledge bases has therefore become a frontier area of information and decision science.

9.
This paper extends the theory of belief functions by introducing new concepts and techniques that allow one to model situations in which the beliefs held by a rational agent may only be expressed (or are only known) with some imprecision. Central to our approach is the concept of an interval-valued belief structure (IBS), defined as a set of belief structures verifying certain constraints. Starting from this definition, many other concepts of Evidence Theory (including belief and plausibility functions, pignistic probabilities, combination rules and uncertainty measures) are generalized to cope with imprecision in the belief numbers attached to each hypothesis. An application of this new framework to the classification of patterns with partially known feature values is demonstrated.
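The following sketch is not the paper's algorithm; it is just one way to operationalise the definition above. It treats an interval-valued belief structure as linear constraints on the masses and computes lower and upper bounds on Bel(A) with a small linear program; the focal sets and interval bounds are made up for the example.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical interval-valued belief structure on frame {a, b, c}: focal sets with mass intervals.
focals = [frozenset("a"), frozenset("ab"), frozenset("abc")]
bounds = [(0.2, 0.5), (0.1, 0.4), (0.2, 0.6)]          # l_i <= m_i <= u_i (example numbers)

def belief_interval(A, focals, bounds):
    """Lower/upper Bel(A) over all mass functions compatible with the interval constraints."""
    c = np.array([1.0 if F <= A else 0.0 for F in focals])   # indicator: focal set inside A
    A_eq, b_eq = np.ones((1, len(focals))), [1.0]             # masses must sum to one
    lo = linprog(c,  A_eq=A_eq, b_eq=b_eq, bounds=bounds)     # minimise mass inside A
    hi = linprog(-c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)     # maximise it
    return lo.fun, -hi.fun

print(belief_interval(frozenset("ab"), focals, bounds))       # approximately (0.4, 0.8) here
```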

10.
An integral representation theorem for outer continuous and inner regular belief measures on compact topological spaces is elaborated under the condition that compact sets are countable intersections of open sets (e.g. metric compact spaces). Extreme points of this set of belief measures are identified with unanimity games with compact support. Then, the Choquet integral of a real-valued continuous function can be expressed as a minimum of means over the sigma-core and also as a mean of minima over the compact subsets. Similarly, for bounded measurable functions, the Choquet integral is expressed as a minimum of means over the core; in addition, we prove that it is a mean of infima over the compact subsets. We then obtain the Choquet–Revuz measure representation theorem and introduce the Möbius transform of a belief measure. An extension to locally compact and sigma-compact topological spaces is provided.
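The abstract concerns compact topological spaces; as a purely finite illustration of the object being represented, here is a sketch of the discrete Choquet integral of a non-negative function with respect to a capacity (a monotone set function, of which belief measures are a special case). The space, function values and capacity are invented.

```python
# Hypothetical finite space, non-negative function, and capacity (monotone set function); example values.
f = {"x": 3.0, "y": 1.0, "z": 2.0}
capacity = {
    frozenset(): 0.0,
    frozenset("x"): 0.3, frozenset("y"): 0.2, frozenset("z"): 0.4,
    frozenset("xy"): 0.6, frozenset("xz"): 0.8, frozenset("yz"): 0.5,
    frozenset("xyz"): 1.0,
}

def choquet(f, capacity):
    """Discrete Choquet integral: level differences weighted by the capacity of the upper level sets."""
    pts = sorted(f, key=f.get, reverse=True)          # order points by decreasing function value
    vals = [f[p] for p in pts] + [0.0]
    total, level_set = 0.0, set()
    for p, v, v_next in zip(pts, vals, vals[1:]):
        level_set.add(p)
        total += (v - v_next) * capacity[frozenset(level_set)]
    return total

print(choquet(f, capacity))   # (3-2)*v({x}) + (2-1)*v({x,z}) + (1-0)*v({x,y,z}) = 0.3 + 0.8 + 1.0 = 2.1
```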

11.
We are interested in the problem of multi-source information fusion in the case where the information provided carries some uncertainty. We note that sensor-provided information generally has a probabilistic type of uncertainty, whereas linguistic information typically introduces a possibilistic type of uncertainty. More generally, we are faced with a problem in which we must fuse information with different types of uncertainty. In order to provide a unified framework for the representation of these different types of uncertain information, we use a set measure approach and discuss a set measure representation of uncertain information. In the multi-source fusion problem, in addition to having a collection of pieces of information that must be fused, we need some expert-provided instructions on how to fuse these pieces of information. Generally these instructions can involve a combination of linguistically and mathematically expressed directions. In the course of this work we begin to consider the fundamental task of translating these instructions into formal operations that can be applied to our information. This requires us to investigate the important problem of the aggregation of set measures.
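The aggregation operations the authors have in mind are not spelled out in the abstract. As a minimal illustration of the kind of object involved, the sketch below forms a weighted average of two monotone set measures defined on the same frame, which is again a normalised monotone measure; the source names, weights and measure values are invented.

```python
# Two hypothetical normalised monotone set measures on the frame {s, l} (a sensor-based one and a
# linguistic one), plus weights reflecting how much each source is trusted; all numbers are invented.
mu_sensor     = {frozenset(): 0.0, frozenset("s"): 0.7, frozenset("l"): 0.2, frozenset("sl"): 1.0}
mu_linguistic = {frozenset(): 0.0, frozenset("s"): 0.4, frozenset("l"): 0.5, frozenset("sl"): 1.0}

def aggregate(measures, weights):
    """Pointwise weighted average of set measures; preserves monotonicity and normalisation."""
    return {A: sum(w * mu[A] for mu, w in zip(measures, weights)) for A in measures[0]}

fused = aggregate([mu_sensor, mu_linguistic], [0.6, 0.4])
print(fused)   # e.g. fused measure of {s} is 0.6*0.7 + 0.4*0.4 = 0.58
```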

12.
In expert systems, because both the evidence and the inference rules are uncertain, the reasoning changes accordingly, and evidence must be combined, propagated and revised. Many works have studied this problem [2], [3], [4], but most existing methods give a different procedure for each kind of uncertain reasoning. This paper aims to give general formulas for the combination, propagation and revision of evidence in uncertain reasoning.

13.
Attribute reduction is viewed as an important issue in data mining and knowledge representation. This paper studies attribute reduction in fuzzy decision systems based on generalized fuzzy evidence theory. The definitions of several kinds of attribute reducts are introduced. The relationships among these reducts are then investigated. In a fuzzy decision system, it is proved that the concepts of fuzzy positive region reduct, lower approximation reduct and generalized fuzzy belief reduct are all equivalent, the concepts of fuzzy upper approximation reduct and generalized fuzzy plausibility reduct are equivalent, and a generalized fuzzy plausibility consistent set must be a generalized fuzzy belief consistent set. In a consistent fuzzy decision system, an attribute set is a generalized fuzzy belief reduct if and only if it is a generalized fuzzy plausibility reduct. But in an inconsistent fuzzy decision system, a generalized fuzzy belief reduct is not a generalized fuzzy plausibility reduct in general.

14.
Whilst supported by compelling arguments, the representation of uncertainty by means of (subjective) probability does not enjoy unanimous consensus. A substantial part of the relevant criticism points to its alleged inadequacy for representing ignorance as opposed to uncertainty. The purpose of this paper is to show how a strong justification for taking belief as probability, namely the Dutch Book argument, can be extended naturally so as to provide a logical characterization of coherence for imprecise probability, a framework which is widely believed to accommodate some fundamental features of reasoning under ignorance. The appropriate logic for our purposes is an algebraizable logic whose equivalent algebraic semantics is a variety of MV-algebras with an additional internal unary operation representing upper probability (these algebras will be called UMV-algebras).
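For readers unfamiliar with the starting point of the argument, the classical (precise) Dutch Book criterion can be stated as follows; this is the standard de Finetti formulation, not the paper's generalisation to upper probability and UMV-algebras.

```latex
% Classical Dutch Book coherence (de Finetti): a betting-odds assessment
% \beta(\varphi_1),\dots,\beta(\varphi_n) on formulas \varphi_i is coherent when no
% choice of stakes exposes the bettor to a sure loss.
\[
  \beta \text{ is coherent} \iff
  \nexists\,\lambda_1,\dots,\lambda_n \in \mathbb{R} \text{ such that }
  \sum_{i=1}^{n} \lambda_i \bigl( v(\varphi_i) - \beta(\varphi_i) \bigr) < 0
  \text{ for every valuation } v .
\]
% Equivalently (de Finetti's theorem), \beta is coherent iff it extends to a finitely
% additive probability on the algebra generated by \varphi_1,\dots,\varphi_n.
```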

15.
Inclusion Degree Theory
Uncertainty is a characteristic of complex systems. In artificial intelligence and expert systems, the study of uncertainty is of increasing importance. Zadeh [1] proposed fuzzy sets in 1965, extending classical sets to fuzzy sets and thereby addressing the uncertainty of "objects". This paper introduces the concept of the inclusion degree to address the uncertainty of "relations". It is also pointed out that the inclusion degree not only subsumes various uncertain reasoning methods but also solves two important problems: knowledge acquisition and the elimination of contradictory rules.
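The abstract does not fix a single formula, so the sketch below shows one commonly used instance of an inclusion degree for finite (and fuzzy) sets, simply to make the notion concrete; the universe and membership values are invented.

```python
# One common instance of an inclusion degree D(B/A): the extent to which A is contained in B.
# For crisp finite sets D(B/A) = |A ∩ B| / |A|; the fuzzy version below uses min for intersection.
def inclusion_degree(A, B, universe):
    """Fuzzy inclusion degree of A in B; A and B map elements of the universe to memberships in [0, 1]."""
    num = sum(min(A.get(x, 0.0), B.get(x, 0.0)) for x in universe)
    den = sum(A.get(x, 0.0) for x in universe)
    return num / den if den else 1.0          # an empty A is trivially included

# Hypothetical fuzzy sets over a small universe (example membership values).
U = ["u1", "u2", "u3"]
A = {"u1": 0.8, "u2": 0.4, "u3": 0.0}
B = {"u1": 0.9, "u2": 0.2, "u3": 0.7}
print(inclusion_degree(A, B, U))              # (0.8 + 0.2 + 0.0) / 1.2 ≈ 0.833
```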

16.
Harsanyi (1967–68) proposed a method for transforming uncertainty over the strategy sets of players into uncertainty over their payoffs. The transformation appears to rely on an assumption that the players are rational, or, indeed, that they are rational and that there is common belief of rationality. Such an assumption would be awkward from the perspective of the epistemic program, which is often interested in the implications of irrationality or a lack of common belief of rationality. This paper shows that without common belief of rationality, such implications are not necessarily maintained under a Harsanyi transformation. The paper then shows how, with the belief-system model of Aumann and Brandenburger (1995), such implications can be maintained in the absence of common belief of rationality.

17.
It is appropriate to use Dempster's rule for combining belief functions only if the belief functions combined are based on independent items of evidence. What can be done in the case of dependent evidence? Often the answer is to reframe the problem. Three examples are given: one from everyday experience, one from probabilistic relaxation, and one from expert systems.
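For reference, the sketch below implements the standard rule discussed here (Dempster's rule of combination itself, not the reframing technique the paper advocates); it is appropriate only when the two items of evidence are independent. The frame and masses are invented.

```python
from collections import defaultdict

def dempster_combine(m1, m2):
    """Dempster's rule: multiply masses of intersecting focal sets and renormalise by 1 - conflict."""
    combined, conflict = defaultdict(float), 0.0
    for F1, a in m1.items():
        for F2, b in m2.items():
            inter = F1 & F2
            if inter:
                combined[inter] += a * b
            else:
                conflict += a * b                    # mass that would go to the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence: rule undefined")
    return {F: v / (1.0 - conflict) for F, v in combined.items()}

# Two hypothetical independent items of evidence on the frame {a, b, c} (example masses).
m1 = {frozenset("a"): 0.6, frozenset("abc"): 0.4}
m2 = {frozenset("ab"): 0.5, frozenset("c"): 0.5}
print(dempster_combine(m1, m2))   # conflict 0.3; combined masses on {a}, {a,b}, {c}
```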

18.
This paper proposes solution approaches to the belief linear programming (BLP) problem. The BLP problem is an uncertain linear program whose uncertainty is expressed by belief functions. The theory of belief functions provides an uncertainty measure that takes into account ignorance about the occurrence of single states of nature. This is the case in many decision situations, such as medical diagnosis, mechanical design optimization and investigation problems. We extend stochastic programming approaches, namely the chance-constrained approach and the recourse approach, to obtain a certainty-equivalent program. A generic solution strategy for the resulting certainty equivalent is presented.
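The paper's certainty-equivalent construction is not reproduced here. As a toy illustration of what a belief-valued chance constraint can mean, the sketch below checks whether Bel(a·x ≤ b) ≥ α when the uncertain right-hand side b is described by a mass function over intervals; all coefficients, intervals and masses are invented.

```python
# Hypothetical mass function over intervals for the uncertain right-hand side b (example values).
rhs_masses = [((8.0, 10.0), 0.5), ((6.0, 9.0), 0.3), ((4.0, 12.0), 0.2)]   # ((low, high), mass)

def belief_constraint_holds(a, x, rhs_masses, alpha):
    """Check Bel(a·x <= b) >= alpha: only focal intervals lying entirely above a·x contribute."""
    lhs = sum(ai * xi for ai, xi in zip(a, x))
    bel = sum(mass for (low, _high), mass in rhs_masses if low >= lhs)
    return bel >= alpha, bel

a, x = [1.0, 2.0], [2.0, 2.0]          # a·x = 6
print(belief_constraint_holds(a, x, rhs_masses, alpha=0.8))   # (True, 0.8): intervals [8,10] and [6,9] qualify
```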

19.
An appropriate and accurate residual life prediction for an asset is essential for cost-effective and timely maintenance planning and scheduling. The paper reports the use of expert judgments as additional information to predict a regularly monitored asset’s residual life. The expert judgment is made on the basis of measured condition monitoring parameters and is treated as a random variable, which may be described by a probability distribution because of the uncertainty involved. Since most expert judgments take the form of a set of integer numbers, we can either directly use a discrete distribution or use a continuous distribution after some transformation. A key concept used in this paper is the conditional residual life, where the residual life at the point of checking is conditional on, among other things, the past expert judgments made on the same asset to date. Stochastic filtering theory is used to predict the residual life given the available expert judgments. Artificial, simulated and real data are used for validating and testing the model developed.
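The paper's stochastic-filtering model is not reproduced here. Purely as a schematic illustration of the conditioning step, the sketch below performs a single Bayes update of a discrete residual-life distribution when an expert issues an integer-valued judgment, using an invented likelihood that favours judgments close to the true residual life; the prior, judgment and spread are all made up.

```python
import math

# Hypothetical prior over residual life (in months) and an invented likelihood model: the expert's
# integer judgment is assumed to scatter around the true residual life with a Gaussian-shaped kernel.
prior = {6: 0.2, 12: 0.5, 18: 0.3}
expert_judgment = 10

def likelihood(judgment, life, spread=4.0):
    """Relative weight of the expert saying `judgment` when the true residual life is `life`."""
    return math.exp(-((judgment - life) ** 2) / (2.0 * spread ** 2))

def bayes_update(prior, judgment):
    """One filtering step: reweight the residual-life distribution by the expert-judgment likelihood."""
    post = {life: p * likelihood(judgment, life) for life, p in prior.items()}
    z = sum(post.values())
    return {life: p / z for life, p in post.items()}

print(bayes_update(prior, expert_judgment))   # posterior shifts weight towards lives near the judgment
```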

20.
In this paper, ordinary stochastic differential equations whose coefficients depend on uncertain parameters are considered. An approach is presented for combining both types of uncertainty (stochastic excitation and parameter uncertainty), leading to set-valued stochastic processes. The latter serve as a robust representation of solutions of the underlying stochastic differential equations. The mathematical concept is applied to a problem from earthquake engineering, where it is shown how the efficiency of Tuned Mass Dampers can be realistically assessed in the presence of uncertainty.
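As a schematic numerical illustration of combining the two kinds of uncertainty (not the authors' construction), the sketch below simulates a simple linear SDE with the Euler–Maruyama scheme for every parameter value in an uncertainty set, sharing one realisation of the stochastic excitation, and reports per time step the interval spanned over the parameter set as a crude set-valued summary; the model and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Ornstein-Uhlenbeck-type model dX = -theta * X dt + sigma dW, with the damping
# parameter theta known only to lie in an interval; all numbers are invented.
thetas = np.linspace(0.5, 1.5, 11)        # uncertainty set for the parameter
sigma, x0, dt, n_steps = 0.3, 1.0, 0.01, 500

# One realisation of the stochastic excitation, shared by all parameter values.
dW = rng.normal(0.0, np.sqrt(dt), n_steps)

# Euler-Maruyama for every parameter value in the uncertainty set.
paths = np.empty((len(thetas), n_steps + 1))
paths[:, 0] = x0
for k in range(n_steps):
    paths[:, k + 1] = paths[:, k] - thetas * paths[:, k] * dt + sigma * dW[k]

# Set-valued summary: per time step, the interval spanned over the parameter set.
lower, upper = paths.min(axis=0), paths.max(axis=0)
print(lower[-1], upper[-1])               # bounds on the terminal state over all admissible parameters
```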
