Similar Documents
20 similar documents found (search time: 109 ms)
1.
In this paper, we propose a two-layer sensor fusion scheme for multiple-hypothesis multisensor systems. To reflect reality in decision making, uncertain decision regions are introduced into the hypothesis-testing process. The entire decision space is partitioned into distinct correct, uncertain and incorrect regions. The first layer of decisions is made by each sensor independently, based on a set of optimal decision rules. The fusion process is performed by treating the fusion center as an additional virtual sensor in the system. This virtual sensor makes its decision based on the decisions reached by the set of sensors in the system. The optimal decision rules are derived by minimizing the Bayes risk function. As a consequence, the performance of the system, as well as of the individual sensors, can be quantified by the probabilities of correct, incorrect and uncertain decisions. Numerical examples of three-hypothesis, two- and four-sensor systems are presented to illustrate the proposed scheme.
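The two-layer idea above can be sketched in a few lines. The threshold-based uncertain region, the majority-vote fusion rule, and all names below are illustrative assumptions, not the paper's derived Bayes-optimal rules:

```python
def decide(likelihoods, priors, threshold=0.6):
    """Local sensor decision: pick the maximum-posterior hypothesis, but
    return None (an 'uncertain' decision) when the winning posterior falls
    below a confidence threshold standing in for the uncertain region."""
    post = [l * p for l, p in zip(likelihoods, priors)]
    total = sum(post)
    post = [x / total for x in post]
    k = max(range(len(post)), key=post.__getitem__)
    return k if post[k] >= threshold else None

def fuse(decisions, n_hyp=3):
    """Fusion center as a virtual sensor: majority vote over the certain
    local decisions, staying uncertain when no hypothesis has a majority."""
    votes = [d for d in decisions if d is not None]
    if not votes:
        return None
    counts = [votes.count(h) for h in range(n_hyp)]
    k = max(range(n_hyp), key=counts.__getitem__)
    return k if counts[k] > len(votes) / 2 else None

uniform = [1 / 3] * 3
local = [decide([0.7, 0.2, 0.1], uniform),    # confident -> hypothesis 0
         decide([0.4, 0.35, 0.25], uniform),  # too close -> uncertain
         decide([0.6, 0.3, 0.1], uniform)]    # 0.6 >= threshold -> 0
print(fuse(local))  # 0
```

In the actual scheme the thresholds would come from minimizing the Bayes risk rather than being fixed by hand.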

2.
Summary We propose and compare two induction principles, called always and sometime, for proving inevitability properties of programs. They are formalizations and generalizations of Floyd's invariant-assertion and Burstall's intermittent-assertion methods for proving total correctness of sequential programs, whose methodological advantages and disadvantages have been discussed in a number of previous papers. Both principles are formalized in the abstract setting of arbitrary nondeterministic transition systems and illustrated by appropriate examples. The sometime method is interpreted as a recursive application of the always method; hence always can be considered a special case of sometime. These proof methods are strongly equivalent in the sense that a proof by one induction principle can be rewritten into a proof by the other. The first two theorems of the paper show that an invariant for the always method can be translated into an invariant for the sometime method, even if every recursive application of the latter is required to be of finite length. The third and main theorem of the paper shows how to translate an invariant for the sometime method into an invariant for the always method. It is emphasized that this translation technique follows the idea of transforming recursive programs into iterative ones. Of course, a general translation technique does not imply that the original sometime invariant and the resulting always invariant are equally understandable. This is illustrated by an example.

3.
This paper compares two interactive interfaces, Dress and Ange, designed to facilitate an experiential address of the user's or viewer's relationship to touch. Dress, a polypropylene dress fitted with small counters offering glimpses of human flesh for sale, is a shop that sells the possibility of touching human skin. The salesperson wears this body-shop and wanders through public domains, inviting people to pull on a pearl of their choice and thereby expose a parcel of skin, which they are to caress gently, momentarily, with their fingertips. The second device, Ange, consists of a transparent corset with metal rib-like protuberances. Through the strategic use of flex sensors, these rib-keys act as points of actuation and volume-control levers for corresponding sound samples. Ange has been designed to be exhibited, performed or worn in the public domain in a similar way to Dress. The inspiration and consequent design of both Dress and Ange are discussed and compared, as are their public exhibition and performance, in order to demonstrate the effectiveness of these design solutions. In conclusion, the value of particular design strategies for coercing or seducing a viewer into addressing highly personal issues is raised.

4.
In this paper, we define what we call a unitary immersion of a nonlinear system. We observe that, for classical Hamiltonian systems, this notion contains, in some sense, the concept of quantization. We restrict our attention to degree-zero unitary immersions, where all observation functions must be represented by operators of the type "multiplication by a function". We show that the problem of classifying such degree-zero unitary immersions of a given nonlinear system is not obvious. In some cases, we solve this problem.

5.
The design of the database is crucial to the process of designing almost any Information System (IS) and involves two clearly identifiable key concepts: schema and data model, the latter allowing us to define the former. Nevertheless, the term "model" is commonly applied indistinctly to both, the confusion arising from the fact that in Software Engineering (SE), unlike in the formal or empirical sciences, the notion of model has a double meaning of which we are not always aware. If we take our idea of model directly from the empirical sciences, then the schema of a database would actually be a model, whereas the data model would be a set of tools allowing us to define such a schema.

The present paper discusses the meaning of "model" in the area of Software Engineering from a philosophical point of view, an important topic since the confusion that arises directly affects other debates where "model" is a key concept. We also suggest that the need for a philosophical discussion of the concept of data model is a further argument in favour of institutionalizing a new area of knowledge, which could be called the Philosophy of Engineering.

6.
In this paper I consider how the computer can or should be accepted in Japanese schools. The concept of teaching in Japan stresses learning from a long-term perspective, whereas in instructional technology, on which CAI and tutoring systems depend, step-by-step attainment in a relatively short time is emphasized. The former is reluctant to use the computer, but both share the Platonic perspective, which is goal-oriented. However, the Socratic teacher, who intends to activate students' innate disposition to become better, would find another way of teaching and of using the computer.

7.
Harnad's proposed robotic upgrade of Turing's Test (TT), from a test of linguistic capacity alone to a Total Turing Test (TTT) of linguistic and sensorimotor capacity, conflicts with his claim that no behavioral test provides even probable warrant for attributions of thought, because there is no evidence of consciousness besides private experience. The intuitive, scientific, and philosophical considerations Harnad offers in favor of his proposed upgrade are unconvincing. I agree with Harnad that distinguishing "real" from "as if" thought on the basis of (the presence or lack of) consciousness, thus rejecting Turing (behavioral) testing as sufficient warrant for mental attribution, has the skeptical consequence Harnad accepts: there is in fact no evidence for me that anyone but me has a mind. I disagree with his acceptance of it! It would be better to give up the neo-Cartesian faith in private conscious experience underlying Harnad's allegiance to Searle's controversial Chinese Room Experiment than to give up all claim to know that others think. It would be better to allow that (passing) Turing's Test evidences, even strongly evidences, thought.

8.
In many applications one has a set of discrete points at which some variable, such as pressure or velocity, is measured. In order to graphically represent and display such data (say, as contours of constant pressure), the discrete data must be represented by a smooth function. This continuous surface can then be evaluated at any point for graphical display. Sometimes data are arbitrarily located except that they occur along non-intersecting lines, an example occurring in wind-tunnel tests where data are recorded at plug taps on an aircraft body. An algorithm is developed for this type of structured data problem and illustrated by means of color computer graphics.
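The paper's line-structured algorithm is not reproduced here, but the general task, turning scattered samples into a smooth surface that can be evaluated anywhere, can be illustrated with inverse-distance weighting (an assumed stand-in, not the authors' method):

```python
def idw(points, q, power=2):
    """Inverse-distance-weighted interpolation: a generic way to turn
    scattered (x, y, value) samples into a smooth surface that can be
    evaluated at any query point q."""
    num = den = 0.0
    for (x, y, v) in points:
        d2 = (x - q[0]) ** 2 + (y - q[1]) ** 2
        if d2 == 0.0:
            return v  # query coincides with a sample
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

# Pressure samples recorded along two non-intersecting lines, y=0 and y=1.
samples = [(0, 0, 1.0), (1, 0, 2.0), (0, 1, 3.0), (1, 1, 4.0)]
print(idw(samples, (0.5, 0.5)))  # 2.5, the symmetric centre value
```

The surface interpolates the data exactly and can then be contoured for display; the paper's algorithm additionally exploits the line structure of the samples.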

9.
Summary A formal functional specification of a serializable interface for an interactive database is given and refined into two different versions with distinct strategies for resolving read/write conflicts. The formalization is based on techniques of algebraic specification for defining the basic data structures, and on functional system specification by streams and stream-processing functions for defining the properties concerning interaction. It is especially demonstrated how different specification techniques can be used side by side.

Manfred Broy finished his studies with the Diplom in Mathematics and Computer Science at the Technical University of Munich. Until 1983 he was a research and teaching assistant at the Institut für Informatik and the Sonderforschungsbereich 49 "Programmiertechnik". At the Technical University of Munich he also completed his Ph.D. (in February 1980, on the subject "Transformation parallel ablaufender Programme") and qualified as a university lecturer (in 1982, on the subject "A Theory for Nondeterminism, Parallelism, Communication and Concurrency"). In April 1983 he became a Full Professor of Computer Science at the Faculty of Mathematics and Computer Science of the University of Passau. Since October 1989 he has been a Full Professor of Computer Science at the Technical University of Munich. His fields of interest are programming languages, program development, programming methodology and distributed systems.

This work was supported by the DFG project "Transformation paralleler Programme" and by the Sonderforschungsbereich 342 "Werkzeuge und Methoden für die Nutzung paralleler Architekturen".

10.
This paper discusses some of the key drivers that will enable businesses to operate effectively on-line, and looks at how the notion of a website will evolve into that of an on-line presence supporting the main activities of an organisation. This is placed in the context of the development of the information society, which will allow individuals, as consumers or employees, quick, inexpensive and on-demand access to vast quantities of entertainment, services and information. The paper draws on an example of these developments in Australasia.

11.
In response to Elliott and Valenza's 'And Then There Were None' (1996), Donald Foster has taken strenuous issue with our Shakespeare Clinic's final report, which concluded that none of the testable Shakespeare claimants, and none of the Shakespeare Apocrypha poems and plays – including the Funeral Elegy by W.S. – matches Shakespeare. Though he seems to accept most of our exclusions – notably excepting those of the Elegy and A Lover's Complaint – he believes that our methodology is nonetheless fatally flawed by "worthless figures ... wrong more often than right", rigorous "cherry-picking", "playing with a stacked deck", and conveniently "exil[ing] ... inconvenient data". He describes our tests as "foul vapor" and "methodological madness".

We believe that this criticism is seriously overdrawn, and that our tests and conclusions have emerged essentially intact. By our count, he claims to have found 21 errors of consequence in our report. Only five of these claims, all trivial, have any validity at all. If fully proved, they might call for some cautions and slight refinements to five of our 54 tests, but in no case would they come close to invalidating the questioned test. The remaining 49 tests are wholly intact. Total erosion of our findings from the Foster critique could amount, at most, to half of one percent. None of his accusations of cherry-picking, deck-stacking, and evidence-ignoring is substantiated.

12.
Summary Reasoning about programs involves a logical framework which seems to go beyond classical predicate logic. LAR is an extension of predicate logic by additional concepts intended to formalize our natural reasoning about algorithms. Semantically, this extension introduces an underlying time scale on which formulas are considered, together with time-shifting connectives. Besides a full model-theoretic treatment, a consistent and complete formal system for LAR is given. The pure logical system can serve as a basis for various theories. As an example, a theory of while-program schemes is developed which contains Hoare's correctness proof system.

13.
Pizer and Eberly introduced the core as the analogue of the medial axis for greyscale images. For two-dimensional images, it is obtained as the ridge of a medial function defined on (2+1)-dimensional scale space. The medial function is defined using Gaussian blurring and measures the extent to which a point is in the center of the object as measured at a given scale. Numerical calculations indicate that the core has properties quite different from those of the medial axis. In this paper we give the generic properties of ridges and cores for two-dimensional images and explain the discrepancy between core and medial-axis properties. We place cores in a larger relative critical set structure, which coherently relates disjoint pieces of core. We also give the generic transitions which occur for sequences of images varying with a parameter such as time. The genericity implies the stability of the full structure, in any compact viewing area of scale space, under sufficiently small L2 perturbations of the image intensity function. We indicate consequences for finding cores and also for adding markings to completely determine the structure of the medial function.

14.
Conclusion There has been no attempt in this introduction to put forward a particular method for dealing with these challenges, nor to assess the full contribution of the articles in this issue. This outline and discussion is intended merely to stimulate interdisciplinary debate and to provide some of the background needed to make this possible. A full account would at least have involved a broader review of the background of McLoughlin, and of Aicholzer and Schienstock, in developments within the industrial sociology or industrial relations disciplines. Their contributions do, however, provide a good introduction to the traditions within which they are working, so it is not necessary to provide more information here. It is nevertheless important to note, in McLoughlin's case, that his analysis of technological systems and system architectures is based on earlier work by McLoughlin and his colleagues, cited in the article, on the complex nature of engineering systems and the importance of taking this complexity into account in any discussion of the impact of technology on organisation. If the result of this issue is to stimulate system designers to read further in such areas, or to encourage industrial sociologists to become more involved in research directed towards human-oriented system design, then it will have served its purpose.

15.
Processing of a set of multi-level digital certificates, particularly path construction and validation, can be excessively resource-consuming, and even impractical in some cases. This article introduces classifications of certificate sets as minimal, surplus, and deficient, and explains the new paradigm of a recursive certificate structure designed to provide the equivalent of a minimal set of conventional certificates, containing only the necessary and sufficient information, in order to minimize the effort needed to validate a certificate sequence, with the potential to avoid duplicating validation work previously handled by related Certification Authorities.
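For orientation, the conventional path-validation task that the recursive structure aims to streamline can be sketched as a walk along issuer links. Certificates are simplified to (subject, issuer) pairs and signature checking is elided; this is a hypothetical sketch of the baseline, not the article's recursive scheme:

```python
def validate_chain(chain, trusted_roots):
    """Walk a leaf-to-root certificate sequence, checking that each
    certificate's issuer is the subject of the next one and that the
    chain ends in a trusted root. Signature and expiry checks, the
    expensive part in practice, are omitted here."""
    for (subj, issuer), (next_subj, _) in zip(chain, chain[1:]):
        if issuer != next_subj:
            return False  # broken issuer link
    return chain[-1][0] in trusted_roots

chain = [("server", "intermediate"), ("intermediate", "root"), ("root", "root")]
print(validate_chain(chain, {"root"}))  # True
```

A recursive certificate, as described above, would bundle exactly the information this walk needs, so that a relying party need not reconstruct or re-validate the chain itself.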

16.
We analyze four İnce Memed novels of Yaşar Kemal using six style markers: most frequent words, syllable counts, word-type (part-of-speech) information, sentence length in terms of words, word length in text, and word length in vocabulary. For analysis, we divide each novel into five-thousand-word text blocks and count the frequencies of each style marker in these blocks. The style markers showing the best separation are most frequent words and sentence lengths. We use stepwise discriminant analysis to determine the best discriminators within each style marker. We then use these markers in cross-validation-based discriminant analysis. Further investigation based on multivariate analysis of variance (MANOVA) reveals how the attributes of each style-marker group distinguish among the volumes.
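The block-and-count step above is straightforward to sketch. This simplified version handles only one of the six markers (most-frequent-word rates per block); the block size and demo text are illustrative:

```python
from collections import Counter

def block_frequencies(text, block_size=5000, top_words=10):
    """Split a text into fixed-size word blocks and, for each block,
    compute the relative frequency of the corpus-wide most frequent
    words: one row of style-marker features per block."""
    words = text.split()
    blocks = [words[i:i + block_size] for i in range(0, len(words), block_size)]
    vocab = [w for w, _ in Counter(words).most_common(top_words)]
    rows = []
    for b in blocks:
        counts = Counter(b)
        rows.append([counts[w] / len(b) for w in vocab])
    return vocab, rows

# Tiny demo with 4-word blocks instead of 5000-word blocks.
text = "the cat sat on the mat the end"
vocab, rows = block_frequencies(text, block_size=4, top_words=2)
print(vocab)  # ['the', 'cat']
```

The resulting rows are the observations fed into discriminant analysis, one feature vector per text block.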

17.
New algorithms for stochastic approximation under input disturbance are designed. For the multidimensional case, they are simple in form, generate consistent estimates of the unknown parameters under almost arbitrary disturbances, and are easily incorporated into the design of quantum devices for estimating the gradient vector of a function of several variables.
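The abstract does not give the algorithms themselves; as a generic illustration of gradient estimation from disturbed function evaluations, here is a simultaneous-perturbation estimate in the style of SPSA (an assumed example, not the paper's method):

```python
import random

def spsa_gradient(f, x, c=0.1):
    """One simultaneous-perturbation gradient estimate of f at x:
    perturb all coordinates at once with random signs, so only two
    (possibly noisy) function evaluations are needed regardless of
    the dimension."""
    delta = [random.choice((-1.0, 1.0)) for _ in x]
    xp = [xi + c * di for xi, di in zip(x, delta)]
    xm = [xi - c * di for xi, di in zip(x, delta)]
    diff = f(xp) - f(xm)
    return [diff / (2 * c * di) for di in delta]

# Disturbed quadratic: the true gradient at x is 2x.
noisy = lambda x: sum(xi * xi for xi in x) + random.gauss(0, 0.01)
print(spsa_gradient(noisy, [1.0, -2.0]))
```

A single estimate is biased by cross terms, but the bias averages out over iterations, which is what makes such schemes consistent under weak assumptions on the disturbance.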

18.
Summary Equivalence is a fundamental notion for the semantic analysis of algebraic specifications. In this paper the notion of crypt-equivalence is introduced and studied with respect to two loose approaches to the semantics of an algebraic specification T: the class of all first-order models of T and the class of all term-generated models of T. Two specifications are called crypt-equivalent if for one specification there exists a predicate-logic formula which implicitly defines an expansion (by new functions) of every model of that specification in such a way that the expansion (after forgetting unnecessary functions) is homologous to a model of the other specification, and if vice versa there exists another predicate-logic formula with the same properties for the other specification. We speak of first-order crypt-equivalence if this holds for all first-order models, and of inductive crypt-equivalence if this holds for all term-generated models. Characterizations and structural properties of these notions are studied. In particular, it is shown that first-order crypt-equivalence is equivalent to the existence of explicit definitions, and that in the case of positive definability two first-order crypt-equivalent specifications admit the same categories of models and homomorphisms. Similarly, two specifications which are inductively crypt-equivalent via sufficiently complete implicit definitions determine the same associated categories. Moreover, crypt-equivalence is compared with other notions of equivalence for algebraic specifications: in particular, it is shown that first-order crypt-equivalence is strictly coarser than abstract semantic equivalence, and that inductive crypt-equivalence is strictly finer than inductive simulation equivalence and implementation equivalence.

19.
This work is about a real-world application of automated deduction. The application is the management of documents (such as mathematical textbooks) as they occur in a readily available tool. In this Slicing Information Technology tool, documents are decomposed ("sliced") into small units. A particular application task is to assemble a new document from such units in a selective way, based on the user's current interest and knowledge. It is argued that this task can be naturally expressed through logic, and that automated deduction technology can be exploited to solve it. More precisely, we rely on first-order clausal logic with a default-negation principle, and we propose a model-computation theorem prover as a suitable deduction mechanism. Beyond solving the task at hand as such, with this work we contribute to the quest for arguments in favor of automated deduction techniques in the real world. We also argue why we think that automated deduction techniques are the best choice here.
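The paper solves unit selection by deduction; as a much simpler illustration of the underlying assembly task, one can compute the prerequisite closure of the requested units while skipping those the reader already knows (all unit names hypothetical; the actual system uses a model-computation prover over clausal logic with default negation):

```python
def assemble(deps, wanted, known):
    """Collect the slices needed to present the 'wanted' units, following
    prerequisite links and skipping units the reader already knows.
    A plain graph traversal, shown only to make the task concrete."""
    out, stack, seen = [], list(wanted), set(known)
    while stack:
        u = stack.pop()
        if u in seen:
            continue
        seen.add(u)
        out.append(u)
        stack.extend(deps.get(u, []))
    return out

deps = {"theorem": ["lemma", "definition"], "lemma": ["definition"]}
print(assemble(deps, ["theorem"], ["definition"]))  # ['theorem', 'lemma']
```

The logical formulation goes beyond this: default negation lets the system express, for example, that a unit is included unless it is already known, without enumerating the exceptions by hand.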

20.
The transition rule F of a cellular automaton may sometimes be regarded as a rule of growth of a crystal from a seed. A study is made of the iterates F, F², .... For certain one-dimensional growth rules, the limiting shapes of the crystals are computed, and an asymptotic formula for the size of the crystal as a function of time is obtained.
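The simplest such growth rule, occupation spreading to nearest neighbours, already exhibits the linear asymptotics the abstract mentions; the following sketch (an illustrative rule, not one from the paper) iterates it and records crystal size over time:

```python
def grow(rule, seed, steps):
    """Iterate a one-dimensional additive growth rule: at each step a
    cell is occupied if some occupied cell reaches it via an offset in
    'rule'. Returns the crystal size after each application of F."""
    cells = set(seed)
    sizes = [len(cells)]
    for _ in range(steps):
        cells = {c + d for c in cells for d in rule}
        sizes.append(len(cells))
    return sizes

# Nearest-neighbour growth from a single seed: size 2t + 1 after t steps.
print(grow((-1, 0, 1), {0}, 4))  # [1, 3, 5, 7, 9]
```

Here the limiting shape is an interval growing at unit speed; the paper's results cover more general one-dimensional rules and their exact asymptotics.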
