Similar Documents
20 similar records found.
1.
This paper discusses how the Application Perspective works in practice. We talk about values and attitudes towards system development and computer systems, and we illustrate how they have been carried out in practice with examples from the Florence project. The metaphors utensil and epaulet refer to questions about how we conceive of the computer system we are to design in the system development process. Our experience is that, in the scientific community, technical challenges mean making computer systems that may be characterised as epaulets: they have fancy technical features but are not particularly useful. Making small, simple, but useful computer systems, more like utensils, does not earn as much credit, even if the development process may be just as challenging.

2.
Modular Control and Coordination of Discrete-Event Systems
In the supervisory control of discrete-event systems based on controllable languages, a standard way to handle state explosion in large systems is modular supervision: either horizontal (decentralized) or vertical (hierarchical). However, unless all the relevant languages are prefix-closed, a well-known potential hazard with modularity is conflict. In decentralized control, modular supervisors that are individually nonblocking for the plant may nevertheless produce blocking, or even deadlock, when operating online concurrently. Similarly, a high-level hierarchical supervisor that predicts nonblocking at its aggregated level of abstraction may inadvertently admit blocking in a low-level implementation. In two previous papers, the authors showed that nonblocking hierarchical control can be guaranteed provided high-level aggregation is sufficiently fine; the appropriate conditions were formalized in terms of control structures and observers. In this paper we apply the same technique to decentralized control, when specifications are imposed on local models of the global process; in this way we remove the restriction in some earlier work that the plant and specification (marked) languages be prefix-closed. We then solve a more general problem of coordination: namely, how to determine a high-level coordinator that forestalls conflict in a decentralized architecture when it potentially arises, but is otherwise minimally intrusive on low-level control action. Coordination thus combines both vertical and horizontal modularity. The example of a simple production process is provided as a practical illustration. We conclude with an appraisal of the computational effort involved.
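As a concrete illustration of the conflict hazard discussed above, the following sketch (a minimal Python toy, not the authors' algorithm; all state and event names are invented) composes two finite automata by synchronous product and checks nonblocking, i.e. that every reachable joint state can still reach a marked state. Each component below reaches its own marked state in isolation, yet their joint behaviour deadlocks immediately.

```python
# Toy nonblocking (nonconflict) check for two modular supervisors, each given as
# (initial_state, marked_states, {(state, event): next_state}).
def sync_product(a1, a2):
    """Synchronous product: shared events need both components; private events interleave."""
    init1, marked1, delta1 = a1
    init2, marked2, delta2 = a2
    ev1 = {e for (_, e) in delta1}
    ev2 = {e for (_, e) in delta2}
    init = (init1, init2)
    delta, seen, stack = {}, {init}, [init]
    while stack:
        s1, s2 = stack.pop()
        for e in ev1 | ev2:
            # A component takes part in e only if e belongs to its alphabet.
            t1 = delta1.get((s1, e)) if e in ev1 else s1
            t2 = delta2.get((s2, e)) if e in ev2 else s2
            if t1 is None or t2 is None:      # some participant blocks e here
                continue
            delta[((s1, s2), e)] = (t1, t2)
            if (t1, t2) not in seen:
                seen.add((t1, t2))
                stack.append((t1, t2))
    marked = {(m1, m2) for m1 in marked1 for m2 in marked2} & seen
    return init, marked, delta, seen

def is_nonblocking(marked, delta, reachable):
    """True iff every reachable product state can still reach a marked state."""
    coreach, changed = set(marked), True
    while changed:
        changed = False
        for (s, _e), t in delta.items():
            if t in coreach and s not in coreach:
                coreach.add(s)
                changed = True
    return reachable <= coreach

# Each supervisor alone reaches its marked state, but jointly they deadlock at once.
S1 = ("p0", {"p2"}, {("p0", "a"): "p1", ("p1", "b"): "p2"})
S2 = ("q0", {"q2"}, {("q0", "b"): "q1", ("q1", "a"): "q2"})
_init, marked, delta, reachable = sync_product(S1, S2)
print(is_nonblocking(marked, delta, reachable))   # False -> the supervisors conflict
```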

3.
The notion of obvious inference in predicate logic is discussed from the viewpoint of proof-checker applications in logic and mathematics education. A class of inferences in predicate logic is defined and it is proposed to identify it with the class of obvious logical inferences. The definition is compared with other approaches. The algorithm for implementing the obviousness decision procedure follows directly from the definition.

4.
'Racial' disparities among cancers, particularly of the breast and prostate, are something of a mystery. For the US, in the face of slavery and its sequelae, centuries of interbreeding have greatly leavened genetic differences between Blacks and Whites, but marked contrasts in disease prevalence and progression persist. Adjustment for socioeconomic status and lifestyle, while statistically accounting for much of the variance in breast cancer, only begs the question of ultimate causality. Here we propose a more basic biological explanation that extends the theory of immune cognition to include an elaborate tumor control mechanism constituting the principal selection pressure acting on pathologically mutating cell clones. The interplay between them occurs in the context of an embedding, highly structured system of culturally specific psychosocial stress. A rate-distortion argument finds that the larger system is able to literally write an image of itself onto the disease process, in terms of enhanced risk behaviour, accelerated mutation rate, and depressed mutation control. The dynamics are analogous to punctuated equilibrium in simple evolutionary systems, accounting for the staged nature of disease progression. We conclude that 'social exposures' are, for human populations, far more than incidental cofactors in cancer etiology. Rather, they are part of the basic biology of the disorder. The aphorism that culture is as much a part of human biology as the enamel on our teeth appears literally true at a fundamental cellular level.

5.
This paper studies fool's models of combinatory logic and relates them to Hindley's D-completeness problem. A fool's model is a family of sets of formulas, closed under condensed detachment. Alternatively, it is a model of CL in naive set theory. We examine resolution and the P-W problem. A sequel shows that T is D-complete, as are its extensions. We close with FMO, an implementation of these ideas.

6.
Semantics connected to some information-based metaphor are well known in the logic literature: a paradigmatic example is the Kripke semantics for intuitionistic logic. In this paper we start from the concrete problem of providing suitable logic-algebraic models for the calculus of attribute dependencies in Formal Contexts with information gaps, and we obtain an intuitive model based on the notion of passage of information, showing that Kleene algebras, semi-simple Nelson algebras, three-valued Łukasiewicz algebras, and Post algebras of order three are, in a sense, naturally and directly connected to partially defined information systems. In this way we can provide for these logic-algebraic structures a raison d'être different from the original motivations concerning, for instance, computability theory.
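For readers unfamiliar with the three-valued structures mentioned above, the following minimal sketch (our own illustration, not taken from the paper) encodes the strong Kleene connectives in Python, with None standing for an information gap in a partially defined information system.

```python
# Strong Kleene three-valued connectives; None encodes an information gap.
# The encoding below is an illustration of how such algebras model partially
# defined information, not the paper's own construction.
ORDER = {False: 0, None: 1, True: 2}   # truth order: False < gap < True

def k_not(x):
    return None if x is None else not x

def k_and(x, y):
    return min(x, y, key=ORDER.get)

def k_or(x, y):
    return max(x, y, key=ORDER.get)

# A gap propagates unless the other operand already decides the result.
assert k_and(False, None) is False     # a false conjunct settles the conjunction
assert k_or(True, None) is True        # a true disjunct settles the disjunction
assert k_and(True, None) is None       # otherwise the gap remains
```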

7.
In this paper we use a free-fall approach to develop a high-level control/command strategy for a bipedal robot called BIPMAN, based on a multi-chain mechanical model with a general control architecture. The strategy is composed of three levels: the Legs and Arms level, the Coordinator level, and the Supervisor level. The Coordinator level is devoted to controlling leg movements and to ensuring the stability of the whole biped. Perturbation effects threaten the equilibrium of the humanoid robot and can only be compensated using a dynamic control strategy, which is based on dynamic stability studies with center-of-mass acceleration control and a force distribution on each leg and arm. Free fall in the gravity field is assumed to be deeply involved in human locomotor control. Based on studies of this specific motion through a direct dynamic model, the notion of equilibrium classes is introduced. They allow one to define time intervals in which the biped is able to maintain its posture. This notion is used for the definition of a reconfigurable high-level control of the robot.

8.
When implementing computational lexicons it is important to keep in mind the texts that an NLP system must deal with. Words relate to each other in many different, often odd ways; this information is rarely found in dictionaries, and it is quite hard to deduce a priori. In this paper we present a technique for the acquisition of statistically significant selectional restrictions from corpora and discuss the results of an experimental application with reference to two specific sublanguages (legal and commercial). We show that there are important cooccurrence preferences among words which cannot be established a priori, as they are determined for each choice of sublanguage. The method for detecting cooccurrences is based on the analysis of word associations augmented with syntactic markers and semantic tags. Word pairs are extracted by a morphosyntactic analyzer and clustered according to their semantic tags. A statistical measure is applied to the data to evaluate the significance of any relations detected. Selectional restrictions are acquired by a two-step process. First, statistically prevailing coarse-grained conceptual patterns are used by a linguist to identify the relevant selectional restrictions in the sublanguages. Second, the semiautomatically acquired coarse selectional restrictions are used as the semantic bias of a system, ARIOSTO_LEX, for the automatic acquisition of a case-based semantic lexicon.
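The abstract does not specify the statistical measure or the ARIOSTO_LEX pipeline, so the sketch below only illustrates the general idea of scoring word/semantic-tag associations extracted from tagged text; the triples, tags, and the use of pointwise mutual information are assumptions made for this example.

```python
# Toy association scoring for selectional-restriction candidates. The triples,
# semantic tags, and the PMI measure are invented for illustration; the paper's
# own statistical measure and the ARIOSTO_LEX system are not reproduced here.
import math
from collections import Counter

# (head, syntactic_relation, semantic_tag_of_dependent) triples, as a
# morphosyntactic analyzer might emit for a legal/commercial sublanguage.
triples = [
    ("sign", "obj", "DOCUMENT"), ("sign", "obj", "DOCUMENT"), ("sign", "obj", "PERSON"),
    ("pay", "obj", "AMOUNT"), ("pay", "obj", "AMOUNT"), ("pay", "obj", "DOCUMENT"),
]

pair_counts = Counter((h, t) for h, _r, t in triples)
head_counts = Counter(h for h, _r, _t in triples)
tag_counts = Counter(t for _h, _r, t in triples)
n = len(triples)

def pmi(head, tag):
    """Pointwise mutual information between a head word and a semantic tag."""
    joint = pair_counts[(head, tag)] / n
    return math.log2(joint / ((head_counts[head] / n) * (tag_counts[tag] / n)))

# Higher scores suggest candidate selectional restrictions, e.g. sign -> DOCUMENT.
for (h, t), c in pair_counts.items():
    print(f"{h} + {t}: count={c}, pmi={pmi(h, t):.2f}")
```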

9.
Environmental protection activities in industry have increased rapidly in number over recent years. Additionally, surveys of environmental activities have identified a change in the kinds of approaches used for environmental problem solving. A new paradigm, Clean Technology, has been developed, which gradually seems to be replacing the Clean-up Technology paradigm and the older Dilute and Disperse paradigm. The new Clean Technology paradigm brings with it not only a new way of looking at environmental protection, but also a range of rules guiding the application of technology and the design of technological systems. This paper presents a few case studies highlighting and evaluating Clean Technology activities.

10.
In this paper, we propose a statistical method to automatically extract collocations from a Korean POS-tagged corpus. Since a large portion of language is represented by collocation patterns, collocational knowledge provides a valuable resource for NLP applications. One difficulty of collocation extraction is that Korean has a partially free word order, which also appears in collocations. In this work, we exploit four statistics, frequency, randomness, convergence, and correlation, in order to take into account the flexible word order of Korean collocations. We separate meaningful bigrams using an evaluation function based on the four statistics and extend the bigrams to n-gram collocations using a fuzzy relation. Experiments show that this method works well for Korean collocations.
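The four statistics are named but not defined in this abstract. Purely as a loose illustration, the sketch below ranks candidate pairs by co-occurrence frequency and uses the entropy of their positional offsets as a stand-in for the "randomness" statistic under flexible word order; the toy sentences, romanized tokens, and tags are invented, and the other three statistics are omitted.

```python
# Rough stand-in for two of the four statistics: co-occurrence frequency and a
# 'randomness' proxy (entropy of positional offsets). Not the paper's method.
import math
from collections import Counter, defaultdict

sentences = [
    ["son-eul/N", "kkok/ADV", "japda/V"],
    ["kkok/ADV", "son-eul/N", "japda/V"],
    ["son-eul/N", "japda/V"],
    ["son-eul/N", "deulda/V"],
]

cooc = defaultdict(list)          # (w1, w2) -> signed offsets observed in sentences
for sent in sentences:
    for i, w1 in enumerate(sent):
        for j, w2 in enumerate(sent):
            if i != j:
                cooc[(w1, w2)].append(j - i)

def offset_entropy(offsets):
    """High entropy of offsets = the pair co-occurs with flexible word order."""
    counts, total = Counter(offsets), len(offsets)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

for (w1, w2), offs in sorted(cooc.items(), key=lambda kv: -len(kv[1])):
    if w1 < w2:                   # report each unordered pair once
        print(w1, w2, "freq:", len(offs), "offset entropy:", round(offset_entropy(offs), 2))
```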

11.
Coordinating Multiple Agents via Reinforcement Learning
In this paper, we attempt to use reinforcement learning techniques to solve agent coordination problems in task-oriented environments. The Fuzzy Subjective Task Structure model (FSTS) is presented to model general agent coordination. We show that an agent coordination problem modeled in FSTS is a Decision-Theoretic Planning (DTP) problem, to which reinforcement learning can be applied. Two learning algorithms, coarse-grained and fine-grained, are proposed to address agents' coordination behavior at two different levels. The coarse-grained algorithm operates at one level and tackles hard system constraints; the fine-grained algorithm operates at the other level and handles soft constraints. We argue that it is important to explicitly model and explore coordination-specific information (particularly system constraints), which underpins the two algorithms and contributes to their effectiveness. The algorithms are formally proved to converge and experimentally shown to be effective.
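The coarse- and fine-grained algorithms themselves are not given in this abstract. Purely as a generic illustration of reinforcement learning applied to a coordination problem, the sketch below runs tabular Q-learning on a two-agent matching game; it is not the FSTS/DTP formulation and uses only invented rewards and parameters.

```python
# Generic illustration: tabular Q-learning on a two-agent matching game in which
# the agents are rewarded only when they choose the same action.
import random

ACTIONS = [0, 1]
q = {(a1, a2): 0.0 for a1 in ACTIONS for a2 in ACTIONS}   # joint-action values
alpha, epsilon = 0.1, 0.2

def reward(a1, a2):
    return 1.0 if a1 == a2 else 0.0          # coordination succeeds on a match

random.seed(0)
for _ in range(2000):
    if random.random() < epsilon:            # explore a random joint action
        joint = (random.choice(ACTIONS), random.choice(ACTIONS))
    else:                                    # otherwise act greedily
        joint = max(q, key=q.get)
    # Single-state problem, so the Q-update has no bootstrapped next-state term.
    q[joint] += alpha * (reward(*joint) - q[joint])

print(max(q, key=q.get))                     # a coordinated joint action, e.g. (0, 0)
```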

12.
The temporal property to-always has been proposed for specifying progress properties of concurrent programs. Although the to-always properties are a subset of the leads-to properties for a given program, to-always has more convenient proof rules and in some cases more accurately describes the desired system behavior. In this paper, we give a predicate transformer wta, derive some of its properties, and use it to define to-always. Proof rules for to-always are derived from the properties of wta. We conclude by briefly describing two application areas, nondeterministic data flow networks and self-stabilizing systems, where to-always properties are useful.
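The paper's formal definitions of wta and to-always are not reproduced in this abstract; the following gloss is only an assumed temporal-logic reading, stated so that it is consistent with the subset claim above.

```latex
% Assumed reading only (not the paper's definition of wta or to-always):
\[
  p \;\textit{leads-to}\; q \;\equiv\; \Box\bigl(p \Rightarrow \Diamond q\bigr),
  \qquad
  p \;\textit{to-always}\; q \;\equiv\; \Box\bigl(p \Rightarrow \Diamond\Box q\bigr).
\]
% Since \(\Diamond\Box q\) implies \(\Diamond q\), every property of the second
% form is also of the first, matching the subset claim in the abstract.
```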

13.
This paper aims to provide a basis for renewed talk about use in computing. Four current discourse arenas are described. Different intentions manifest in each arena are linked to failures in translation, with different terminologies crossing disciplinary and national boundaries non-reflexively. Analysis of transnational use-discourse dynamics shows much miscommunication. Conflicts like that between the Scandinavian System Development School and the usability approach have less current salience. Renewing our talk about use is essential to a participatory politics of information technology and will lead to a clearer perception of the implications of letting new systems become primary media of social interaction.

14.
This paper presents enhancements for robust two-and-three-quarter-dimensional meshing, including: (1) automated interval assignment by integer programming for submapped surfaces and volumes, (2) surface submapping, and (3) volume submapping. An introduction to the simplex method, an optimization technique used in integer programming, is presented. Simplification of complex geometry is required for the formulation of the integer programming problem. A method of i-j unfolding is defined, which explains how irregular geometry can be realigned into a simplified form suitable for submap interval assignment solutions. Also presented are the processes by which submapping eliminates the decomposition of surface geometry, through a pseudo-decomposition process, producing suitable mapped meshes. The process of submapping involves the creation of interpolated virtual edges, user-defined vertex types, and i-j-k space traversals. The creation of interpolated virtual edges is the method by which submapping automatically subdivides surface geometry. The interpolated virtual edge is formulated according to an interpolation scheme using the node discretization of curves on the surface. User-defined vertex types allow direct user control of surface decomposition and interval assignment by modifying i-j-k space traversals. Volume submapping takes the geometry decomposition to a higher level by using mapped virtual surfaces to eliminate decomposition of complex volumes.
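As a hedged sketch of the kind of constraint system interval assignment produces (not the paper's formulation; the variables, bounds, and SciPy solver are assumptions), the following tiny program forces opposite sides of one mapped surface to carry equal interval counts while respecting per-curve minimums, solving the relaxation as a linear program.

```python
# Tiny interval-assignment toy: opposite sides of a mapped quad surface must
# carry equal interval counts, and each curve has a minimum count.
from scipy.optimize import linprog

c = [1, 1, 1, 1]                       # minimise total intervals x1 + x2 + x3 + x4
A_eq = [[1, 0, -1, 0],                 # x1 - x3 = 0  (opposite sides equal)
        [0, 1, 0, -1]]                 # x2 - x4 = 0
b_eq = [0, 0]
bounds = [(4, None), (3, None), (2, None), (5, None)]   # per-curve minimums

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print(res.x)   # [4. 5. 4. 5.] here; integral by luck, but in general a
               # branch-and-bound step is needed to enforce integer intervals
```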

15.
T. Cox, Virtual Reality, 2000, 5(4): 215-222
This paper gives a broad overview of the technology and market for on-line and multiplayer computer gaming. Some economic considerations and their influence on the choice of technologies are examined. Particular attention is given to the massively-multiplayer and persistent world type of games, and the special problems that arise in these environments. Lastly, some ongoing problems are investigated, particularly the thorny issue of cheating in multiplayer games.

16.
The “explicit-implicit” distinction
Much of traditional AI exemplifies the explicit representation paradigm, and during the late 1980s a heated debate arose between the classical and connectionist camps as to whether beliefs and rules receive an explicit or implicit representation in human cognition. In a recent paper, Kirsh (1990) questions the coherence of the fundamental distinction underlying this debate. He argues that our basic intuitions concerning explicit and implicit representations are not only confused but inconsistent. Ultimately, Kirsh proposes a new formulation of the distinction, based upon the criterion of constant-time processing. The present paper examines Kirsh's claims. It is argued that Kirsh fails to demonstrate that our usage of explicit and implicit is seriously confused or inconsistent. Furthermore, it is argued that Kirsh's new formulation of the explicit-implicit distinction is excessively stringent, in that it banishes virtually all sentences of natural language from the realm of explicit representation. By contrast, the present paper proposes definitions for explicit and implicit which preserve most of our strong intuitions concerning straightforward uses of these terms. It is also argued that the distinction delineated here sustains the meaningfulness of the above-mentioned debate between classicists and connectionists.

17.
When Physical Systems Realize Functions...
After briefly discussing the relevance of the notions of computation and implementation for cognitive science, I summarize some of the problems that have been found in their most common interpretations. In particular, I argue that standard notions of computation, together with a state-to-state correspondence view of implementation, cannot overcome the difficulties posed by Putnam's Realization Theorem and that, therefore, a different approach to implementation is required. The notion of realization of a function, developed out of physical theories, is then introduced as a replacement for the notional pair computation-implementation. After gradual refinement, taking practical constraints into account, this notion gives rise to the notion of a digital system, which singles out physical systems that could actually be used, and possibly even built.

18.
Speech perception relies on the human ability to decode continuous, analogue sound-pressure waves into discrete, symbolic labels (phonemes) with linguistic meaning. Aspects of this signal-to-symbol transformation have been intensively studied over many decades, using psychophysical procedures. The perception of (synthetic) syllable-initial stop consonants has been especially well studied, since these sounds display a marked categorization effect: they are typically dichotomised into voiced and unvoiced classes according to their voice onset time (VOT). In this case, the category boundary is found to have a systematic relation to the (simulated) place of articulation, but there is no currently accepted explanation of this phenomenon. Categorization effects have now been demonstrated in a variety of animal species as well as humans, indicating that their origins lie in general auditory and/or learning mechanisms, rather than in some phonetic module specialized to human speech processing. In recent work, we have demonstrated that appropriately trained computational learning systems (neural networks) also display the same systematic behaviour as human and animal listeners. Networks are trained on simulated patterns of auditory-nerve firings in response to synthetic continua of stop-consonant/vowel syllables varying in place of articulation and VOT. Unlike real listeners, such a software model is amenable to analysis aimed at extracting the phonetic knowledge acquired in training, so providing a putative explanation of the categorization phenomenon. Here, we study three learning systems: single-layer perceptrons, support vector machines, and Fisher linear discriminants. We highlight similarities and differences between these approaches. We find that support vector machines, a modern inductive inference technique for small sample sizes, give the most convincing results. Knowledge extracted from the trained machine indicated that the phonetic percept of voicing is easily and directly recoverable from auditory (but not acoustic) representations.
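The sketch below mimics the flavour of this experiment with a support vector machine trained on synthetic (VOT, place-of-articulation) data; the data, the place-dependent boundary values, and the use of scikit-learn are assumptions for illustration and do not reproduce the paper's auditory-nerve representations.

```python
# Synthetic stand-in for the categorization experiment: an SVM labels stops as
# voiced/unvoiced from voice onset time (VOT) plus a coded place of articulation.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
place = rng.integers(0, 3, size=300)        # 0 = bilabial, 1 = alveolar, 2 = velar
vot = rng.uniform(0.0, 80.0, size=300)      # voice onset time in milliseconds
boundary = 20 + 10 * place                  # assumed boundary shifting with place
voiced = (vot < boundary).astype(int)       # 1 = voiced, 0 = unvoiced

X = np.column_stack([vot, place])
clf = SVC(kernel="linear").fit(X, voiced)

# Read the implied VOT boundary per place back off the separating hyperplane.
w, b = clf.coef_[0], clf.intercept_[0]
for p in range(3):
    print(f"place {p}: recovered boundary ~ {-(w[1] * p + b) / w[0]:.1f} ms")
```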

19.
This paper presents an alternative to the speech-acts-with-STRIPS approach to implementing dialogue: a fully implemented AI planner which generates and analyses the semantics of utterances using a single linguistic act for all contexts. Using this act, the planner can model problematic conversational situations, including felicitous and infelicitous instances of bluffing, lying, sarcasm, and stating the obvious. The act has negligible effects, and its precondition can always be proved. Speaker maxims enable the speaker to plan to deceive, as well as to generate implicatures, while hearer maxims enable the hearer to recognise deceptions and interpret implicatures. The planner proceeds by achieving parts of the constructive proof of a goal. It incorporates an epistemic theorem prover, which embodies a deduction model of belief, and a constructive logic.

20.
Given (1) Wittgenstein's externalist analysis of the distinction between following a rule and behaving in accordance with a rule, (2) prima facie connections between rule-following and psychological capacities, and (3) pragmatic issues about training, it follows that most, perhaps even all, future artificially intelligent computers and robots will not use language, possess concepts, or reason. This argument suggests that AI's traditional aim of building machines with minds, exemplified in current work on cognitive robotics, is in need of substantial revision.
