Similar Documents
20 similar documents found (search time: 218 ms)
1.
Bookisms:     
In this issue we review three books: two deal with knowledge and knowledge management, and the third with the many collaboration techniques coming to market. The first two are solid works by leaders in the field: Laurence Prusak, coauthor with Tom Davenport of the most widely read knowledge management book published in the United States (Davenport and Prusak, 1995; see review in Volume 15, No. 3, Summer 1998 issue of ISM), and Ikujiro Nonaka (1991) of Japan, who wrote the original knowledge management article in Harvard Business Review. The third book, Wikinomics, by Don Tapscott and coauthor, praises collaboration developments from wikis to MySpace and beyond, written in a breathless advertising style that may not appeal to everyone.

2.
The Single European Market is both the single largest opportunity and the single largest threat to face European companies in the 1990s. The survival of many companies will depend on how well they address these opportunities and react to these threats. First, they need to understand the Single Market issues and legislation, but expertise in these areas is expensive and rare. Many organisations, particularly smaller ones or those affected by the recession, cannot afford to pay for this expertise. This paper describes PHAROS, an expert system which advises small to medium-sized companies on how changes in Single European Market and environmental legislation will affect their business. PHAROS was developed by National Westminster Bank and Ernst & Young Management Consultants. To date, PHAROS has cost over £2 million to develop and distribute. PHAROS achieved international acclaim when it was presented at the American Association for Artificial Intelligence (AAAI) 1992 conference as one of the most Innovative Applications of Artificial Intelligence. PHAROS was also a medal winner in the 1992 British Computer Society Awards.


4.
5.
In this paper we consider a k-out-of-n: G system with repair under D-policy. Under this policy, whenever the workload exceeds a threshold D, a server is called for repair and repairs failed units one at a time; the server is sent back as soon as all the failed units are repaired. Repaired units are assumed to be as good as new. The repair time and failure time distributions are assumed to be exponential. We obtain the system state distribution, system reliability, expected length of time the server is continuously available, expected number of times the system is down in a cycle, and several other measures of performance. We compute the optimal value of D, which maximizes a suitably defined cost function. Scope and purpose: This paper considers a repair policy, called the D-policy, for a k-out-of-n: G system. In a k-out-of-n: G system, the system functions as long as there are at least k operational units. The server activation cost is high once the server becomes idle because all failed units have been repaired; hence it is activated only when the accumulated amount of work reaches D. This paper examines the optimal value of D by bringing in costs such as the cost of the system being down and the server activation cost. Activating the server the moment the first failure takes place may involve a very heavy fixed cost per cycle (a cycle is the duration from a point at which the server becomes idle to the next epoch at which it becomes idle after being activated). The other extreme, activating the server only after n-k+1 units fail, leaves the system down for a long duration in each cycle; hence the need for the optimal D-policy. A brief account of k-out-of-n: G systems can be found in Ross (Ross, S.M., Introduction to Probability Models, 6th ed., Academic Press, New York, 1997). The results obtained here find direct applications in reliability engineering, production systems, satellite communication, etc.
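For a quick feel of the k-out-of-n: G model above: given independent units, each operational with probability p, the probability that the system is up is a binomial tail sum. A minimal sketch (the function name and the illustrative numbers are ours, not the paper's):

```python
from math import comb

def k_out_of_n_reliability(k, n, p):
    """P(at least k of n independent units are up), each up with probability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# a 2-out-of-3:G system with unit reliability 0.9
r = k_out_of_n_reliability(2, 3, 0.9)   # 3*0.81*0.1 + 0.729 = 0.972
```

The paper's contribution is the repair-policy analysis on top of such a model, not this static formula.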

6.
The notion of confluence is studied in the context of bigraphs. Confluence will be important in modelling real-world systems, both natural (as in biology) and artificial (as in pervasive computing). The paper uses bigraphs in which names have multiple locality; this enables a formulation of the lambda calculus with explicit substitutions. The paper reports work in progress, seeking conditions on a bigraphical reactive system that are sufficient to ensure confluence; the conditions must deal with the way that bigraphical redexes can be intricately intertwined. The conditions should also be satisfied by the lambda calculus. After discussion of these issues, two conjectures are put forward.

7.
This paper presents an approach for localization using geometric features from a 360° laser range finder and a monocular vision system. Its practicability under conditions of continuous localization during motion in real time (referred to as on-the-fly localization) is investigated in large-scale experiments. The features are infinite horizontal lines for the laser and vertical lines for the camera. They are extracted using physically well-grounded models for all sensors and passed to a Kalman filter for fusion and position estimation. Positioning accuracy close to subcentimeter has been achieved with an environment model requiring 30 bytes/m2. Already with a moderate number of matched features, the vision information was found to further increase this precision, particularly in the orientation. The results were obtained with a fully self-contained system where extensive tests with an overall length of more than 6.4 km and 150,000 localization cycles have been conducted. The final testbed for this localization system was the Computer 2000 event, an annual computer tradeshow in Lausanne, Switzerland, where during 4 days visitors could give high-level navigation commands to the robot via a web interface. This gave us the opportunity to obtain results on long-term reliability and verify the practicability of the approach under application-like conditions. Furthermore, general aspects and limitations of multisensor on-the-fly localization are discussed.
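The fusion step the abstract describes relies on a Kalman filter. The system's filter operates on line features, but the underlying measurement update has the same shape as this scalar sketch (the function name and example values are illustrative assumptions, not the system's actual code):

```python
def kalman_update(x, P, z, R):
    """Scalar Kalman measurement update: prior estimate x with variance P,
    fused with measurement z whose noise variance is R."""
    K = P / (P + R)        # Kalman gain: how much to trust the measurement
    x = x + K * (z - x)    # corrected estimate
    P = (1 - K) * P        # uncertainty shrinks after fusion
    return x, P

# fuse a prior heading estimate (variance 0.04) with a more precise
# vision-based measurement (variance 0.01)
x, P = kalman_update(0.0, 0.04, 0.02, 0.01)
# → x = 0.016, P = 0.008: the estimate moves toward the measurement
```

This also illustrates why the vision lines "further increase precision, particularly in orientation": each matched feature is another update that shrinks P.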

8.
This volume contains the Proceedings of the Second Workshop on Coalgebraic Methods in Computer Science (CMCS'99). The Workshop was held in Amsterdam, The Netherlands, on March 20 and 21, 1999, as a satellite event to the European Joint Conferences on Theory and Practice of Software (ETAPS'99). The first CMCS workshop was held one year earlier as a satellite event to ETAPS'98.

9.
The usual implementation of real numbers in today's computers as floating point numbers has the well-known deficiency that most numbers can only be represented up to some fixed accuracy. Consequently, even the basic arithmetic operations cannot be performed exactly, leading to the ubiquitous round-off errors. This is a serious problem in all disciplines where high-accuracy calculations are required. One of the ultimate goals of theoretical and practical research in this area is to overcome these problems by improving the present implementations and algorithms or providing alternatives. This volume contains the proceedings of the Workshop on Real Number Computation, held on June 19-20, 1998, in Indianapolis, Indiana, USA. The workshop preceded the IEEE Symposium on Logic in Computer Science (LICS). The meeting aimed to present an introduction to the interdisciplinary area of real number computation. The subject is understood in a broad sense and covers various fields such as recursion theory, interval analysis, computer arithmetic, semantics of programming languages, number theory, and numerical analysis. The workshop was meant to provide researchers from these different communities an opportunity to meet and exchange ideas. More than 30 participants from eight countries made this workshop a successful event. Contributions to the proceedings were invited from all participants after the workshop and were sent to referees. Comments and suggestions were forwarded to the authors, who thus had a chance to amend their texts. The final versions were accepted by the editors. We would like to take this opportunity to thank all referees for their time and cooperation, and Michael Mislove for his invitation to publish the proceedings in the ENTCS series. 23rd August 1999. Abbas Edalat, David Matula and Philipp Sünderhauf, Guest Editors.
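The round-off problem motivating the workshop is easy to demonstrate, and exact rational arithmetic is one of the alternatives alluded to. A small illustration in standard Python, not tied to any particular workshop contribution:

```python
from fractions import Fraction

# classic round-off: 0.1 and 0.2 have no exact binary representation,
# so their floating point sum is not exactly 0.3
inexact = (0.1 + 0.2 == 0.3)                                    # False

# exact rational arithmetic sidesteps the problem entirely
exact = (Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))  # True
```

Exact representations trade speed and memory for correctness, which is precisely the design space the workshop fields (interval analysis, computer arithmetic, etc.) explore.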

10.
Orchestration is an approach to Technology Enhanced Learning that emphasizes attention to the challenges of classroom use of technology, with a particular focus on supporting teachers' roles. The present collection of papers on orchestration highlights broad agreement that classrooms are variable and complex and that teachers have an important role in adapting materials for use in their own classrooms. The synthesis also shows a difference of opinions in how useful "orchestration" is as a metaphor, the proper scope of issues to include when studying orchestration, and how to approach design. Despite the lack of consensus, orchestration is a timely and important shift of focus and all of the approaches merit further exploration. The field shows healthy self-criticism and debate, which is the hallmark of fields with the potential for great progress.

11.
Lexical collocations have particular statistical distributions. We have developed a set of statistical techniques for retrieving and identifying collocations from large textual corpora. The techniques we developed are able to identify collocations of arbitrary length as well as flexible collocations. These techniques have been implemented in a lexicographic tool, Xtract, which is able to automatically acquire collocations with high retrieval performance. Xtract works in three stages. The first stage is based on a statistical technique for identifying word pairs involved in a syntactic relation. The words can appear in the text in any order and can be separated by an arbitrary number of other words. The second stage is based on a technique to extract n-word collocations (or n-grams) in a much simpler way than related methods. These collocations can involve closed-class words such as particles and prepositions. A third stage is then applied to the output of stage one and applies parsing techniques to sentences involving a given word pair in order to identify the proper syntactic relation between the two words. A secondary effect of the third stage is to filter out a number of candidate collocations as irrelevant and thus produce higher quality output. In this paper we present an overview of Xtract and we describe several uses for Xtract and the knowledge it retrieves, such as language generation and machine translation. Frank Smadja is in the Department of Computer Science at Columbia University and has been working on lexical collocations for his doctoral thesis.
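The first Xtract stage pairs words that co-occur within a small window, in any order and with other words between them, before filtering the pairs statistically. A deliberately simplified sketch of that counting step (the function and the toy example are ours; Xtract's actual statistics are considerably more elaborate):

```python
from collections import Counter

def windowed_pairs(tokens, window=5):
    """Count unordered word pairs that co-occur within `window` tokens,
    a simplification of Xtract's first-stage pairing."""
    counts = Counter()
    for i, w in enumerate(tokens):
        for v in tokens[i + 1:i + window]:
            counts[tuple(sorted((w, v)))] += 1
    return counts

tokens = "strong tea and strong coffee but powerful computer".split()
pairs = windowed_pairs(tokens, window=3)
# ("strong", "tea") co-occurs twice within the window
```

Xtract then keeps only pairs whose co-occurrence frequency is statistically anomalous, which is what separates true collocations like "strong tea" from chance adjacency.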

12.
The need to avoid redundant efforts in software development has been recognized for a long time. Currently, work is focused on the generation of products that are designed to be reused. A reference architecture for robot teleoperation systems has been developed using the domain-engineering process and certain architectural patterns. The architecture has been applied successfully for the development of different teleoperation platforms used in the maintenance activities of nuclear power plants. In particular, this paper presents how the reference architecture has been implemented in different systems, such as the Remotely Operated Service Arm (ROSA), the Teleoperated and Robotized System for Maintenance Operation in Nuclear Power Plants Vessels (TRON) and the Inspection Retrieving Vehicle (IRV).

13.
The primary aim of this contribution is to provide an editorial introduction to this Special Issue of Machine Translation dedicated to Evaluation. The intention is to describe the rationale for the Issue, outline the various contributions of the papers in this issue, and try to situate them in a wider context. As part of providing this wider context, we give an overview and assessment of the main current approaches to evaluation of Natural Language Processing, and especially Machine Translation systems.

14.
The theory of reinforcement learning (RL) was originally motivated by animal learning of sequential behavior, but has been developed and extended in the field of machine learning as an approach to Markov decision processes. Recently, a number of neuroscience studies have suggested a relationship between reward-related activities in the brain and functions necessary for RL. After touching on the history of RL, we introduce in this article the theory of RL and present two engineering applications. We then discuss possible implementations in the brain.
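The MDP formulation of RL mentioned above is most easily seen in tabular Q-learning, one standard algorithm in this family. The update rule is the textbook one; the toy state space is our illustration, not from the article:

```python
def q_learning_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.9):
    """One tabular Q-learning step: nudge Q(s, a) toward the observed
    reward plus the discounted best action value of the next state."""
    best_next = max(Q[s_next].values()) if Q[s_next] else 0.0
    Q[s][a] += alpha * (r + gamma * best_next - Q[s][a])

# toy two-state problem: reward 1 for choosing "right" in state 0
Q = {0: {"left": 0.0, "right": 0.0}, 1: {"left": 0.0, "right": 0.0}}
q_learning_update(Q, 0, "right", 1.0, 1)
# after one step, Q[0]["right"] = 0.1 * (1.0 + 0.9 * 0.0) = 0.1
```

The reward-prediction error inside the parentheses, r + γ·max Q(s',·) − Q(s, a), is exactly the quantity the neuroscience studies relate to dopaminergic activity.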

15.
Computing and Visualization in Science - In the original publication of the article, the authorship was published incorrectly. The correct authorship is given in this correction.

16.
After a brief review and classification of known generators, this Part I of a two-part paper carefully investigates a new programmable generator of general ordinary renewal processes in discrete time. The generator is based on the concept of failure rate, and its implementation is completely digital. The effects of time and amplitude quantization are analyzed in detail. Statistical tests, checking distribution and independence, are partly developed and then applied to numerous examples; the results are satisfactory. The distributions of the generated random time intervals are not truncated. Thereafter the generator of renewal processes is extended to allow the hardware simulation of general semi-Markov processes in discrete time. (The generation of related processes will be considered in Part II, together with applications.)
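The failure-rate construction the generator is based on can be sketched in software: at each discrete step t, the unit fails with probability given by a hazard function h(t). This is only a conceptual sketch of the principle, not the paper's digital hardware design:

```python
import random

def renewal_interval(hazard):
    """Draw one discrete-time renewal interval: at step t the unit
    fails with probability hazard(t), so the interval is the first
    step at which a uniform draw falls below the hazard."""
    t = 1
    while random.random() >= hazard(t):
        t += 1
    return t

# constant hazard 0.25 gives geometric intervals with mean 1/0.25 = 4
random.seed(0)
samples = [renewal_interval(lambda t: 0.25) for _ in range(10000)]
mean = sum(samples) / len(samples)
```

A time-varying hazard yields arbitrary discrete lifetime distributions, which is what makes the failure-rate formulation a natural basis for a programmable generator.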

17.

19.
e-ducation: research and practice   (Cited by: 1; self-citations: 0; by others: 1)
This paper proposes an integrated approach for information technology in an educational context. It suggests a framework for the design of computer-assisted learning activities, called e-ducation. The framework captures three contemporary, interrelated aspects of teaching and learning and is more pedagogical than analytical. The three aspects covered in the e-ducation framework are electronic, engaged and empowered. An implementation of the framework is used to illustrate how e-ducation can be applied in educational research and practice. The paper concludes that the e-ducation framework contributes to both educational research and educational practice.

20.
Multimedia Tools and Applications - In the original publication, Fig. 12 was incorrectly presented. The plot line and legends of Fig. 12a, c, e and f should not overlap. The original article was...
