Similar Literature
20 similar documents found (search time: 156 ms)
1.
Enterprise modelling and information systems modelling have traditionally utilized techniques developed in the earlier disciplines of systems analysis and operational analysis. However, these tools have proved insufficient even for information systems modelling, and their inadequacies make them less than ideal for enterprise modelling. Furthermore, it has proved difficult to integrate the techniques into a uniform framework representation. Extensive research since the early 1980s has produced support tools for information systems engineering in the academic sector, but these have generally failed to reach widespread commercial use. Commercially developed integrated support systems aimed at enterprise modelling and information system modelling rely on traditional techniques and lack formalism. The purpose of this paper is to introduce a technique which overcomes the major inadequacies and which provides an integrating framework to represent both the information and the processing at enterprise and systems modelling levels of abstraction. The technique is based on the use of a transition network, extended to represent enterprise and system models in a meaningful way. The use of an intelligent repository, with associated processing of the formally defined requirements, specification and design statements, is novel and provides the additional support needed to make the technique amenable to handling the design process from informal to formal specification. The use of a graphical user interface, linked directly to the deductive system and repository, ensures intuitive ease of use.
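To make the central abstraction concrete, below is a minimal, generic transition-network sketch in Python. It is not the paper's extended formalism (the intelligent repository and deductive processing are omitted); the states, labels and the `accepts` check are illustrative assumptions.

```python
# A minimal, generic transition network: states connected by labelled
# transitions, with a simple replay check over an event sequence.

class TransitionNetwork:
    def __init__(self):
        self.transitions = {}  # (state, label) -> next state

    def add(self, state, label, target):
        self.transitions[(state, label)] = target

    def accepts(self, start, events, finals):
        """Replay a sequence of events; True if it ends in a final state."""
        state = start
        for e in events:
            key = (state, e)
            if key not in self.transitions:
                return False
            state = self.transitions[key]
        return state in finals

# Toy enterprise process: an order moves from receipt to dispatch.
net = TransitionNetwork()
net.add("received", "approve", "approved")
net.add("approved", "pick", "picked")
net.add("picked", "ship", "dispatched")
print(net.accepts("received", ["approve", "pick", "ship"], {"dispatched"}))  # True
```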

2.
An object-oriented framework for building computer based modelling tools for water resource planning is presented. The focus is on building a program for flood calculations in river systems with several reservoirs and water transfer structures. The foundation for the flood model is a general application framework for building hydrological modelling tools. The general framework provides the user with tools for describing the structural components of the hydrological system, their relation in the system topology and controlling the behaviour of the system during simulation. Hydrological models are often data intensive, and the framework is equipped with tools to handle both time series and spatially distributed data efficiently. During the development, effort has been put into supporting future changes and extensions to the model system, as well as creating sound reusable components that will benefit future development and maintenance. A flood modelling application in the Norwegian river Gudbrandsdalslågen is described to illustrate the use of the toolkit.
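A toy sketch of the kind of object-oriented structure described, assuming nothing about the actual framework's API: components share a common interface, are wired into a topology, and are stepped through a simulation. All class and method names (`Component`, `step`, `downstream`) are invented for illustration.

```python
class Component:
    """Common interface for structural components of the hydrological system."""
    def __init__(self, name):
        self.name = name
        self.downstream = None  # system topology: who receives our outflow

    def step(self, inflow, dt):
        raise NotImplementedError

class Reservoir(Component):
    def __init__(self, name, storage, max_release):
        super().__init__(name)
        self.storage = storage
        self.max_release = max_release

    def step(self, inflow, dt):
        self.storage += inflow * dt
        release = min(self.max_release, self.storage / dt)
        self.storage -= release * dt
        return release

class Reach(Component):
    """A river reach that simply routes flow with a fixed loss factor."""
    def __init__(self, name, loss=0.05):
        super().__init__(name)
        self.loss = loss

    def step(self, inflow, dt):
        return inflow * (1.0 - self.loss)

def simulate(source_inflows, head, dt=1.0):
    """Push each inflow through the chain of components."""
    outflows = []
    for q in source_inflows:
        node, flow = head, q
        while node is not None:
            flow = node.step(flow, dt)
            node = node.downstream
        outflows.append(flow)
    return outflows

res = Reservoir("Upper", storage=100.0, max_release=20.0)
reach = Reach("Lower reach")
res.downstream = reach
print(simulate([50.0, 80.0, 10.0], res))
```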

3.
Our research moves from three fundamental considerations that concern the modelling and engineering of complex systems. First, organization, coordination and security are strictly related issues that should be modelled in a uniform and coherent framework. Second, models, technologies and methodologies should come hand in hand, so that abstractions used in the analysis and design stages are still “alive and kicking” at development and execution time. Third, the general non-formalisability of complex systems should not prevent us from using formal tools whenever useful, such as in proving or ensuring properties of limited but meaningful portions of a system.

By focussing on multi-agent systems, we discuss the notion of Agent Coordination Context (ACC) as an abstraction that (i) works as an organization and security abstraction, (ii) integrates well with abstractions provided by coordination infrastructures, and (iii) covers the engineering process from design to deployment. In particular, in this paper we study the syntax and semantics of a language for ACC specification, exploiting typical process algebra techniques. Accordingly, we show that process algebras are a suitable tool for both the specification and enactment of security and coordination policies through ACCs.

4.
This paper proposes an integrated modelling framework for the analysis of manufacturing systems that can increase the capacity of modelling tools to rapidly create a structured database with multiple levels of detail, and thus obtain key performance indicators (KPIs) that highlight possible areas for improvement. The method combines five important concepts: hierarchical structure, quantitative/qualitative analysis, data modelling, a manufacturing database and performance indicators. It makes it possible to build a full information model of the manufacturing system, from the shopfloor functional structure down to the basic production activities (operations, transport, inspection, etc.). The proposed method is based on a modified IDEF model that stores all kinds of quantitative and qualitative information. A computer-based support tool has been developed to connect with the IDEF model, automatically creating a relational database through a set of algorithms. This manufacturing data warehouse is oriented towards obtaining a rapid global vision of the system through multiple indicators. The developed tool has been provided with different scorecard panels that use KPIs to decide the best actions for continuous improvement. To demonstrate and validate both the proposed method and the developed tools, a case study has been carried out on a complex manufacturing system.
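To illustrate the activity-model-to-relational-database step, the sketch below loads a tiny hierarchical activity table with Python's sqlite3 and derives one KPI from it. The table layout, column names and KPI are hypothetical and do not reproduce the paper's IDEF-derived schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE activity (
    id INTEGER PRIMARY KEY, parent INTEGER, name TEXT,
    kind TEXT, cycle_time REAL)""")

# Hierarchical activity model: a cell (function) with three activities.
rows = [
    (1, None, "Machining cell", "function", None),
    (2, 1, "Turn shaft", "operation", 4.0),
    (3, 1, "Move to inspection", "transport", 1.5),
    (4, 1, "Inspect diameter", "inspection", 2.0),
]
con.executemany("INSERT INTO activity VALUES (?,?,?,?,?)", rows)

# Example KPI: share of total cycle time spent on value-adding operations.
total, = con.execute(
    "SELECT SUM(cycle_time) FROM activity WHERE cycle_time IS NOT NULL").fetchone()
ops, = con.execute(
    "SELECT SUM(cycle_time) FROM activity WHERE kind = 'operation'").fetchone()
print(f"value-adding ratio: {ops / total:.0%}")  # 53%
```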

5.
The capability to model products to be designed and manufactured plays an important role in constructing and utilizing CAD/CAM systems effectively. Product models should represent all the information about products that is utilized in manufacturing processes. They must therefore describe the functional structures of machine products and include not only geometric information but also various non-geometric data, such as physical, technological and management data. At present there appear to be no definite methods or theories for constructing product models. In this paper, we first investigate the whole manufacturing process and propose a system structure for the integration of CAD/CAM, in which product modelling plays a fundamental role. Requirements for product modelling are then studied thoroughly, and a new representation framework for product models is proposed. It consists of an object concept called a frame, relations among frames, and attributes, and it can incorporate existing modelling capabilities such as solid modelling. We use this representation framework in combination with our solid modelling package GEOMAP-III, and show the effectiveness of this approach by performing illustrative design experiments.
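A minimal sketch of what a frame-based product representation could look like: frames carry attributes (geometric and non-geometric) and named relations to other frames. This is a generic frame system written in Python, not the paper's actual GEOMAP-III-linked format.

```python
class Frame:
    def __init__(self, name, **attributes):
        self.name = name
        self.attributes = attributes     # e.g. material, tolerance, cost
        self.relations = {}              # relation name -> list of frames

    def relate(self, relation, other):
        self.relations.setdefault(relation, []).append(other)

shaft = Frame("shaft", material="steel", diameter_mm=20.0, tolerance_mm=0.01)
bearing = Frame("bearing", material="bronze", bore_mm=20.0)
assembly = Frame("spindle_assembly", function="transmit torque")

assembly.relate("has_part", shaft)
assembly.relate("has_part", bearing)
shaft.relate("fits_into", bearing)

for part in assembly.relations["has_part"]:
    print(part.name, part.attributes)
```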

6.
Coloured Petri Nets (CPNs) are a graphically oriented modelling language for concurrent systems based on Petri Nets and the functional programming language Standard ML. Petri Nets provide the primitives for modelling concurrency and synchronisation. Standard ML provides the primitives for modelling data manipulation and for creating compact and parameterisable CPN models.

Functional programming and Standard ML have played a major role in the development of CPNs and the CPN computer tools supporting modelling, simulation, verification, and performance analysis of concurrent systems. At the modelling language level, Standard ML has extended Petri Nets with the practical expressiveness required for modelling systems of the size and complexity found in typical industrial projects. At the implementation level, Standard ML has been used to implement the formal semantics of CPNs that provide the theoretical foundation of the CPN computer tools.

This paper provides an overview of how functional programming and Standard ML are applied in the CPN modelling language and the supporting computer tools. We give a detailed presentation of the key algorithms and techniques used for implementing the formal semantics of CPNs, and we survey a number of case studies where CPNs have been used for the design and analysis of systems. We also demonstrate how the use of a Standard ML programming environment has allowed Petri Nets to be used for the implementation of systems.
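As a rough intuition for coloured tokens, guards and firing, here is a toy single-transition step in Python (the real CPN tools embed Standard ML and support full nets, bindings and analysis; everything below is a simplification).

```python
from collections import Counter

# Places hold multisets of coloured (typed) tokens.
places = {
    "waiting": Counter({("job", 1): 1, ("job", 2): 1}),
    "done": Counter(),
}

def fire_process(token):
    """Transition: move a job token with an odd id from waiting to done."""
    (kind, n) = token
    if kind == "job" and n % 2 == 1:        # the guard
        places["waiting"][token] -= 1
        places["waiting"] += Counter()      # drop zero-count entries
        places["done"][(kind, n)] += 1
        return True
    return False

for tok in list(places["waiting"]):
    fire_process(tok)
print(dict(places["waiting"]), dict(places["done"]))
# {('job', 2): 1} {('job', 1): 1}
```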

7.
Successful software development is becoming increasingly important to many companies. However, most projects fail to meet their targets, highlighting the inadequacies of traditional project management techniques in this unique setting. Despite breakthroughs in software engineering, management methodologies have not improved, and the major opportunities for better results now lie in this area. Poor strategic management and related human factors have been cited as a major cause of failures, yet traditional techniques cannot incorporate them explicitly. System dynamics (SD) aims to model the behaviour of complex socio-economic systems, and there have been a number of applications to software project management. SD provides an alternative view in which the major project influences are considered and quantified explicitly. Grounded in a holistic perspective, it avoids the level of detail required by traditional tools, looking instead at the key aspects of general project behaviour. However, if SD is to play a key role in software project management it needs to be embedded within the traditional decision-making framework. The authors developed a conceptual integrated model, the SYDPIM, which has been tested and improved within a large ongoing software project. This framework specifies the roles of SD models, how they are to be used within the traditional management process, how they exchange information with the traditional models, and a general method to support model development.
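For readers unfamiliar with system dynamics, the sketch below shows a deliberately tiny stock-and-flow project model in Python: remaining work is a stock drained by a completion flow whose rate is dampened as schedule pressure induces error and rework. It is not the SYDPIM model; all parameters and the pressure-to-error relation are invented.

```python
def simulate(tasks=100.0, staff=5.0, nominal_rate=1.0,
             deadline=25.0, dt=1.0):
    t, done = 0.0, 0.0
    while done < tasks and t < 200:
        remaining = tasks - done
        pressure = remaining / max(deadline - t, 1.0)   # tasks/week still owed
        error_fraction = min(0.5, 0.05 * pressure)      # pressure breeds rework
        rate = staff * nominal_rate * (1.0 - error_fraction)
        done += rate * dt
        t += dt
    return t

print(f"finished in {simulate():.0f} weeks (deadline was 25)")
```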

8.
Comprehensive and elaborate systems analysis techniques have been developed in the past for routine and operational information systems. Developing support systems for organizational decision-making requires new tools and methodologies. We present a new framework for data collection and decision analysis which is useful for developing decision support systems. This task analysis methodology encompasses (1) event analysis, (2) participant analysis, and (3) decision content analysis. With a proper coding manual, it provides a framework for collecting the relevant and detailed information required for decision support design and implementation. Further research is suggested for the application and evaluation of the methodology in real-life DSS environments.

9.
10.
The widespread use of embedded systems requires the creation of industrial software technology that will make it possible to engineer systems that are correct by construction. This can be achieved through the use of validated (trusted) components, verification of design models, and automatic configuration of applications from validated design models and trusted components. This design philosophy has been instrumental in developing COMDES—a component-based framework for distributed embedded control systems. A COMDES application is conceived as a network of embedded actors that are configured from instances of reusable, executable components—function blocks (FBs). System actors operate in accordance with a timed multitasking model of computation, whereby I/O signals are exchanged with the controlled plant at precisely specified time instants, resulting in the elimination of I/O jitter. The paper presents an analysis technique that can be used to validate COMDES design models in SIMULINK. It is based on a transformation of the COMDES design model into a SIMULINK analysis model, which preserves the functional and timing behaviour of the application. This technique has been employed to develop a feasible (light-weight) analysis method based on runtime observers. The latter are conceived as special-purpose actors running in parallel with the application actors, checking system properties specified in Linear Temporal Logic. Observers are configured from reusable FBs that can be exported to SIMULINK in the same way as application components, making it possible to analyze system properties via simulation. The discussion is illustrated with an industrial case study—a Medical Ventilator Control System—which has been used to validate the developed design and analysis methods.
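The runtime-observer idea can be illustrated in a few lines of Python: an observer replays (or, in a real system, watches) the application's state trace and flags the first step violating a safety property, here roughly G(inspiration → valve_open). The COMDES/SIMULINK machinery and actual LTL translation are omitted, and the property and signal names are invented.

```python
def observer(trace):
    """Run alongside the application; flag the first violating step."""
    for step, state in enumerate(trace):
        if state["phase"] == "inspiration" and not state["valve_open"]:
            return f"violation at step {step}: {state}"
    return "property held on this trace"

trace = [
    {"phase": "expiration", "valve_open": False},
    {"phase": "inspiration", "valve_open": True},
    {"phase": "inspiration", "valve_open": False},   # injected fault
]
print(observer(trace))
```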

11.
The attack on September 11, 2001 set off numerous efforts to counter terrorism and insurgencies. Central to these efforts has been the drive to improve data collection and analysis. Section 1 summarizes some of the more notable improvements among U.S. government agencies as they strive to develop their capabilities. Although progress has been made, daunting challenges remain. Section 2 reviews the basic challenges to data collection and analysis, focusing in some depth on the difficulties of data integration. Three general approaches to data integration are identified—discipline-centric, place-centric and virtual. A summary of the major challenges in data integration confronting field operators in Iraq and Afghanistan illustrates the work that lies ahead. Section 3 shifts gears to focus on the future and introduces the discipline of Visual Analytics—an emerging field dedicated to improving data collection and analysis through the use of computer-mediated visualization techniques and tools. The purpose of Visual Analytics is to maximize the human capability to perceive, understand, reason, make judgments and work collaboratively with multidimensional, conflicting, and dynamic data. The paper concludes with two excellent examples of analytic software platforms that have been developed for the intelligence community—Palantir and ORA. They signal the progress made in the field of Visual Analytics to date and illustrate the opportunities that await other IS researchers interested in applying their knowledge and skills to the tracking and disruption of dark networks.

12.
The contributors to this special issue focus on socio-technical and soft approaches to information requirements elicitation and systems development. They represent a growing body of research and practice in this field. This review presents an overview and analysis of the salient themes within the papers, encompassing their common underlying framework, the methodologies, tools and techniques presented, the organisational situations in which they are deployed, and the issues they seek to address. It will be argued in the review that the contributions to this special issue exemplify the ‘post-methodological era’ and the ‘contingency approaches’ from which it is formed.

13.
This paper describes the general principles and example results of a new software tool being developed for physiologically-based modelling of biomedical systems within a multidisciplinary framework. The aim is to overcome some limitations of currently available software designed either for general-purpose or for highly specialised modelling applications. General-purpose tools usually impose explicit coding of mathematical model equations or non-intuitive system representations, whereas specialised software uses domain-specific notations that allow efficient and convenient model building only for special classes of systems. The present study pursues an intuitive representation of various, possibly interacting, types of biological systems described as interconnected physical components, such as mass and energy storage elements, active and passive transport, or biochemical transformations. The presented software automatically generates the mathematical model equations, which can be coded in different formats. This allows interoperability with other existing software, e.g. for numerical simulation, symbolic analysis or text processing. A multi-domain structural language has been defined for an intuitive, hierarchical and self-explanatory specification of physiological models. The proposed strategies may become useful for the dissemination and integration of multidisciplinary modelling knowledge.
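To suggest how equations can be generated automatically from a component description, the sketch below derives one balance equation per storage element from the transports attached to it. The component kinds, naming and output format are assumptions, not the tool's actual multi-domain language.

```python
def generate_equations(storages, transports):
    """transports: list of (name, source, target, expression)."""
    eqs = []
    for s in storages:
        inflow = [expr for (_, _, dst, expr) in transports if dst == s]
        outflow = [expr for (_, src, _, expr) in transports if src == s]
        rhs = " + ".join(inflow) or "0"
        if outflow:
            rhs += " - (" + " + ".join(outflow) + ")"
        eqs.append(f"d{s}/dt = {rhs}")
    return eqs

# Toy two-compartment glucose model built from transport components.
storages = ["glucose_plasma", "glucose_tissue"]
transports = [
    ("absorption", "gut", "glucose_plasma", "k_abs*D"),
    ("uptake", "glucose_plasma", "glucose_tissue", "k_u*glucose_plasma"),
    ("utilisation", "glucose_tissue", "sink", "k_m*glucose_tissue"),
]
for eq in generate_equations(storages, transports):
    print(eq)
# dglucose_plasma/dt = k_abs*D - (k_u*glucose_plasma)
# dglucose_tissue/dt = k_u*glucose_plasma - (k_m*glucose_tissue)
```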

14.
This interdisciplinary research is based on the application of unsupervised connectionist architectures in conjunction with modelling systems, and on determining the optimal operating conditions of a new high-precision industrial process known as laser milling. Laser milling is a relatively new micro-manufacturing technique in the production of high-value industrial components. The industrial problem is defined by a data set relayed through standard sensors situated on a laser-milling centre, which is a machine tool for manufacturing high-value micro-moulds, micro-dies and micro-tools. The new three-phase industrial system presented in this study is capable of identifying a model for the laser-milling process based on low-order models. The first two steps are based on the use of unsupervised connectionist models. The first step involves the analysis of the data sets that define each case study, to identify whether they are informative enough or whether the experiments have to be performed again. In the second step, a feature selection phase determines the main variables to be processed in the third step. In this last step, the study provides a model for the laser-milling procedure based on low-order models, such as black-box models, in order to approximate the optimal form of the laser-milling process. The three-step model has been tested with real data obtained for three different materials: aluminium, copper and hardened steel. These three materials are used in the manufacture of micro-moulds, micro-coolers and micro-dies, high-value tools for the medical and automotive industries among others. As the model inputs are standard data provided by the laser-milling centre, the industrial implementation of the model is immediate. Thus, this study demonstrates how a high-precision industrial process can be improved using a combination of artificial intelligence and identification techniques.
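A schematic, numpy-only rendering of the three phases: (1) check that the data set is informative, (2) keep the most relevant inputs, (3) fit a low-order black-box model. The paper uses unsupervised connectionist models for the first two phases; plain variance and correlation checks stand in for them here, and all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))            # e.g. power, speed, frequency, noise
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=200)

# Phase 1: is the data informative? (crude proxy: no near-constant inputs)
informative = np.all(X.std(axis=0) > 1e-3)
assert informative, "re-run the experiments"

# Phase 2: feature selection by absolute correlation with the target.
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
keep = np.argsort(corr)[-2:]             # two strongest inputs

# Phase 3: low-order (here: linear) black-box model by least squares.
A = np.column_stack([X[:, keep], np.ones(len(y))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("selected inputs:", keep, "coefficients:", np.round(coef, 2))
```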

15.
Recent developments in computer hardware have brought the more widespread emergence of shared-memory, multi-core systems. These architectures offer opportunities to speed up various tasks—model checking and reachability analysis among others. In this paper, we present a design for a parallel shared-memory LTL model checker that is based on a distributed-memory algorithm. To improve the scalability of our tool, we have devised a number of implementation techniques which we present in this paper. We also report on a number of experiments conducted to analyse the behaviour of our tool under different conditions using various models. We demonstrate that our tool exhibits significant speedup in comparison with sequential tools, which improves the workflow of verification in general.
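The core shared-memory ingredient—worker threads exploring the state space against a shared visited set—can be sketched as below. CPython threads will not show real speedup here because of the GIL, so this illustrates only the structure, not the performance result; the toy transition relation is invented.

```python
import threading, queue

def successors(state):
    """Toy transition relation: a counter that can +1 or *2, capped."""
    return [s for s in (state + 1, state * 2) if s <= 1000]

visited, visited_lock = set(), threading.Lock()
work = queue.Queue()
work.put(1)

def worker():
    while True:
        try:
            s = work.get(timeout=0.1)
        except queue.Empty:
            return
        for t in successors(s):
            with visited_lock:              # atomic test-and-set on visited
                if t in visited:
                    continue
                visited.add(t)
            work.put(t)
        work.task_done()

threads = [threading.Thread(target=worker) for _ in range(4)]
for th in threads: th.start()
for th in threads: th.join()
print(len(visited), "states reached")
```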

16.
Component technology is increasingly used to develop modular, configurable, and reusable systems. The problem of designing and implementing component-based systems is addressed by many models, methodologies, tools, and frameworks. By contrast, analysis and testing are not yet adequately supported. In general, a coherent fault taxonomy is a key starting point for providing techniques and methods for assessing the quality of software, and in particular of component-based systems. This paper proposes a fault taxonomy to be used to develop and evaluate testing and analysis techniques for component-based software.

17.
The use of information systems in manufacturing applications has changed dramatically over the last few years. The design and implementation of somewhat dated relational databases has been replaced by the generation of information models that can simultaneously be used for the development of information systems and satisfy their integration requirements. Over the last ten years the authors have been involved in a series of research programmes focusing on the design and operation of flexible machining cells. The use of information systems has been a central theme and the enabling technology for a number of novel design concepts and operational strategies for such cells. The initial research was based on the utilization of relational databases to integrate a variety of modelling and design tools. However, the additional effort required to integrate such databases with manufacturing software tools, in the form of developing file translators, information gateways and interfaces, has led the authors to adopt a new approach. With this approach the information requirements are represented in a neutral format within a data model, using the formal data specification language developed by the Standard for the Exchange of Product model data (STEP) committee. This paper describes these changes in the design and implementation of information systems in manufacturing applications, and provides an initial view of future research requirements.

18.
Integration and control of intelligence in distributed manufacturing
The area of intelligent systems has generated a considerable amount of interest—occasionally verging on controversy—within both the research community and the industrial sector. This paper aims to present a unified framework for integrating the methods and techniques related to intelligent systems in the context of design and control of modern manufacturing systems. Particular emphasis is placed on the methodologies relevant to distributed processing over the Internet. Following presentation of a spectrum of intelligent techniques, a framework for integrated analysis of these techniques at different levels in the context of intelligent manufacturing systems is discussed. Integration of methods of artificial intelligence is investigated primarily along two dimensions: the manufacturing product life-cycle dimension, and the organizational complexity dimension. It is shown that at different stages of the product life-cycle, different intelligent and knowledge-oriented techniques are used, mainly because of the varied levels of complexity associated with those stages. Distribution of the system architecture or system control is the most important factor demanding the use of up-to-date distributed intelligence technologies. A tool set for web-enabled design of distributed intelligent systems is presented. Finally, the issue of intelligence control is addressed. It is argued that the dominant criterion according to which the level of intelligence is selected in technological tasks is the required precision of the resulting operation, related to the degree of generalization required by the particular task. The control of knowledge in higher-level tasks has to be executed with a strong involvement of the human component in the feedback loop. In order to facilitate human intervention, there is a need for readily available, user-transparent computing and telecommunications infrastructure. In its final part, the paper discusses currently emerging ubiquitous systems, which combine this type of infrastructure with new intelligent control systems based on a multi-sensory perception of the state of the controlled process and its environment, giving us tools to manage information in a way that is most natural and easy for the human operator.

19.
Integrated assessment and its inherent platform, integrated modelling, present an opportunity to synthesize diverse knowledge, data, methods and perspectives into an overarching framework to address complex environmental problems. However, to be successful for assessment or decision-making purposes, all salient dimensions of integrated modelling must be addressed with respect to its purpose and context. The key dimensions include: issues of concern; management options and governance arrangements; stakeholders; natural systems; human systems; spatial scales; temporal scales; disciplines; methods, models, tools and data; and sources and types of uncertainty. This paper aims to shed light on these ten dimensions and on how their integration fits into the four main phases of the integrated assessment process: scoping, problem framing and formulation, assessing options, and communicating findings. We provide examples of participatory processes and modelling tools that can be used to achieve integration.

20.
The environmental modelling community has developed many models with varying levels of complexity and functionality. Many of these have overlapping problem domains and very similar ‘science’, yet are not compatible with each other. The modelling community recognises the benefits of model exchange and reuse, but it is often perceived to be easier to (re)create a new model than to take an existing one and adapt it to new needs.

Many of these third-party models have been incorporated into the Agricultural Production Systems Simulator (APSIM), a farming systems modelling framework. Some of the issues encountered during this process were system boundary issues (the functional boundary between models and sub-models), mixed programming languages, differences in data semantics, and intellectual property and ownership.

This paper looks at these difficulties and how they were overcome. It explores some software development techniques that facilitated the process and discusses some guidelines that can not only make this process simpler but also move models towards framework independence.
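One technique that helps with such incorporations is a thin adapter that keeps the third-party science code intact and maps framework variable names and units onto it. The sketch below is hypothetical (APSIM's actual component interface is not shown); the class names, units and numbers are invented.

```python
class LegacySoilModel:
    """Third-party science code with its own units and call convention."""
    def update(self, rain_inches):
        return 0.3 * rain_inches          # runoff, inches

class SoilModelAdapter:
    """Framework-facing wrapper: mm in, mm out, framework naming."""
    def __init__(self):
        self._model = LegacySoilModel()

    def on_timestep(self, state):
        rain_mm = state["precipitation_mm"]
        runoff_in = self._model.update(rain_mm / 25.4)   # unit boundary
        state["runoff_mm"] = runoff_in * 25.4
        return state

print(SoilModelAdapter().on_timestep({"precipitation_mm": 10.0}))
# {'precipitation_mm': 10.0, 'runoff_mm': 3.0}
```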
