Similar Documents
20 similar documents found.
1.
Innovation is a critical factor in the success of industrial companies, and just as important is the need to get innovative products to market quickly. It is therefore important to talk about 'management of product development time': under this new paradigm, companies capable of 'mastering' development time will launch products spending only the planned time and resources, and at the right moment. This in turn gives the company higher market share and faster market penetration. The main objective of the project Acceleration of Innovative Ideas into the Market (AIM) is to provide means of stimulating the creation of innovative ideas in general, and specifically ideas for potential product/process improvements and for problem solving. These ideas are collected throughout the extended enterprise from the people involved with the products and processes; this knowledge is then developed into innovations on a project basis.

2.
Engineering design is a knowledge-intensive process that encompasses conceptual design, detailed design, engineering analysis, assembly design, process design, and performance evaluation. Each of these tasks involves various areas of knowledge and experience. The sharing of such knowledge and experience is critical to increasing the capacity for developing products and to increasing their quality. It is also critical to reducing the duration and cost of the development cycle. Accordingly, offering engineering designers various methods for retrieving engineering knowledge is one of the most important tasks in managing engineering knowledge.

This study develops a multi-layer reference design retrieval technology for engineering knowledge management to provide engineering designers with easy access to relevant design and related knowledge. The tasks performed in this research include (i) designing a multi-layer reference design retrieval process, (ii) developing techniques associated with multi-layer reference design retrieval technology, and (iii) implementing a multi-layer reference design retrieval mechanism. The retrieval process contains three main phases: 'customer requirement-based reference design retrieval', 'functional requirement-based reference design retrieval' and 'functional feature-based reference design retrieval'. This technology involves (1) customer requirement-based reference design retrieval, which involves a structured query model for customer requirements, a case-based representation of designed entities, a customer requirement-based index structure for historical design cases, and customer requirement-based case searching, matching and ranking mechanisms, (2) functional requirement-based reference design retrieval, which includes a structured query model for functional requirements, a functional requirement-based index structure for historical design cases, and functional requirement-based case searching, matching and ranking mechanisms, and (3) functional feature-based reference design retrieval, which comprises a binary code-based representation for functional features, an ART1 neural network for functional feature-based case clustering, and functional feature-based case ranking.
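As a rough sketch of what a requirement-based case searching, matching and ranking step might look like (the attribute names, similarity measure and weighting scheme below are invented and are not taken from the study), historical design cases can be scored against a structured query and returned in ranked order:

```python
# Hypothetical sketch of requirement-based case matching and ranking: each
# historical design case is described by attribute values, a structured query
# carries weighted target values, and cases are ranked by weighted similarity.

def attribute_similarity(query_value, case_value):
    """Return 1.0 for an exact match, 0.0 otherwise (categorical attributes)."""
    return 1.0 if query_value == case_value else 0.0

def rank_cases(query, cases):
    """Rank design cases by weighted similarity to the structured query.

    query: {attribute: (target_value, weight)}
    cases: {case_id: {attribute: value}}
    """
    total_weight = sum(weight for _, weight in query.values())
    scored = []
    for case_id, attributes in cases.items():
        score = sum(
            weight * attribute_similarity(target, attributes.get(attr))
            for attr, (target, weight) in query.items()
        ) / total_weight
        scored.append((case_id, score))
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

if __name__ == "__main__":
    cases = {
        "D-001": {"material": "aluminium", "load": "medium", "actuation": "hydraulic"},
        "D-002": {"material": "steel", "load": "high", "actuation": "hydraulic"},
    }
    query = {"material": ("steel", 0.5), "load": ("high", 0.3), "actuation": ("electric", 0.2)}
    print(rank_cases(query, cases))   # D-002 ranks first
```

The index structures and ART1-based clustering described in the abstract would presumably narrow the candidate set before any such ranking step.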


3.
The OVER Project was a collaboration between West Midlands Police, UK, and the Centre for Adaptive Systems and Psychology Division at the University of Sunderland. The Project was developed primarily to assist the Police with a high-volume crime, burglary from dwelling houses. The software system developed enables the trending of historical data, the testing of 'short term' hunches, and the development of 'medium term' and 'long term' strategies for burglary and crime reduction, based upon victim, offender, location and details of victimisations. The software utilises mapping and visualisation tools and is capable of a range of sophisticated predictions, tying together statistical techniques with theories from forensic psychology and criminology.

The statistical methods employed (including multi-dimensional scaling, binary logistic regression) and ‘data-mining’ technologies (including neural networks) are used to investigate the impact of the types of evidence available and to determine the causality in this domain. The final predictions on the likelihood of burglary are calculated by combining all of the varying sources of evidence into a Bayesian belief network. This network is embedded in the developed software system, which also performs data cleansing and data transformation for presentation to the developed algorithms.

It is important that the statistics and predictions derived from the software are interpretable by the intended users of the decision support system, namely Police sector managers, and this paper includes some of the design decisions based upon the forensic psychology and criminology literature, including the graphical representation of geographic data and the presentation of the results of analyses.
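The abstract does not give the structure of the belief network, but the general idea of fusing several sources of evidence into a single likelihood of burglary can be illustrated with a much simpler, naive-Bayes style combination (the prior, likelihood ratios and variable names below are invented):

```python
import math

# Illustrative only: a naive-Bayes style fusion of several evidence sources
# into a single probability, standing in for the more general Bayesian belief
# network described above.

def fuse_evidence(prior, likelihood_ratios):
    """Combine a prior probability with independent likelihood ratios.

    prior: prior probability of (re)burglary for the property
    likelihood_ratios: P(evidence | burglary) / P(evidence | no burglary)
    """
    log_odds = math.log(prior / (1.0 - prior))
    for lr in likelihood_ratios:
        log_odds += math.log(lr)
    odds = math.exp(log_odds)
    return odds / (1.0 + odds)

# Example: a modest prior, one strong indicator (recent nearby victimisation)
# and one weak counter-indicator.
print(round(fuse_evidence(0.05, [4.0, 0.8]), 3))
```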


4.
The project described is concerned with the development of a practical system for interpreting carotid angiograms. The general requirements of such a system are identified and some of the design considerations are discussed. The analysis strategies and types of knowledge used by an expert in recognizing and naming vessels and categorizing abnormalities on a single-plane angiogram are described. Two distinct types of knowledge are identified: 'facts' knowledge, which is drawn from various areas of science, and 'strategy' knowledge, which determines how the facts should be used. It is suggested that confidence parameters should be associated with each type of knowledge and that these should be used in the control of the analysis of an angiogram. The 'facts' and 'strategy' knowledge bases have a hierarchical structure which, if exploited, would enable the system to be easily adapted for use in other application areas.

5.
When it comes time to write some computer code, the hardware of choice today is a workstation running the Unix operating system. This paper considers the problem of new users faced with Unix. It presents a number of possible solutions and elaborates on one which provides a graphical help system. This help system may be accessed via a pointing device, which leads initially to a picture of a help structure; nodes in the structure may be selected to display the associated help text. The texts are structured to address the needs of differing users, from 'new' through to 'competent'.
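A minimal sketch of such a help structure (node names and help texts invented) is a tree of nodes, each selectable to display text matched to the user's level:

```python
# Hypothetical help tree: topics form a tree that can be drawn as a picture,
# a node is selected with a pointing device, and the node's text is chosen
# to suit the user's level, from 'new' through to 'competent'.

class HelpNode:
    def __init__(self, title, texts, children=()):
        self.title = title
        self.texts = texts            # {'new': ..., 'competent': ...}
        self.children = list(children)

    def show(self, level="new"):
        """Return the help text for this node at the given user level."""
        return self.texts.get(level, self.texts["new"])

root = HelpNode(
    "Unix",
    {"new": "Unix is the operating system running this workstation."},
    children=[
        HelpNode("Files", {
            "new": "Your work is stored in files; 'ls' lists them.",
            "competent": "'ls -l' shows permissions, owner, size and modification time.",
        }),
        HelpNode("Editing", {
            "new": "Use an editor such as vi or emacs to change a file.",
        }),
    ],
)

# Selecting the 'Files' node as a competent user:
print(root.children[0].show(level="competent"))
```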

6.
ANTS: Agents on Networks, Trees, and Subgraphs
Efficient exploration of large networks is a central issue in data mining and network maintenance applications. In most existing work there is a distinction between the active 'searcher', which both executes the algorithm and holds the memory, and the passive 'searched graph' over which the searcher has no control at all. Large dynamic networks like the Internet, where the nodes are powerful computers and the links have narrow bandwidth and are heavily loaded, call for a different paradigm, in which a noncentralized group of one or more lightweight autonomous agents traverse the network in a completely distributed and parallelizable way. Potential advantages of such a paradigm would be fault tolerance against network and agent failures, and reduced load on the busy nodes due to the small amount of memory and computing resources required by the agent in each node. Algorithms for network covering based on this paradigm could be used in today's Internet as a support for data mining and network control algorithms. Recently, a vertex ant walk (VAW) method has been suggested [I.A. Wagner, M. Lindenbaum, A.M. Bruckstein, Ann. Math. Artificial Intelligence 24 (1998) 211–223] for searching an undirected, connected graph by an a(ge)nt that walks along the edges of the graph, occasionally leaving 'pheromone' traces at nodes, and using those traces to guide its exploration. It was shown there that the ant can cover a static graph within time nd, where n is the number of vertices and d the diameter of the graph. In this work we further investigate the performance of the method on dynamic graphs, where edges may appear or disappear during the search process. In particular we prove that (a) if a certain spanning subgraph S is stable during the period of covering, then the method is guaranteed to cover the graph within time n·d_S, where d_S is the diameter of S, and (b) if a failure occurs on each edge with probability p, then the expected cover time is bounded from above by nd(log Δ / log(1/p) + (1+p)/(1−p)), where Δ is the maximum vertex degree in the graph. We also show that (c) if G is a static tree then it is covered within time 2n.
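A simplified sketch of the pheromone-guided walk (static graph only, arbitrary tie-breaking, and not a faithful reproduction of the cited VAW algorithm) is:

```python
# Sketch of the vertex-ant-walk idea on a static undirected graph: each node
# carries a 'pheromone' time-stamp, and the agent repeatedly moves to the
# least recently stamped neighbour.

def vertex_ant_walk(adjacency, start, max_steps=10_000):
    """Walk until every vertex has been visited; return the visit order."""
    stamp = {v: 0 for v in adjacency}        # 0 means 'never visited'
    order, visited = [], set()
    current, time = start, 0
    while len(visited) < len(adjacency) and time < max_steps:
        time += 1
        stamp[current] = time                # leave a pheromone trace
        visited.add(current)
        order.append(current)
        # move to the neighbour with the oldest (smallest) stamp
        current = min(adjacency[current], key=lambda v: stamp[v])
    return order

if __name__ == "__main__":
    graph = {
        "a": ["b", "c"], "b": ["a", "d"],
        "c": ["a", "d"], "d": ["b", "c"],
    }
    print(vertex_ant_walk(graph, "a"))       # e.g. ['a', 'b', 'd', 'c']
```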

7.
We introduce in this paper a nonlinear model predictive control scheme for open-loop stable systems subject to input and state constraints. Closed-loop stability is guaranteed by an appropriate choice of the finite prediction horizon, independent of the specification of the desired control performance. In addition, this control scheme is likely to allow 'real time' implementation, because of its computational attractiveness. The theoretical results are demonstrated and discussed with a CSTR control application.
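The scheme itself is not reproduced here, but the receding-horizon pattern it builds on can be sketched for a toy stable scalar plant with an input constraint (the plant model, cost weights and input grid below are invented):

```python
import itertools

# A minimal receding-horizon (MPC) loop for illustration only: a stable scalar
# plant x+ = 0.8*x + u, input constraint |u| <= 1, finite prediction horizon,
# and a brute-force search over a coarse grid of input sequences. The paper's
# scheme (nonlinear plant, state constraints, stability-based horizon choice)
# is far more general.

def predict_cost(x, inputs, setpoint):
    cost = 0.0
    for u in inputs:
        x = 0.8 * x + u                       # simple stable plant model
        cost += (x - setpoint) ** 2 + 0.1 * u ** 2
    return cost

def mpc_step(x, setpoint, horizon=3, grid=(-1.0, -0.5, 0.0, 0.5, 1.0)):
    """Return the first input of the best constrained input sequence."""
    best = min(itertools.product(grid, repeat=horizon),
               key=lambda seq: predict_cost(x, seq, setpoint))
    return best[0]

x = 0.0
for _ in range(10):                           # closed loop: apply first input, repeat
    u = mpc_step(x, setpoint=2.0)
    x = 0.8 * x + u
print(round(x, 3))                            # state moves toward the setpoint
```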

8.
This paper considers the problem of how luggage should be assigned to each truck in a transportation system consisting of a depot, a fixed service area and two types of luggage, here called the scheduling problem. The main purpose of this paper is to propose a procedure for this problem, subject to keeping the workloads balanced among the truck drivers. The procedure is based on three heuristic rules: replacing the address of each item of luggage with a 'conventional address', converting the size of each item into a 'weight', and introducing a measure to keep the workloads balanced. The procedure consists of three stages according to the 'priority' of the types of luggage. A case study is presented that demonstrates the practical usefulness of the procedure.
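The paper's three-stage procedure is not reproduced here; the following sketch only illustrates the underlying idea of converting sizes into 'weights' and balancing loads across trucks, with an invented size-to-weight table and a greedy least-loaded assignment:

```python
import heapq

# Greedy load-balancing sketch (not the authors' procedure): each luggage item
# is converted to a 'weight' and items are assigned, in order of priority and
# decreasing weight, to the currently least-loaded truck.

SIZE_TO_WEIGHT = {"small": 1, "medium": 2, "large": 4}

def assign(items, n_trucks):
    """items: list of (item_id, size, priority); returns {truck: [item_id, ...]}."""
    heap = [(0, truck) for truck in range(n_trucks)]   # (current load, truck)
    heapq.heapify(heap)
    plan = {truck: [] for truck in range(n_trucks)}
    ordered = sorted(items, key=lambda it: (it[2], -SIZE_TO_WEIGHT[it[1]]))
    for item_id, size, _priority in ordered:
        load, truck = heapq.heappop(heap)              # least-loaded truck
        plan[truck].append(item_id)
        heapq.heappush(heap, (load + SIZE_TO_WEIGHT[size], truck))
    return plan

print(assign([("p1", "large", 1), ("p2", "small", 1),
              ("p3", "medium", 2), ("p4", "large", 2)], n_trucks=2))
```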

9.
Designers in general have used diagrams and sketches to help in the process of creation. This is particularly so for system designers whose output is a set of programs. It would seem reasonable that converting diagrams directly into a program would be desirable, and yet the work of Green and Petre [3–4, 13] and Citrin [2] has placed doubt on the viability of graphical programming notations. Some of this work is reviewed in this paper. The use of secondary notation and the match–mismatch hypothesis is reconsidered in the light of functional programming. It is proposed that much of the criticism of graphical notation is due to the imperative (or process-oriented) nature of programming. Many of the limitations observed in using graphical notation are lifted when functional programming is used. Eight engineering dimensions and four engineering relationships (coherences) are proposed to describe programming environments (including notation). The source of 'knotty structures' is identified as embedded 'if then else' or 'if' statements. On analysing both imperative and functional programs, it was found that imperative programs used an order of magnitude more 'if' statements than functional programs. The key to the success of a functional language as a general representation, as well as its coherence with a graphical notation, comes from its unique extensibility. Support for these arguments is drawn from examples of a schematic programming language used for industrial-scale projects. It is concluded that the marriage between a functional language and its graphical representation overcomes most of the original criticisms of graphical programming. It is demonstrated that this combination makes a viable and expressive tool for industrial-sized applications.

10.
A technique for the selection of the best set of test features for checkout or go-no-go test of a complex electro-hydraulic servo system from input-output measurements is presented. The first step is to establish checkout tolerance bands on the system response. Then the measurement set based on gain and phase which would best discriminate between the 'healthy' and 'sick' systems is selected from an initially large set of sampled frequencies via an optimization procedure in which the feature efficiency vector is introduced to add or discard features until a satisfactory set is obtained. Finally the selected feature sets are assessed for effectiveness via a 'goodness' criterion.
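A rough illustration of the add-or-discard selection idea (using a simple Fisher-style separation measure in place of the feature efficiency vector, with invented measurements) is:

```python
import statistics

# Illustrative forward selection: candidate features (e.g. gain or phase at
# sampled frequencies) are added one at a time, keeping the feature whose
# healthy/sick class separation is currently largest.

def separation(healthy, sick):
    """Ratio of between-class distance to within-class spread for one feature."""
    spread = statistics.pstdev(healthy) + statistics.pstdev(sick) + 1e-9
    return abs(statistics.mean(healthy) - statistics.mean(sick)) / spread

def select_features(healthy_data, sick_data, n_features):
    """healthy_data / sick_data: {feature_name: [measurements]}."""
    remaining = set(healthy_data)
    chosen = []
    while remaining and len(chosen) < n_features:
        best = max(remaining, key=lambda f: separation(healthy_data[f], sick_data[f]))
        chosen.append(best)
        remaining.remove(best)
    return chosen

healthy = {"gain@1Hz": [1.0, 1.1, 0.9], "phase@1Hz": [-10, -12, -11], "gain@5Hz": [0.5, 0.55, 0.45]}
sick    = {"gain@1Hz": [1.05, 1.0, 1.1], "phase@1Hz": [-25, -27, -24], "gain@5Hz": [0.2, 0.25, 0.15]}
print(select_features(healthy, sick, n_features=2))   # ['phase@1Hz', 'gain@5Hz']
```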

11.
Animation and simulation processes are facilitated by the use of high-level graphic languages. The results of these processes are not generally available in real time, since the developing of microfilm delays the screening of the process until some time after the computer run. A technique is described which overcomes this problem whilst still allowing the use of a high-level graphical language. The addition of a single feature to a 'static' graphical language has transformed it into a 'dynamic' graphical language allowing real-time illustration of time-varying processes. The technique is not restricted to the language described but may well be employed by other high-level graphical languages.

12.
The European automotive industry requires frequent interaction and transfer of data between geographically dispersed designers and engineers at all stages of the product introduction process. The RACE CAR project identified and demonstrated Integrated Broadband Communications (IBC)-supported applications to support this process and improve competitiveness. User requirements for workstation-based, multi-media facilities, including conferencing, were identified. Two experiments were designed to investigate the role of face-to-face video and the means by which participants organise and control their interactions. These are critical issues in the multi-cultural, international environment of the European automotive industry. In the first experiment, groups of three users solved a cooperative, screen-based, object manipulation task supported by different levels of communication. 'Linked computers plus an audio link' resulted in significantly faster completion times than either 'audio alone' or 'linked computers plus audio and face-to-face video'. 'Linked computers plus audio' was also perceived as the most effective communications medium. The passing of cursor control via verbal agreement was successfully managed. Video was generally considered beneficial for initial introductions, assessing understanding and facilitating a stronger feeling of group identity.

In the second experiment, subjects were grouped under 'chaired' or 'free-for-all' conditions and linked via (1) audio and linked computers or (2) audio, linked computers and face-to-face video. The task was similar to that of Experiment 1, and attempts were made to introduce contention by adding hidden sub-goals. The task took significantly less time to complete in the 'video chaired' condition than in the 'non-video chaired' or 'video free-for-all' conditions. This suggests that video has an important role in enabling a chairperson to control the meeting. Contention was not successfully achieved.

The results of the experiments suggest that face-to-face video may be useful in chaired meetings and for developing a 'team' feeling. A free-for-all method of control passing was seen as most appropriate, although problems in achieving contention in Experiment 2 meant that the impact of disagreement was not fully investigated. The results are discussed in relation to the European automotive industry, and areas for further study are identified.

Relevance to industry

The European automotive industry, which maintains distinct engineering functions in disparate countries, is striving to reduce the length of its design life cycle by improving communications between designers and engineers. The studies described in this paper provide information of use to developers and procurers of systems intended to support this process, in particular on issues relating to the relevance of face-to-face video and the use of control mechanisms for co-operative computer-mediated work.


13.
A knowledge base containing incomplete information in the form of disjunctions and negative information presents difficulties for update operators. In this paper, simple and straightforward definitions are given for an 'adding' operator ('+') and a 'removing' operator ('−') using Herbrand models.
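One naive, set-based reading of such operators (not necessarily the paper's definitions) keeps the knowledge base as a set of Herbrand models and implements '+' and '−' as operations on those models:

```python
# Naive illustration: an incomplete knowledge base is kept as a set of Herbrand
# models, each a frozenset of ground atoms taken to be true. Adding a
# disjunction splits each model into one alternative per disjunct; removing an
# atom deletes it from every model.

def add(models, disjuncts):
    """'+' operator: assert that at least one of the disjuncts holds."""
    return {frozenset(model | {d}) for model in models for d in disjuncts}

def remove(models, atom):
    """'−' operator: retract the atom from every model."""
    return {frozenset(model - {atom}) for model in models}

kb = {frozenset({"employee(ann)"})}
kb = add(kb, ["works_in(ann, sales)", "works_in(ann, hr)"])   # disjunctive fact
print(len(kb))                      # 2 models: one per disjunct
kb = remove(kb, "employee(ann)")
print(kb)
```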

14.
Buried stormwater pipe networks play a key role in surface drainage systems for urban areas of Australia. The pipe networks are designed to convey water from rainfall and surface runoff only and do not transport sewage. The deterioration of stormwater pipes is commonly graded into structural and serviceability condition using CCTV inspection data, in order to recognize two different deterioration processes and their consequences. This study investigated the application of neural network modelling (NNM) in predicting serviceability deterioration, which is associated with reductions of pipe diameter up to a complete blockage. The outcomes of the NNM are predicted serviceability condition for individual pipes, which is essential for planning proactive maintenance programs, and a ranking of pipe factors that potentially contribute to serviceability deterioration. In this study, Bayesian weight estimation using Markov Chain Monte Carlo simulation was used to calibrate the NNM on a case study, in order to account for the uncertainty often encountered when calibrating an NNM with conventional back-propagation weight estimation. The performance and the ranked factors obtained from the NNM were also compared against a classical model using multiple discriminant analysis (MDA). The results showed that the predictive performance of the NNM using Bayesian weight estimation is better than that of the NNM using conventional back-propagation and of the MDA model. Furthermore, among nine input factors, 'pipe age' and 'location' appeared insignificant, whilst 'pipe size', 'slope', 'the number of trees' and 'climatic condition' were found consistently important across both models for the serviceability deterioration process. The remaining three factors, namely 'structure', 'soil' and 'buried depth', might be redundant factors. A better and more consistent data collection regime may help to improve the predictive performance of the NNM and identify the significant factors.
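The study's model is not reproduced here, but the contrast between conventional back-propagation and Bayesian weight estimation can be illustrated with a toy network whose weights are sampled by random-walk Metropolis MCMC (the data, network size and tuning constants below are invented):

```python
import numpy as np

# Toy illustration (not the study's model): a tiny one-hidden-layer network
# whose weights get a Gaussian prior and whose predictions get a Gaussian
# likelihood; the posterior over weights is sampled rather than point-estimated
# by back-propagation.

rng = np.random.default_rng(0)

def forward(w, x, n_hidden=3):
    """Unpack a flat weight vector and evaluate the network on inputs x."""
    n_in = x.shape[1]
    w1 = w[:n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = w[n_in * n_hidden:n_in * n_hidden + n_hidden]
    w2 = w[n_in * n_hidden + n_hidden:-1]
    b2 = w[-1]
    hidden = np.tanh(x @ w1 + b1)
    return hidden @ w2 + b2

def log_posterior(w, x, y, noise=0.2, prior_sd=2.0):
    residual = y - forward(w, x)
    log_likelihood = -0.5 * np.sum((residual / noise) ** 2)
    log_prior = -0.5 * np.sum((w / prior_sd) ** 2)
    return log_likelihood + log_prior

def metropolis(x, y, n_weights, n_samples=2000, step=0.05):
    """Random-walk Metropolis sampling of the network weights."""
    w = rng.normal(0.0, 0.1, n_weights)
    logp = log_posterior(w, x, y)
    samples = []
    for _ in range(n_samples):
        proposal = w + rng.normal(0.0, step, n_weights)
        logp_new = log_posterior(proposal, x, y)
        if np.log(rng.uniform()) < logp_new - logp:    # accept/reject step
            w, logp = proposal, logp_new
        samples.append(w.copy())
    return np.array(samples)

# Toy 'deterioration' data: one input variable, one continuous output.
x = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = 0.8 * x[:, 0] ** 2 + rng.normal(0.0, 0.05, 20)
n_weights = 1 * 3 + 3 + 3 + 1                          # w1, b1, w2, b2
samples = metropolis(x, y, n_weights)
# Posterior-mean prediction: average predictions over the last 500 samples.
predictions = np.array([forward(w, x) for w in samples[-500:]]).mean(axis=0)
print(np.round(predictions[:3], 3))
```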

15.
A new method of recording and analysing gross body movements has been developed for use in the working environment. The system uses a portable, hand-held micro-computer to create a permanent, real time based, record on cassette tape, with a larger personal computer for the subsequent analysis.

This new system of analysis is able to show the frequency and the duration of those activities relevant to spinal loading and, therefore, to possible musculo-skeletal problems. The reliability results are encouraging, and ways of improving the system as used are discussed. The method of data collection worked in 'real time' and proved easy to learn, and the subsequent analysis took very little time.


16.
This paper provides a new answer to the old problem of specifying the mixed partial derivatives (MPDs), or 'twist vectors', at the grid points for an interpolating surface over a rectangular network of curves. An algorithm is presented for finding the MPDs that minimize a generalized energy integral over the entire surface. The integrand may be any quadratic form in the second partial derivatives of the surface. This results in a surface design technique for interpolating over a network of curves by automatically selecting the optimal twist vectors at the grid points.
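For illustration only, one common instance of such a quadratic energy integral (a thin-plate style form; the paper allows any quadratic form in the second partials) and the resulting minimisation over the twist vectors can be written as:

```latex
% Illustrative quadratic energy over the surface S(u,v), with the twist vectors
% T (the values of S_uv at the grid points) as the free parameters.
E(T) = \iint_{\Omega} \Big( S_{uu}^{2} + 2\,S_{uv}^{2} + S_{vv}^{2} \Big)\,\mathrm{d}u\,\mathrm{d}v,
\qquad
T^{*} = \arg\min_{T} E(T).
```

Since E is quadratic in the twist vectors, setting its partial derivatives with respect to the components of T to zero yields a linear system whose solution gives the optimal twists.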

17.
It is argued that the backpropagation learning algorithm is unsuited to tackling real-world problems such as sensory-motor coordination learning or the encoding of large amounts of background knowledge in neural networks. One difficulty in the real world, the unavailability of 'teachers' who already know the solution to problems, may be overcome by the use of reinforcement learning algorithms in place of backpropagation. It is suggested that the complexity of the search space in real-world neural network learning problems may be reduced if learning is divided into two components. One component is concerned with abstracting structure from the environment and hence with developing representations of stimuli. The other component involves associating and refining these representations on the basis of feedback from the environment. Time-dependent learning problems are also considered in this hybrid framework. Finally, an 'open systems' approach in which subsets of a network may adapt independently on the basis of spatio-temporal patterns is briefly discussed.

18.
In this work, an intelligent system for sheet metal part and process design has been developed, storing knowledge and prescribing ways to use this knowledge according to the 'programming in logic' paradigm. The sheet metal parts covered by the software are those having a U shape and manufactured by bending (folding), cutting and piercing, with particular emphasis on progressive dies. The uses envisaged, and the corresponding parts of the system, are: checking the part design for manufacturability, planning process phases, and checking the configuration of the press tools involved. Particular attention is paid to the presentation of knowledge that has been gathered from handbooks and verified/enhanced in industry. This is first presented in natural language, and then its formal representation in Prolog is described and explained by examples. Part design and press tool checking knowledge is relatively straightforward to represent and structure 'linearly'. Process planning knowledge is based on patterns that are captured in lists and activated in a case-by-case fashion, exploiting the power of Prolog. Validation of the system was conducted using examples from industry.

19.
The VPP system is a multi-vector processor system which mainly aims at effective satellite image processing. It consists of up to 64 element processors (PUs), an S-D loop network, and an image memory. The PUs can execute flexible vector processing by a new vector access method, 'Index-set'. The S-D loop network achieves high-speed and contention-free data transfer among the PUs. With these components, a new method for parallel processing, 'Processor Pipeline', can be realized on the VPP system.

20.
The Bounded Derivative Network (BDN), the analytical integral of a neural network, is a natural and elegant evolution of universal approximating technology for use in automatic control schemes. This modeling approach circumvents the many real problems associated with standard neural networks in control such as model saturation (zero gain), arbitrary model gain inversion, 'black box' representation and inability to interpolate sensibly in regions of sparse excitation. Although extrapolation is typically not an advantage unless the understanding of the process is complete, the BDN can incorporate process knowledge in order that its extrapolation capability is inherently sensible in areas of data sparsity. This ability to impart process knowledge on the BDN model enables it to be safely incorporated into a model based control scheme.
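A one-input sketch of the bounded-derivative idea (with invented parameters, and not the BDN formulation itself) defines the model as the analytical integral of a bounded, positive gain function, so that its gain stays within fixed bounds everywhere, including in extrapolation:

```python
import math

# Illustration of the bounded-derivative idea: the model is the analytical
# integral of a bounded 'gain' function, so dy/dx is guaranteed to stay inside
# [g_min, g_min + g_range] for all x, including far from the training data.

def gain(x, g_min=0.5, g_range=1.0, w=2.0, b=-1.0):
    """Bounded derivative: g_min <= gain(x) <= g_min + g_range for all x."""
    return g_min + g_range / (1.0 + math.exp(-(w * x + b)))

def model(x, g_min=0.5, g_range=1.0, w=2.0, b=-1.0, c=0.0):
    """Analytical integral of gain(): g_min*x + (g_range/w)*softplus(w*x+b) + c."""
    return g_min * x + (g_range / w) * math.log(1.0 + math.exp(w * x + b)) + c

for x in (-10.0, 0.0, 10.0):       # gain stays within its bounds even far from data
    print(round(gain(x), 3))
```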
