Similar Literature
20 similar documents found (search time: 15 ms)
1.
We describe tools for the automatic identification and classification of diatoms that compare photographs with other photographs and drawings via a model. Identification of diatoms, i.e. assigning a new specimen to one of the known species, has applications in many disciplines, including ecology, palaeoecology and forensic science. The model we build represents the life-cycle and natural variation of both shape and texture over multiple diatom species, derived automatically from photographs and/or drawings. The model can be used to automatically produce drawings of diatoms at any stage of their life-cycle development. Similar drawings are traditionally used for diatom identification and encapsulate visually salient diatom features. In this article, we describe the methods used for the analysis of photographs and drawings, present our model of diatom shape and texture variation, and finish with the results of identification experiments using photographs and drawings, as well as a detailed evaluation.

2.
3.
The purpose of this study is to apply shape optimization tools to the design of resistance welding electrodes. The numerical simulation of the welding process has been performed with a simplified FEM model implemented in COMSOL. The design process is formulated as an optimization problem where the objective is to prolong the lifetime of the electrodes. Welding parameters such as current, time and electrode shape parameters are selected as design variables, while constraints are chosen to ensure a high quality of the welding. Surrogate models based on a Kriging approximation have been used in order to simplify the calculation of shape sensitivities and to generate a generic tool that can be interfaced with other simulation tools. A numerical example study shows the potential of applying optimal design techniques in this area. Part of this work was presented at WCSMO7 in Seoul, Korea, May 21–25, 2007, in the paper titled ‘Some optimization aspects of resistance welding’ (CD-ROM, pp 2687–2695).
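To illustrate the surrogate-modeling idea in this abstract, here is a minimal sketch of a Kriging-style interpolating surrogate with a Gaussian correlation model. The objective function `f`, the design points, and the correlation parameter `theta` are all made-up stand-ins for the actual welding simulation, not the authors' implementation:

```python
import numpy as np

def f(x):
    # Hypothetical "expensive" simulation response (stand-in only)
    return np.sin(3 * x) + 0.5 * x

# Sample the expensive model at a few design points
x_train = np.linspace(0.0, 2.0, 8)
y_train = f(x_train)

theta = 5.0  # assumed correlation parameter

def corr(a, b):
    # Gaussian correlation between two sets of 1D design points
    return np.exp(-theta * (a[:, None] - b[None, :]) ** 2)

# Solve for the surrogate weights (small jitter for conditioning)
R = corr(x_train, x_train) + 1e-10 * np.eye(len(x_train))
weights = np.linalg.solve(R, y_train)

def surrogate(x):
    # Cheap prediction anywhere in the design space
    return corr(np.atleast_1d(x), x_train) @ weights

# An interpolating surrogate reproduces the training data
err = abs(surrogate(x_train) - y_train).max()
print(f"max error at design points: {err:.2e}")
```

Because the surrogate is smooth and cheap to evaluate, shape sensitivities for the optimizer can be obtained from it instead of from repeated FEM runs.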

4.
This paper addresses in an integrated and systematic fashion the relatively overlooked but increasingly important issue of measuring and characterizing the geometrical properties of nerve cells and structures, an area often called neuromorphology. After discussing the main motivation for such an endeavour, a comprehensive mathematical framework for characterizing neural shapes, capable of expressing variations over time, is presented and used to underline the main issues in neuromorphology. Three particularly powerful and versatile families of neuromorphological approaches, including differential measures, symmetry axes/skeletons, and complexity, are presented and their respective potentials for applications in neuroscience are identified. Examples of applications of such measures are provided based on experimental investigations related to automated dendrogram extraction, mental retardation characterization, and axon growth analysis.

5.
The dimensional variations of components or detail parts in a complex assembly can each be within acceptable specifications, yet when the parts are put together, a proper fit of the final assembly cannot be obtained, and it becomes difficult to locate the root cause of the assembly misfit. Principal component analysis, a multivariate statistical method, can identify the principal variables, i.e. the major parameters that signify the misfit condition. Depending on the correlation strength of the original variables, it is possible to represent the variation characteristics of 20 or 30 variables by only two or three principal components. The method is simple to use and can save time and money by accelerating the identification of assembly problems.
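The dimensionality reduction described above can be sketched in a few lines. This is an illustrative example on synthetic data (not the paper's data): 20 correlated dimension measurements driven by 2 latent factors collapse to a couple of principal components:

```python
import numpy as np

# Synthetic "assembly measurements": 200 assemblies, 20 measured
# dimensions, all driven by only 2 underlying latent factors.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))
loadings = rng.normal(size=(2, 20))
X = latent @ loadings + 0.05 * rng.normal(size=(200, 20))

# Classical PCA: center the data, eigendecompose the covariance
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Fraction of total variation captured by the first 2 components
explained = eigvals[:2].sum() / eigvals.sum()
print(f"first 2 PCs explain {explained:.1%} of total variance")
```

Because the 20 variables are strongly correlated, the first two components capture nearly all the variation, which is exactly the compression the abstract describes.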

6.
7.
A new graphical tool (Multimedia University’s RSIMANA—Remote-Sensing Image Analyzer) developed for image analysis is described in this paper. MATLAB and ENVI are among the commercially available tools that aid in image processing and analysis, but their current versions are of limited assistance in image analysis; for example, MATLAB can extract the area of irregular objects and patterns in images, but not their length, and ENVI is more focused on image processing than on image analysis functions. Other commercially available tools are prohibitively expensive. This indicates the need to develop a user-friendly graphical tool that meets research objectives in the educational environment. The text was submitted by the author in English. Hema Nair. Born 1965. Educational qualifications: B.Tech. (Electrical Engineering) from Government Engineering College affiliated to the University of Calicut, Kerala State, India, 1986; MSc (Electrical Engineering) from the National University of Singapore, 1993; MSc (Computer Science) from Clark Atlanta University, USA, 1996. Previous employment: researcher and project leader at AT&T, New Jersey, USA, for about 5 years; before that, teaching faculty at Apple Information Technology Ltd. in Bangalore, India. Current employment: lecturer, Faculty of Engineering and Technology, Multimedia University, Malaysia. Current research: the final stages of her PhD in Computer Science at Multimedia University. Scientific interests: image analysis, pattern recognition, databases, AI, data mining. Member of IEEE (USA) since 1997, Professional Member of ACM (USA) since 1997, Member of the Institution of Engineers (India) since 1986. Reviewer for the IASTED International Conference 2004. Her current PhD project, entitled “Pattern Extraction and Concept Clustering in Linguistic Terms from Mined Images”, is funded by an Intensive Research in Priority Area (IRPA) grant from the Government of Malaysia. Research for her MSc in Computer Science in the USA was funded by a research grant from the US Army. Author of three international conference papers accepted in Portugal, Belgium, and India.

8.
In a distributed development environment, the display and analysis of project data are complicated by heterogeneous environments. The authors discuss the WebME visualization tool, which gathers disparate development data collected from distributed environments and displays it using Web technology.

9.
IEEE Software, 1992, 9(1): 47–54
GMA, a generic graphical modeling and analysis package that satisfies both commercial users and performance-evaluation researchers, is described. Using a graphical interface, both novices and experts can analyze the performance of diverse networks simply by loading a configuration file and traffic data. GMA accommodates a variety of networks and modeling techniques. Its structure and use are described, and a simple four-node network is analyzed as an example.

10.
Pattern Analysis and Applications - This paper presents a novel method for handling the effects of shape outliers in statistical shape analysis. Usually performed by a variant of classical...

11.
12.
Program mutation is a fault-based technique for measuring the effectiveness of test cases; although powerful, it is computationally expensive. The principal expense of mutation is that many faulty versions of the program under test, called mutants, must be created and repeatedly executed. This paper describes a tool, called JavaMut, that implements 26 traditional and object-oriented mutation operators to support mutation analysis of Java programs. The current version of the tool is based on syntactic analysis and reflection for implementing the mutation operators. JavaMut is interactive; it provides a graphical user interface to make mutation analysis faster and less painful. Thanks to such automated tools, mutation analysis can be achieved at reasonable cost.

13.
This paper introduces a new benchmark study to evaluate the performance of landmark-based shape correspondence used for statistical shape analysis. Different from previous shape-correspondence evaluation methods, the proposed benchmark first generates a large set of synthetic shape instances by randomly sampling a given statistical shape model that defines a ground-truth shape space. We then run a test shape-correspondence algorithm on these synthetic shape instances to identify a set of corresponded landmarks. According to the identified corresponded landmarks, we construct a new statistical shape model, which defines a new shape space. We finally compare this new shape space against the ground-truth shape space to determine the performance of the test shape-correspondence algorithm. We introduce three new performance measures that are landmark-independent to quantify the difference between the ground-truth and the newly derived shape spaces. By introducing a ground-truth shape space defined by a statistical shape model and three new landmark-independent performance measures, we believe the proposed benchmark allows for a more objective evaluation of shape correspondence than previous methods. In this paper, we focus on developing the proposed benchmark for 2D shape correspondence; however, it can easily be extended to 3D.

14.
15.
Recent advances in modeling tools enable non-expert users to synthesize novel shapes by assembling parts extracted from model databases. A major challenge for these tools is to provide users with relevant parts, which is especially difficult for large repositories with significant geometric variations. In this paper we analyze unorganized collections of 3D models to facilitate explorative shape synthesis by providing high-level feedback on possible synthesizable shapes. By jointly analyzing arrangements and shapes of parts across models, we hierarchically embed the models into low-dimensional spaces. The user can then use the parameterization to explore the existing models by clicking in different areas or by selecting groups to zoom in on specific shape clusters. More importantly, any point in the embedded space can be lifted to an arrangement of parts to provide an abstracted view of possible shape variations. The abstraction can further be realized by appropriately deforming parts from neighboring models to produce synthesized geometry. Our experiments show that users can rapidly generate plausible and diverse shapes using our system, which also performs favorably with respect to previous modeling tools.

16.
Most private and public organizations have recently turned their attention to the process by which they operate, to improve service and product quality and customer satisfaction. To support business process reengineering, methods and tools for process modeling and analysis are required. The paper presents the ARTEMIS methodology and associated tool environment for business process analysis for reengineering. In the ARTEMIS methodological framework, business processes are modeled as workflows and are analyzed according to an organizational structure perspective and an operational structure perspective. With these two perspectives, the analyst can plan reengineering interventions based on the degree of autonomy/dependency of organization units in terms of coupling, and the inter-process semantic correspondences, in terms of data and operation similarity, respectively. The ARTEMIS methodology and associated tool environment have been conceived and applied in the framework of the PROGRESS research project. In the paper, we report on a reengineering case study of this project involving the Italian Ministry of Justice.

17.
Pervasive systems are large-scale systems consisting of many sensors capturing numerous types of information. As this data is highly voluminous and high-dimensional, data analysis tasks can be extremely cumbersome and time-consuming. Enabling computers to recognise real-world situations is an even more difficult problem, involving not only data analysis but also consistency checking. Here we present Situvis, an interactive visualisation tool for representing sensor data and creating higher-level abstractions from the data. This paper builds on previous work (Clear et al. (2009) [8]) through evolved tool functionality and an evaluation of Situvis. A user trial consisting of 10 participants shows that Situvis can be used to complete the key tasks in the development process of situation specifications in over 50% less time than an improvised alternative toolset.

18.
Domain analysis is the process of identifying and documenting the common and variable characteristics of systems in a specific domain. This process is large and complex, involving many interrelated activities, making tool support essential. We present a domain analysis tool called ToolDAy whose purpose is to make the process semi-automatic. The requirements definition presented was based on the results of a systematic review that analyzed several existing tools. Furthermore, this article describes the tool's architecture, implementation and its evaluations (two controlled experiments and one industrial case study) with three different domains. The results of these evaluations indicate that the tool can help the domain analyst achieve systematic reuse in an effective way.

19.
The analysis of an assembly process in the discrete-part manufacturing industry usually involves a large number of dimensions. Each dimension tolerance plays a different role with respect to the dimension variation of the final product, the so-called “quality” (dimension precision). Furthermore, many random factors (noises) are present during the assembly operations (tool wear, loose fixtures, etc.). Few mathematical models can represent the assembly process; therefore, computer simulation has been employed. Variation Simulation Modeling uses the Monte Carlo sampling technique to simulate the assembly process. By applying a geometric standard and simulating the physical operations, the statistics of the final product dimensions can be predicted.

With the simulation results, statistical analysis is essential for identifying the critical factors (component dimensions). Traditional experimental designs, such as the full factorial design, however, are not practical since the number of factors is too large. The Taguchi method, which explores a special subset of factor combinations (called an orthogonal array), is able to examine a large number of factors (and interactions) in a much smaller number of experiments. Analysis of variance is performed to ensure the proper selection of significant factors. With this proposed unified tool, engineering understanding and judgement become more effective in making appropriate decisions regarding the product and process designs.
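The Monte Carlo variation simulation described in this abstract can be sketched for a simple linear tolerance stack-up. The nominal dimensions and tolerances below are invented for illustration, and the normal distribution with 3-sigma equal to the tolerance is an assumption, not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of simulated assemblies

# Component dimensions ~ normal; 3-sigma set equal to the tolerance
d1 = rng.normal(10.0, 0.10 / 3, n)   # nominal 10.0 ± 0.10
d2 = rng.normal(20.0, 0.20 / 3, n)   # nominal 20.0 ± 0.20
d3 = rng.normal(5.0, 0.05 / 3, n)    # nominal  5.0 ± 0.05

# Linear stack-up defining the final assembly dimension
gap = d1 + d2 - d3

# Predicted statistics of the final product dimension
mean, sigma = gap.mean(), gap.std()
print(f"assembly gap: mean = {mean:.3f}, 3-sigma = {3 * sigma:.3f}")
```

The predicted 3-sigma spread of the gap (about 0.23 here, versus 0.35 for a worst-case sum of tolerances) is exactly the kind of output that then feeds the Taguchi/ANOVA screening step to find which component tolerances matter most.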


20.
Objective: Information Retrieval (IR) is strongly rooted in experimentation, where new and better ways to measure and interpret the behavior of a system are key to scientific advancement. This paper presents an innovative visualization environment, the Visual Information Retrieval Tool for Upfront Evaluation (VIRTUE), which eases and makes more effective the experimental evaluation process.
Methods: VIRTUE supports and improves performance analysis and failure analysis. For performance analysis, VIRTUE offers interactive visualizations based on well-known IR metrics, allowing us to explore system performance and easily grasp the main problems of the system. For failure analysis, VIRTUE develops visual features and interaction allowing researchers and developers to easily spot critical regions of a ranking and grasp possible causes of a failure.
Results: VIRTUE was validated through a user study involving IR experts. The study reports on (a) the scientific relevance and innovation and (b) the comprehensibility and efficacy of the visualizations.
Conclusion: VIRTUE eases the interaction with experimental results, supports users in the evaluation process, and reduces the user effort.
Practice: VIRTUE will be used by IR analysts to analyze and understand experimental results.
Implications: VIRTUE improves the state of the art in evaluation practice and integrates the visualization and IR research fields in an innovative way.
