Similar Documents
20 similar documents found; search took 156 ms
1.
For the development of white-box unit-testing techniques, the JUnit unit-testing framework on the Eclipse platform is used to write correct unit test cases that verify the correctness of program modules (the smallest units of software design). By automating the tests, software quality is further improved.
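JUnit itself is a Java framework, but the xUnit pattern it embodies, one test method per expected behaviour of the smallest program unit, can be sketched in any language. A minimal illustration using Python's `unittest`, with a hypothetical `add` function standing in for the module under test:

```python
import unittest

# Hypothetical unit under test: the smallest unit of software design.
def add(a, b):
    return a + b

class TestAdd(unittest.TestCase):
    # Each test method checks one expected behaviour of the unit.
    def test_positive(self):
        self.assertEqual(add(2, 3), 5)

    def test_negative(self):
        self.assertEqual(add(-1, 1), 0)

# Run the suite programmatically (JUnit's test runner plays this role in Java).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestAdd)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In a continuous-integration setup, such a suite runs automatically on every change, which is the test automation the abstract refers to.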

2.
《硅谷》2011,(16):37-37
1. JUnit. JUnit is a unit-testing framework for the Java language. Created by Kent Beck and Erich Gamma, it has gradually become the most successful member of the xUnit family that grew out of Kent Beck's SUnit. JUnit has its own…

3.
Given the characteristics of sonar embedded software, namely real-time and multitasking operation, embedded software testing has its own features and difficulties. Based on software testing theory, combined with testing practice on the XX model, the author focuses on the theory and methods of unit testing and proposes how to use testing tools to perform structural coverage testing.

4.
Using an intelligent test system as a development example, this paper discusses the important role of controllable graphical interfaces in complex test systems and describes a technique for implementing such an interface.

5.
Mark Michaelis 《硅谷》2005,(6):123-129
Introduction: The newly released Visual Studio Test System (VSTS) includes a complete set of features for Visual Studio Team Test. Team Test is the unit-testing framework integrated into Visual Studio, and it supports:

6.
A new method for closed-loop testing of a helicopter autopilot is presented. MATLAB's Real-Time Windows Target toolbox can operate directly on system hardware resources. A hardware test environment was built from components such as a data-acquisition card and a digital-to-synchro converter, the system model was built in the Simulink toolbox, and a user interface was created with MATLAB's GUI tools. Thus, when developing a test system in MATLAB, its hardware-access capability can be exploited in addition to its data-processing and graphics facilities. Analysis of real-time test results verifies the feasibility and convenience of the method.

7.
No matter what techniques and methods are used, software will still contain errors. The purpose of software testing is to find errors in a program: to prove that the program has errors, not that it has none. One goal of software testing is to find errors as early as possible, which is why unit testing is so important. This paper focuses on the static testing techniques most commonly used in unit testing, and also discusses the use of dynamic techniques such as black-box and white-box testing.

8.
The concept of the micromouse was first proposed by IEEE Spectrum magazine in the 1970s; contests were subsequently held in the UK, the USA, Japan, Singapore, and other countries, and micromouse has become a world-class electronic robotics competition. A micromouse is a small robot vehicle that autonomously searches for the centre of a 16×16-cell maze. This paper surveys common search algorithms, absorbs ideas from existing ones, and proposes a practical algorithm suited to micromouse competition. To evaluate the algorithm, a test environment was built with Visual Studio, the algorithm was implemented, several mazes used in official competitions were set up for testing, and a graphical interface was used to visualize the simulation. Test results show that the algorithm has good real-time performance and robustness.
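The abstract does not spell out the algorithm, but micromouse solvers commonly rest on a flood-fill (breadth-first) distance map that is recomputed as walls are discovered. A minimal sketch under that assumption, with a hypothetical `flood_fill` helper and no walls discovered yet:

```python
from collections import deque

def flood_fill(n, goals, walls):
    """BFS distances from every cell to the nearest goal cell on an n x n grid.

    `walls` is a set of blocked (cell, neighbour) pairs (add both directions);
    a real micromouse grows this set as it senses walls and re-floods after
    each move, always stepping toward the neighbour with the smallest distance.
    """
    dist = {(r, c): None for r in range(n) for c in range(n)}
    queue = deque(goals)
    for g in goals:
        dist[g] = 0
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < n and 0 <= nc < n and dist[(nr, nc)] is None
                    and ((r, c), (nr, nc)) not in walls):
                dist[(nr, nc)] = dist[(r, c)] + 1
                queue.append((nr, nc))
    return dist

# Toy 4x4 maze with the 2x2 centre as the goal and no walls discovered yet.
d = flood_fill(4, [(1, 1), (1, 2), (2, 1), (2, 2)], set())
```

In the real 16×16 contest maze the same map is simply larger, and the re-flood after each wall discovery is what gives the mouse its adaptivity.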

9.
曹斌 《硅谷》2014,(2):27-29
Software testing is the foundation of software quality assurance; unit testing is an important phase of software testing, and the design of unit test cases is a key step within it. Taking one module of the XX-model embedded spaceborne software as an example, this article describes and discusses unit-testing methods in detail.

10.
Application and Design Analysis of Graphic Symbols in Digital Graphical Interfaces
何丽萍 《包装工程》2011,32(2):22-25
Taking the use of graphic symbols in daily life as examples, this paper analyses their advantages over text, from which the value of graphic symbols in digital graphical interfaces is derived. The distinct design characteristics of graphic symbols in various kinds of digital graphical interfaces are analysed, along with successful examples and common misuses of such symbols. On this basis, basic principles and methods for graphic-symbol design are proposed, and the ultimate design goals of graphic symbols in digital graphical interfaces are summarized.

11.
Motivated by applications to root-cause identification of faults in multistage manufacturing processes that involve a large number of tools or equipment at each stage, we consider multiple testing in regression models whose outputs represent the quality characteristics of a multistage manufacturing process. Because of the large number of input variables that correspond to the tools or equipment used, this falls in the framework of regression modeling in the modern era of big data. On the other hand, with quick fault detection and diagnosis followed by tool rectification, sparsity can be assumed in the regression model. We introduce a new approach to address the multiple testing problem and demonstrate its advantages over existing methods. We also illustrate its performance in an application to semiconductor wafer fabrication that motivated this development. Supplementary materials for this article are available online.
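The paper's own procedure is not reproduced here; as a point of reference, the standard Benjamini-Hochberg step-up procedure, a common baseline for controlling the false discovery rate in multiple testing problems like this one, can be sketched as:

```python
def benjamini_hochberg(pvals, alpha=0.10):
    """Indices of hypotheses rejected by the Benjamini-Hochberg procedure.

    Standard FDR baseline only; the article's method, tailored to sparse
    regression, is not reproduced here.
    """
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        # Step-up criterion: p_(rank) <= alpha * rank / m.
        if pvals[i] <= alpha * rank / m:
            k = rank  # remember the largest qualifying rank
    return sorted(order[:k])

# Hypothetical p-values, one per candidate tool effect.
rejected = benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.27, 0.60])
```

Each rejected index would correspond to a tool flagged as a candidate root cause, with the FDR controlled at the chosen level.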

12.
Environmental modeling is an important tool for understanding and managing complex environmental systems. Regardless of discipline, complete modeling includes a number of steps, ranging from conceptualization to application. However, modeling courses often focus on just one, or at most a few, of the steps and frequently are assignment-driven. Moreover, they often present numerical procedures as recipes, without regard to theory or limitations and without regard to “real-world” application. In this course, we unify the modeling sequence by exposing students to the full spectrum of modeling (mathematics, physics, and numerical methods) in the context of surfactant-enhanced remediation of contaminated soils, a technology being developed at the University of Oklahoma. An innovative course structure is used that couples team learning with a project-driven syllabus (also referred to as just-in-time learning) and combines mathematical and physical modeling via data from laboratory and field testing. Other thematic areas can easily be developed in the same framework. We believe the course pedagogy is highly portable and can serve as an example for any modeling course or for many other courses in an engineering curriculum.

13.
The use of fiber optics in in vitro dissolution testing opens up new possibilities for more powerful data evaluation since an entire UV-Vis spectrum can be collected at each measuring point. This paper illustrates a multivariate chemometric approach to the solution of problems of interfering absorbance of excipients in in vitro dissolution testing. Two different chemometric approaches are tested: multivariate calibration using partial least squares (PLS) regression and curve resolution using multivariate curve resolution alternating least squares (MCR-ALS), generalized rank annihilation (GRAM), and parallel factor analysis (PARAFAC). Multivariate calibration (PLS) can, following the construction of a calibration model from a calibration sample set, give selective and accurate determinations of the active ingredient in dissolution testing despite the presence of interfering absorbance from excipients. Curve resolution (MCR-ALS, GRAM, or PARAFAC) can be applied to dissolution testing data in order to determine the dissolution rate profiles and spectra for the interfering excipients as well as for the active ingredient without any precalibration. The concept of the application of these chemometric methods to fiber-optic dissolution testing data is exemplified by analysis of glibenclamide tablets enclosed in hard gelatin capsules. The results show that, despite highly overlapping spectra and unresolved raw data, it is possible with PLS to obtain an accurate dissolution rate profile of glibenclamide. Applying curve resolution makes it possible to obtain accurate estimates of both dissolution rate profiles and spectra of both the gelatin capsule and the glibenclamide. The application of multivariate chemometric methods to fiber-optic dissolution testing brings a fresh scope for a deeper understanding of in vitro dissolution testing, solving the problem of interfering absorbance of excipients and making it possible to obtain dissolution rate profiles and spectra of these. 
Obtaining dissolution rate profiles of multiple active pharmaceutical ingredients in tablets consisting of several active compounds is another possibility.
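As a toy illustration of the unmixing idea (not the paper's MCR-ALS, GRAM, or PARAFAC algorithms, which also estimate the pure spectra without precalibration), a least-squares solve for two component concentrations when the pure-component spectra are assumed known:

```python
def unmix(mixture, s1, s2):
    """Least-squares concentrations of two known pure-component spectra.

    Simplified stand-in for curve resolution: here the pure spectra s1, s2
    are assumed known, and mixture ~ c1*s1 + c2*s2 is solved via the 2x2
    normal equations, in pure Python.
    """
    a11 = sum(x * x for x in s1)
    a22 = sum(x * x for x in s2)
    a12 = sum(x * y for x, y in zip(s1, s2))
    b1 = sum(x * y for x, y in zip(s1, mixture))
    b2 = sum(x * y for x, y in zip(s2, mixture))
    det = a11 * a22 - a12 * a12
    c1 = (b1 * a22 - b2 * a12) / det
    c2 = (a11 * b2 - a12 * b1) / det
    return c1, c2

# Synthetic "drug" and "capsule" absorbance spectra at three wavelengths;
# noiseless mixture with true concentrations 0.7 and 0.3 (hypothetical data).
s1, s2 = [1.0, 0.5, 0.1], [0.2, 0.6, 0.9]
mix = [0.7 * a + 0.3 * b for a, b in zip(s1, s2)]
c1, c2 = unmix(mix, s1, s2)
```

Repeating such a solve at every sampling time point of a fiber-optic run yields a concentration-versus-time curve, which is the dissolution rate profile the paper extracts with far more general multivariate methods.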

14.
The application of moment methods (MMs) to eddy-current testing problems for nondestructive evaluation (NDE) is considered. A general formulation for the MM that can be used to analyze NDE problems is derived, and calculated results and experimental data obtained from eddy-current testing of an artificially made sample are presented. Good agreement between the calculated results and the experimental data confirms the validity of the method and shows that the MM can be used as an alternative to the finite-element method (FEM) and the boundary-element method (BEM) in NDE.

15.
A sound methodology for the elicitation of subjective expert judgement is a prerequisite for specifying prior distributions for the parameters of reliability growth models. In this paper, we describe an elicitation process that is designed to ensure valid data are collected, by suggesting how possible bias might be identified and managed. As well as discussing the theory underpinning the elicitation process, the paper gives practical guidance concerning its implementation during reliability growth testing. The collection of subjective data using the proposed elicitation process is embedded within a Bayesian reliability growth modelling framework, and reflections upon its practical use are described.

16.
Data privacy laws require service providers to inform their customers on how user data is gathered, used, protected, and shared. The General Data Protection Regulation (GDPR) is a legal framework that provides guidelines for collecting and processing personal information from individuals. Service providers use privacy policies to outline the ways an organization captures, retains, analyzes, and shares customers’ data with other parties. These policies are complex and written using legal jargon; therefore, users rarely read them before accepting them. There exist a number of approaches to automating the task of summarizing privacy policies and assigning risk levels. Most of the existing approaches are not GDPR compliant and use manual annotation/labeling of the privacy text to assign risk level, which is time-consuming and costly. We present a framework that helps users see not only data practice policy compliance with GDPR but also the risk levels to privacy associated with accepting that policy. The main contribution of our approach is eliminating the overhead cost of manual annotation by using the most frequent words in each category to create word-bags, which are used with Regular Expressions and Pointwise Mutual Information scores to assign risk levels that comply with the GDPR guidelines for data protection. We have also developed a web-based application to graphically display risk level reports for any given online privacy policy. Results show that our approach is not only consistent with GDPR but performs better than existing approaches by successfully assigning risk levels with 95.1% accuracy after assigning data practice categories with an accuracy rate of 79%.
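The exact scoring pipeline is not given in the abstract; a minimal sketch of a Pointwise Mutual Information score between a word and a policy category, over a hypothetical mini-corpus of labelled policy snippets:

```python
from math import log2

def pmi(word, category, docs):
    """Pointwise mutual information between a word and a category label.

    `docs` is a list of (set_of_words, category) pairs.  Hypothetical
    mini-corpus; the paper combines such scores with regular expressions
    and word-bags of frequent terms per GDPR data-practice category.
    """
    n = len(docs)
    p_w = sum(1 for words, _ in docs if word in words) / n
    p_c = sum(1 for _, c in docs if c == category) / n
    p_wc = sum(1 for words, c in docs if word in words and c == category) / n
    # PMI = log2( P(word, category) / (P(word) * P(category)) )
    return log2(p_wc / (p_w * p_c)) if p_wc > 0 else float("-inf")

# Four tiny labelled snippets (hypothetical categories and vocabulary).
docs = [({"share", "third", "party"}, "sharing"),
        ({"retain", "store"}, "retention"),
        ({"share", "sell"}, "sharing"),
        ({"delete", "store"}, "retention")]
score = pmi("share", "sharing", docs)
```

A positive score means the word co-occurs with the category more often than chance, which is what lets frequent words act as category indicators without manual annotation.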

17.
Though the analysis of variance is a commonly applied method for testing for differences between means of several processes, it is based in part on the assumption that the processes give rise to output that is normally distributed on the measured variable. Reliability and life test studies frequently give birth to data that exhibit clear skew, and application of the analysis of variance is questionable in such cases. A method referred to as analysis of reciprocals, which is based on an assumed inverse Gaussian distribution, provides an alternative to the analysis of variance in these instances. With applications in a variety of functional areas, including reliability and life testing, the inverse Gaussian distribution is able to accommodate substantial skew. It is hoped that this exposition will increase awareness of both the inverse Gaussian distribution and data analysis methods that are based on this distribution.
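As a sketch of the quantities involved: the maximum-likelihood estimates of the inverse Gaussian mean mu and shape lambda have simple closed forms, with the shape estimate driven by the reciprocal residuals 1/x_i - 1/mu_hat that give the analysis of reciprocals its name. A minimal illustration on hypothetical skewed lifetime data:

```python
def inverse_gaussian_mle(x):
    """Closed-form MLEs of the inverse Gaussian mean mu and shape lam.

    mu_hat  = sample mean
    lam_hat = n / sum(1/x_i - 1/mu_hat)
    The reciprocal residuals 1/x_i - 1/mu_hat underlying lam_hat are the
    quantities that the analysis of reciprocals decomposes, in the way the
    analysis of variance decomposes squared deviations.
    """
    n = len(x)
    mu = sum(x) / n
    lam = n / sum(1.0 / xi - 1.0 / mu for xi in x)
    return mu, lam

# Hypothetical right-skewed lifetimes (e.g. hours to failure).
mu, lam = inverse_gaussian_mle([1.0, 2.0, 4.0])
```

A small shape estimate relative to the mean signals strong skew, exactly the regime where the normality assumption behind the analysis of variance breaks down.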

18.
Performance measurement systems (PMS) have commonly been applied to evaluate and reward performances at managerial levels, especially in the context of supply chain management. However, evidence suggests that the effective use of PMS can also positively influence the behaviour and improve performance at an operational level. The motivation is to accomplish organisational goals, namely to increase supply chain flexibility by responding to evermore-varying customer demands in a timely manner. The purpose of the study described in this paper was to develop a conceptual framework that adopts performance measures for ex-ante decision-making at an operational level within the supply chain. To guide the research, five questions were asked and subsequently key gaps have been identified. In an attempt to fill the gaps, a case study at a major global brand beverage company has been carried out, and as a result, a conceptual framework of the PMS has been developed. Overall, the research offers a foundation of the applicability and impact of PMS in the supply chain and provides a framework that attends to some of the potential uses of PMS that so far have not been practically applied. The outcomes from the testing indicate that the initial gaps identified in the literature have been addressed and that the framework is judicious with scope for practical applicability. The framework is deemed worthy of further testing in different operational contexts of the supply chain.

19.
Embedded software is among the hardest kinds of software to test. This paper proposes a reliability-testing environment framework for embedded software and analyses the similarities and differences between embedded and ordinary software testing. To address the difficulty of testing embedded software, simulation-based and automated testing methods are applied, with the TestQuest tool assisting in building the environment. The paper focuses on the functions of each module in the framework, the layered implementation of the real-time controller, and how to write the communication protocol between the host computer and the real-time controller. Finally, an example verifies that the platform is adaptable and easy to operate.

20.
Using a graphics processing unit (GPU) instead of a traditional central processing unit (CPU) as the numerical-solution hardware, a real-time hybrid substructure testing architecture based on LabVIEW, MATLAB, and the GPU is established. Taking a soil-structure interaction system as the test bed, the performance of the architecture is verified through numerical simulation and experiments. The results show that the method raises the number of degrees of freedom solvable in the numerical substructure of a real-time hybrid test to 27,000, increasing the solvable model size and extending…
