Similar Documents
20 similar documents found (search time: 15 ms)
1.
In 1995, Watts Humphrey introduced the Personal Software Process in his book, A Discipline for Software Engineering (Addison Wesley Longman, Reading, Mass.). Programmers who use the PSP gather measurements related to their own work products and the process by which they were developed, then use these measures to drive changes to their development behavior. The PSP focuses on defect reduction and estimation improvement as the two primary goals of personal process improvement. Through individual collection and analysis of personal data, the PSP shows how individuals can implement empirically guided software process improvement. The full PSP curriculum leads practitioners through a sequence of seven personal processes. The first and simplest PSP process, PSP0, requires that practitioners track time and defect data using a Time Recording Log and Defect Recording Log, then fill out a detailed Project Summary Report. Later processes become more complicated, introducing size and time estimation, scheduling, and quality management practices such as defect density prediction and cost-of-quality analyses. After almost three years of teaching and using the PSP, we have experienced its educational benefits. As researchers, however, we have also uncovered evidence of certain limitations. We believe that awareness of these limitations can help improve appropriate adoption and evaluation of the method by industrial and academic practitioners.
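The PSP0 logging described above can be pictured with a small sketch. The phase names, field names, and summary methods below are illustrative assumptions, not the PSP's official forms:

```python
from dataclasses import dataclass, field

@dataclass
class TimeEntry:
    phase: str          # e.g. "design", "code", "test" (hypothetical phase names)
    minutes: int

@dataclass
class DefectEntry:
    phase_injected: str
    phase_removed: str
    fix_minutes: int

@dataclass
class ProjectLog:
    # A PSP0-style pair of logs: one for time, one for defects.
    time_log: list = field(default_factory=list)
    defect_log: list = field(default_factory=list)

    def minutes_in_phase(self, phase):
        return sum(e.minutes for e in self.time_log if e.phase == phase)

    def defects_removed_in(self, phase):
        return sum(1 for d in self.defect_log if d.phase_removed == phase)

log = ProjectLog()
log.time_log.append(TimeEntry("code", 120))
log.time_log.append(TimeEntry("test", 45))
log.defect_log.append(DefectEntry("code", "test", 15))
```

A PSP0 Project Summary Report would aggregate exactly these kinds of per-phase totals.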

2.
This paper presents a case study examining the implementation of a change from ‘independent Software Quality Assurance testing’ to ‘cooperative testing’ and the resulting impact on the software life cycle. The same test tools have been used, so that any change can be primarily attributed to the process change. In the new process, bugs are caught and addressed earlier in the cycle. Additionally, early integration of the systematic test process coupled with a change in ownership, improved infrastructure and a more formal auditing of test plan execution has produced higher quality software with a more predictable release schedule. The new process has resulted in a higher rate of bug detection and correction during testing, as well as fewer bug reports from the field. © 1998 John Wiley & Sons, Ltd.

3.
Tony Gorschek  Claes Wohlin 《Software》2004,34(14):1311-1344
Software process improvement is a challenge in general and in particular for small- and medium-sized companies. Assessment is one important step in improvement. However, given that a list of improvement issues has been derived, it is often very important to be able to prioritize the improvement proposals and also look at the potential dependencies between them. This paper comes from an industrial need to enable prioritization of improvement proposals and to identify their dependencies. The need was identified in a small- and medium-sized software development company. Based on the need, a method for prioritization and identification of dependencies of improvement proposals was developed. The prioritization part of the method is based on a multi-criteria decision method, and the dependencies are identified using a dependency graph. The developed method has been successfully applied in the company, where people with different roles applied the method. The paper presents both the method as such and its successful application. It is concluded that the method worked as a means for prioritization and identification of dependencies. Moreover, the method also allowed the employees to discuss and reason about the improvement actions to be taken in a structured and systematic way. Copyright © 2004 John Wiley & Sons, Ltd.
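A rough sketch of the two-part idea, with a plain weighted-sum score standing in for the paper's multi-criteria decision method, and a dependency map consulted before ordering. All proposal names, weights, and scores are made up for illustration:

```python
# Hypothetical improvement proposals scored on two criteria; the paper's
# actual multi-criteria method and criteria are not reproduced here.
proposals = {
    "A": {"benefit": 8, "cost": 3},
    "B": {"benefit": 5, "cost": 2},
    "C": {"benefit": 9, "cost": 7},
}
# Dependency graph: B depends on A, so A must be addressed before B.
depends_on = {"A": [], "B": ["A"], "C": []}

def priority(p, w_benefit=0.7, w_cost=0.3):
    # Higher benefit and lower cost yield a higher priority score.
    return w_benefit * proposals[p]["benefit"] - w_cost * proposals[p]["cost"]

def ordered_proposals():
    # Rank by priority, then emit proposals in a simple topological pass
    # so that no proposal precedes one of its dependencies.
    ranked = sorted(proposals, key=priority, reverse=True)
    done, order = set(), []
    while ranked:
        for p in ranked:
            if all(d in done for d in depends_on[p]):
                order.append(p)
                done.add(p)
                ranked.remove(p)
                break
    return order
```

The point of the second pass is exactly the paper's: a high-priority proposal may still have to wait for a lower-priority one it depends on.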

4.
Drake  T. 《Computer》1996,29(11):78-87
The National Security Agency's (NSA) mission is to provide support for the security of the United States. Over the years, the Agency has become extremely dependent on the software that makes up its information technology infrastructure. NSA has come to view software as a critical resource upon which much of the world's security, prosperity, and economic competitiveness increasingly rests. To ensure cost-effective delivery of high-quality software, NSA has analyzed effective quality measures applied to a sample code base of 25 million lines. This case study dramatically illustrates the benefits of code-level measurement activities.

5.
Boehm  B. Li Guo Huang 《Computer》2003,36(3):33-41
The information technology field's accelerating rate of change makes feedback control essential for organizations to sense, evaluate, and adapt to changing value propositions in their competitive marketplace. Although traditional project feedback control mechanisms can manage the development efficiency of stable projects in well-established value situations, they do little to address the project's actual value, and can lead to wasteful misuse of an organization's scarce resources. The value-based approach to software development integrates value considerations into current and emerging software engineering principles and practices, while developing an overall framework in which these techniques compatibly reinforce each other.

6.
This paper describes the creation and initial implementation of a measurement programme for Eastman Kodak's Printer Products Division software engineering process. This measurement strategy is being applied to all appropriate programmes within the division. This case study describes one project where a simple set of metrics has improved the quality of our software products and the process used to develop the software. It describes the steps taken to create and implement the programme, and the successes and problems encountered during software development.

7.
Software evolution studies have traditionally focused on individual products. In this study we scale up the idea of software evolution by considering software compilations composed of a large quantity of independently developed products, engineered to work together. With the success of libre (free, open source) software, these compilations have become common in the form of ‘software distributions’, which group hundreds or thousands of software applications and libraries into an integrated system. We have performed an exploratory case study on one of them, Debian GNU/Linux, finding some significant results. First, Debian has been doubling in size every 2 years, totalling about 300 million lines of code as of 2007. Second, the mean size of packages has remained stable over time. Third, the number of dependencies between packages has been growing quickly. Finally, while C is still by far the most commonly used programming language for applications, use of the C++, Java, and Python languages has significantly increased. The study helps not only to understand the evolution of Debian, but also yields insights into the evolution of mature libre software systems in general.

Jesus M. Gonzalez-Barahona teaches and does research at Universidad Rey Juan Carlos, Mostoles (Spain). His research interests include libre software development, with a focus on quantitative and empirical studies, and distributed tools for collaboration in libre software projects. He works in the GSyC/LibreSoft research team. Gregorio Robles is Associate Professor at the Universidad Rey Juan Carlos, where he earned his PhD in 2006. His research interests lie in the empirical study of libre software, ranging from technical issues to those related to the human resources of the projects. Martin Michlmayr has been involved in various free and open source software projects for well over 10 years. He acted as the leader of the Debian project for two years and currently serves on the board of the Open Source Initiative (OSI). Martin works for HP as an Open Source Community Expert and acts as the community manager of FOSSBazaar. Martin holds Master's degrees in Philosophy, Psychology and Software Engineering, and earned a PhD from the University of Cambridge. Juan José Amor holds an M.Sc. in Computer Science from the Universidad Politécnica de Madrid and is currently pursuing a Ph.D. at the Universidad Rey Juan Carlos, where he is also a project manager. His research interests are related to libre software engineering, mainly effort and schedule estimates in libre software projects. Since 1995 he has collaborated in several libre software organizations; he is also co-founder of LuCAS, the best-known libre software documentation portal in Spanish, and Hispalinux, the biggest Spanish Linux user group. He also collaborates with and Linux+. Daniel M. German is associate professor of computer science at the University of Victoria, Canada. His main areas of interest are software evolution, open source software engineering and intellectual property.
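As a side note on the "doubling every 2 years" finding above: assuming exponential growth, the doubling time can be recovered from any two size measurements. The numbers below are illustrative, not the paper's data set:

```python
import math

def doubling_time(size_t0, size_t1, years):
    # Assuming exponential growth, solve size_t1 = size_t0 * 2**(years / T)
    # for the doubling time T.
    return years * math.log(2) / math.log(size_t1 / size_t0)

# Illustrative numbers only: a distribution growing from 75M to 300M lines
# over 4 years has a doubling time of 2 years.
t = doubling_time(75e6, 300e6, 4)
```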

8.
We present a case study of the use of a software process improvement method which is based on the analysis of defect data. The first step of the method is the classification of software defects using attributes which relate defects to specific process activities. Such classification captures the semantics of the defects in a fashion which is useful for process correction. The second step utilizes a machine-assisted approach to data exploration which allows a project team to discover such knowledge from defect data as is useful for process correction. We show that such analysis of defect data can readily lead a project team to improve their process during development.
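The first step, classifying defects by attributes that tie them to process activities, can be sketched as follows. The attribute names and records are hypothetical; the paper's actual classification scheme is richer:

```python
from collections import Counter

# Hypothetical defect records; each "injected_in" attribute ties the defect
# back to the process activity that produced it.
defects = [
    {"type": "interface", "injected_in": "design"},
    {"type": "logic", "injected_in": "coding"},
    {"type": "interface", "injected_in": "design"},
    {"type": "data", "injected_in": "requirements"},
]

def activities_to_correct(defects):
    # Rank process activities by how many defects they injected; the top of
    # this list is a candidate target for process correction.
    counts = Counter(d["injected_in"] for d in defects)
    return counts.most_common()
```

In this toy data set, design injected the most defects, so design-phase practices would be the first correction target.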

9.
Aspect-oriented software testing is emerging as an important alternative to conventional procedural and object-oriented testing techniques. This paper reports experiences from two case studies where aspects were used for the testing of embedded software in the context of an industrial application. In the first study, we used code-level aspects for testing non-functional properties. The methodology we used for deriving test aspect code was based on translating high-level requirements into test objectives, which were then implemented using test aspects in AspectC++. In the second study, we used high-level visual scenario-based models for the test specification, test generation, and aspect-based test execution. To specify scenario-based tests, we used a UML2-compliant variant of live sequence charts. To automatically generate test code from the models, a modified version of the S2A Compiler, outputting AspectC++ code, was used. Finally, to examine the results of the tests, we used the Tracer, a prototype tool for model-based trace visualization and exploration. The results of the two case studies show that aspects offer benefits over conventional techniques in the context of testing embedded software; these benefits are discussed in detail. Finally, towards the end of the paper, we also discuss the lessons learned, including the technological and other barriers to the future successful use of aspects in the testing of embedded software in industry.
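The studies use AspectC++ on embedded C++ code; purely as a conceptual stand-in, the core idea of a test aspect, weaving a non-functional check around a function without touching its body, can be mimicked with a Python decorator (the function and threshold are invented for illustration):

```python
import functools
import time

def timing_test_aspect(max_seconds):
    # Decorator analogue of a test aspect for a non-functional (timing)
    # property: the check is woven around the target function, whose own
    # code is left unmodified.
    def deco(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            elapsed = time.perf_counter() - start
            assert elapsed <= max_seconds, f"{fn.__name__} exceeded {max_seconds}s"
            return result
        return wrapper
    return deco

@timing_test_aspect(max_seconds=1.0)
def read_sensor():
    # Hypothetical function under test.
    return 42
```

As with real test aspects, the check can be removed for production builds simply by not applying the decorator.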

10.
The extent and complexity of environmental data management needs have increased significantly over the past several years. As environmental regulations increase, and compliance solutions are evaluated, the need to monitor and document performance becomes increasingly important. Regulatory complexities and volumes of monitoring data mandate the use of detailed procedures to assure compliance in accounting for wastes produced and disposed of. A commercially available comprehensive environmental data management system offers a solution, particularly in times when personnel availability for in-house software development is limited. The modular ECOTRAC™ system enables users to customize a system to their data management needs. Based on dBASE III™, the system offers the flexibility to meet specific needs without extensive programming or computer knowledge. Standard reports allow consistent and timely reporting to management and regulatory authorities. Case studies demonstrate efficiencies gained through use of commercially available environmental data management software for microcomputers. ECOTRAC™ software has proven useful to a variety of industry applications, and has been favorably received by independent technical reviewers.

11.
Financial and cost benefits are often put forward as the reasons why organisations decide to outsource. Emerging patterns and trends indicate that today's outsourcing decisions are often motivated by factors other than cost. Thus, the decision-making process is more complex than it may at first appear. This paper presents findings from a case study of an organisation in the UK banking sector that was motivated to outsource aspects of its information technology/information system (IT/IS). The underlying motives and decision-making process that influenced the bank to outsource its IT/IS are presented and discussed. Findings from the case study suggest that political perspectives, as well as human and organisational issues, influenced the bank's strategic decision to outsource certain aspects of its business. An examination of the case study findings suggests that cost alone is not always responsible for decisions to outsource, as it was found that the bank's outsourcing decision was driven by a series of complex, interrelated motives in a bid to reduce the risks and uncertainties of managing its own technology. Considering the complex nature of the outsourcing process, a frame of reference that can be used to assist managers with their decision to outsource IT/IS is proposed. The case study is used to present an organisation's experiences as to how and why it decided to outsource its IS and thus offers a learning opportunity for other organisations facing similar difficulties. In addition, the case study findings highlight the need to focus greater attention on discriminating between the short- and long-term consequences of IT/IS decision-making.

12.
In software product line research, product variants typically differ by their functionality, and quality attributes are not purposefully varied. The goal of this study is to examine purposeful performance variability in software product lines, in particular the motivation to vary performance and the strategy for realizing performance variability in the product line architecture. The research method was a theory-building case study augmented with a systematic literature review. The case was a mobile network base station product line with capacity variability. The data collection, analysis and theorizing were conducted in several stages: the initial case study results were augmented with accounts from the literature. We constructed three theoretical models to explain and characterize performance variability in software product lines; the models aim to be generalizable beyond the single case. The results describe capacity variability in a base station product line. Thereafter, theoretical models of performance variability in software product lines in general are proposed. Performance variability is motivated by customer needs and characteristics, by trade-offs and by varying operating environment constraints. Performance variability can be realized by hardware or software means; moreover, the software can either realize performance differences in an emergent way through impacts from other variability or by utilizing purposeful varying design tactics. The results point out two differences compared with the prevailing literature. Firstly, when customer needs and characteristics enable price differentiation, performance may be varied even with no trade-offs or production cost differences involved. Secondly, due to the dominance of feature modeling, the literature focuses on the impact-management realization; however, performance variability can also be realized through purposeful design tactics that downgrade the available software resources, and by using more efficient hardware.

13.
The main objective of this paper is to propose a set of indicators for the evaluation of Workflow software-type products within the context of Information Systems. The paper is mainly based on a comprehensive bibliographical review of topics relating to Workflow Technology and Information Systems. Next, sets of indicators are presented for the selection of Workflow software based on the realities of the business world, including a method of examination so as to obtain an integral evaluation of the Workflow software. Finally, the evaluation method is applied to two Workflow products, Lotus Domino/Notes® and Microsoft Exchange®, for the billing subsystems of a company called MANAPRO Consultants, Inc.®.

14.
《Information & Management》2001,38(3):185-199
Information system (IS) plans can vary in length and detail. One must, therefore, be able to tailor existing planning methodologies to produce the desired outputs. This article proposes a framework for a demand-centric adaptive IS planning process and applies it to a case study that demonstrates how to adapt the methodology to produce an IS plan for a small commercial bank. Following the output-driven adaptive approach, the project was completed on time with the expected quality. The project document provides the bank's management with guidelines for allocating their information resources to meet the current and future needs of the business.

15.
《Information Systems》2005,30(8):609-629
Although security is a crucial issue for information systems, traditionally it is considered only after the definition of the system. This approach often leads to problems, which most of the time translate into security vulnerabilities. From the viewpoint of the traditional security paradigm, it should be possible to eliminate such problems through better integration of security and software engineering. This paper first argues for the need to develop a methodology that considers security as an integral part of the whole system development process, and then contributes to the current state of the art by proposing an approach that treats security concerns as an integral part of the entire system development process and by relating this approach to existing work. The different stages of the approach are described with the aid of a real-life case study: a health and social care information system.

16.
The goal of the GUARDS project is to design and develop a generic fault-tolerant computer architecture that can be built from predefined standardised components. The architecture favours the use of commercial off-the-shelf (COTS) hardware and software components. However, the assessment and selection of COTS components is a non-trivial task as it requires balancing a myriad of requirements from end-users and the preliminary architecture design. In this paper, we present the requirements and assessment criteria for a specific COTS software component, the operating system kernel. As an interface specification constitutes a major compatibility criterion for the selection of COTS components in GUARDS, a particular emphasis is placed on operating system conformance to the POSIX 1003.1 standard. We discuss the general lessons learned from the assessment process and raise a number of questions relevant to the assessment of any COTS software component.

17.
Solving software evaluation problems is a particularly difficult software engineering process and many contradictory criteria must be considered to reach a decision. Nowadays, the way that decision support techniques are applied suffers from a number of severe problems, such as naive interpretation of sophisticated methods and generation of counter-intuitive, and therefore most probably erroneous, results. In this paper we identify some common flaws in decision support for software evaluations. Subsequently, we discuss an integrated solution through which significant improvement may be achieved, based on the Multiple Criteria Decision Aid methodology and the exploitation of packaged software evaluation expertise in the form of an intelligent system. Both common mistakes and the way they are overcome are explained through a real world example.
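One concrete instance of the "naive interpretation" problem the paper warns about: summing weighted raw scores measured on different scales lets the largest-scale criterion dominate. A minimal sketch of min-max normalization that avoids this (candidate names, prices, and weights are invented for illustration):

```python
def normalized_scores(candidates, weights, benefit, cost):
    # Min-max normalize each criterion to [0, 1] across candidates so no
    # criterion dominates merely because of its raw scale; cost criteria are
    # inverted so that higher always means better.
    crits = benefit + cost
    lo = {c: min(v[c] for v in candidates.values()) for c in crits}
    hi = {c: max(v[c] for v in candidates.values()) for c in crits}
    scores = {}
    for name, vals in candidates.items():
        total = 0.0
        for c in crits:
            span = (hi[c] - lo[c]) or 1
            x = (vals[c] - lo[c]) / span
            if c in cost:
                x = 1 - x
            total += weights[c] * x
        scores[name] = total
    return scores

# Illustrative candidates: without normalization, the raw price figures
# (hundreds) would swamp the 1-10 usability ratings in a weighted sum.
candidates = {
    "tool_a": {"usability": 9, "price": 1200},
    "tool_b": {"usability": 3, "price": 1500},
}
weights = {"usability": 0.8, "price": 0.2}
scores = normalized_scores(candidates, weights, ["usability"], ["price"])
```

This is only one of the flaws the paper discusses; a full Multiple Criteria Decision Aid treatment goes well beyond weighted sums.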

18.
《Knowledge》2007,20(7):683-693
The Lyee methodology allows the development of software by simply defining its requirements. More precisely, a developer has only to provide words, calculation formulae, calculation conditions and the layout of screens and printouts, and can then leave the entire troublesome programming process, i.e. the control-logic aspects, in the hands of the computer. The formalization of the Lyee methodology led to the definition of the Lyee-Calculus, a formal process algebra that easily and naturally supports the basic concepts of the Lyee methodology. Moreover, we provided an implementation of the constructs of the Lyee-Calculus in the Java language in order to concretely show the efficiency of this calculus and its suitability for the Lyee methodology. In other words, this Java implementation of the Lyee-Calculus provides a means of bridging the gap between Lyee requirement specifications and their implementations. In this paper, we present a new software development environment, LyeeBuilder, that allows applications to be generated automatically from specifications using a GUI interface. This software aims to give programmers an environment that allows them to automatically generate applications from screens and word definitions.

19.
《Software, IEEE》1996,13(6):23-31
Despite rapid changes in computing and software development, some fundamental ideas have remained constant. This article describes eight such concepts that together constitute a viable foundation for a software engineering discipline: abstraction, analysis and design methods and notations, user interface prototyping, modularity and architecture, software life cycle and process, reuse, metrics, and automated support.

20.