Computing Reviews
Static analysis tools as early indicators of pre-release defect density
Nagappan N., Ball T. Proceedings of the 27th International Conference on Software Engineering (St. Louis, MO, USA, May 15-21, 2005), 580-586, 2005. Type: Proceedings
Date Reviewed: Feb 10 2006

Static analysis tools have been used at Microsoft for six years to detect pre-release defects. More than 12 percent of the pre-release defects fixed in Windows Server 2003 were found with the PREfix and PREfast static analysis tools. This paper uses historical data to determine how well defects found by static analysis can predict pre-release defect density, as measured by the defects found by all other pre-release methods.

Data were analyzed at the component level for the 199 components of Windows Server 2003 (22 million lines of code). Using the technique of data splitting, random samples of 132 components were used to build regression models whose predictive ability was then assessed on the remaining 67 components. Figure 3 shows how the estimated defect density tracks the actual defect density for three random samples. A discriminant analysis is reported to correctly identify 165 of the 199 components (82.91 percent) as fault prone or not fault prone.
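
As a rough illustration of the data-splitting approach described above, the following sketch fits a regression of actual pre-release defect density on static-analysis defect density for 132 components, assesses it on the remaining 67, and runs a discriminant analysis to label components as fault prone or not. It uses Python with scikit-learn; the file name, column names, and the fault-prone threshold are assumptions made for illustration, not details taken from the paper.

import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

# Hypothetical per-component data: density of defects found by PREfix/PREfast
# versus density of defects found by all other pre-release methods.
df = pd.read_csv("components.csv")  # columns: static_density, actual_density

# Data splitting: 132 of the 199 components build the model; 67 assess it.
train, test = train_test_split(df, train_size=132, random_state=0)

# Regression: predict pre-release defect density from static-analysis defect density.
reg = LinearRegression().fit(train[["static_density"]], train["actual_density"])
print("R^2 on held-out components:",
      reg.score(test[["static_density"]], test["actual_density"]))

# Discriminant analysis: classify components as fault prone or not fault prone.
# The paper's threshold is not stated here; the median density is used for illustration.
labels = (df["actual_density"] > df["actual_density"].median()).astype(int)
lda = LinearDiscriminantAnalysis().fit(df[["static_density"]], labels)
print("Components correctly classified: {:.2%}".format(lda.score(df[["static_density"]], labels)))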

One omission is the failure to report the false positive rates for PREfix and PREfast; there is an indication that some false positives may have been entered into the defect database. While Figure 3 demonstrates that the predictions track the actual defect density in general, at least three components with much higher defect densities are not tracked by the regression models. Why were these particular components so much worse? Could other techniques, for example, software metric approaches, have predicted that these components were very fault prone? We do not know. This paper is recommended to those working in software quality assurance.

Reviewer: Andy Brooks | Review #: CR132417 (0612-1258)
Symbolic Execution (D.2.5 ...)
Process Metrics (D.2.8 ...)
Software Quality Assurance (SQA) (D.2.9 ...)
Testing Tools (D.2.5 ...)
Testing And Debugging (D.2.5)
Other reviews under "Symbolic Execution":
Applications of symbolic evaluation
Clarke L., Richardson D. Journal of Systems and Software 5(1): 15-35, 1985. Type: Article. Date reviewed: Nov 1 1985
Symbolic evaluation as a basis for integrated validation
Ploedereder E. Software validation: inspection-testing-verification-alternatives (Darmstadt, West Germany), 185, 1984. Type: Proceedings. Date reviewed: Dec 1 1985
Prototyping symbolic execution engines for interpreted languages
Bucur S., Kinder J., Candea G. ASPLOS 2014 (Proceedings of the 19th International Conference on Architectural Support for Programming Languages and Operating Systems, Salt Lake City, UT, Mar 1-5, 2014), 239-254, 2014. Type: Proceedings. Date reviewed: Apr 9 2014
