Computing Reviews
An empirical validation of software cost estimation models
Kemerer C. Communications of the ACM 30(5): 416-429, 1987. Type: Article
Date Reviewed: Mar 1 1989

Four software cost-estimation models (SLIM, COCOMO, Function Points, and ESTIMACS) were evaluated. The models' estimates of cost, in man-months (MM) of effort, were compared to actual effort data for 15 completed business data processing projects.

Two results were found: (1) models not originally developed in a business data processing environment require calibration for such an environment (which, as the author acknowledges, is to be expected); (2) the Function Points effort-estimation model is validated by the data in this study.

Papers that describe and compare software cost-estimation models can be very helpful to readers who want to know if such models might be effective in their organizations (see, for example, [1] for a related report). However, the paper being reviewed is disappointing with respect to its primary thrust: how well do the four estimation models perform against actual cost data for the 15 projects? Consider the performance of the first model discussed, SLIM. Here the values of key SLIM model parameters were determined (in accordance with the SLIM method) by the answers to 22 questions put to the user. On the first project, SLIM estimated a 3,858-MM effort, while the actual effort was 287 MM, giving an error of 1,244 percent. The author proceeded to apply the model, using the same initial parameter values, to the remaining 14 projects, with the result that the effort on each of the 15 projects was overestimated by an average of 772 percent.
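The error figures quoted above follow the standard percentage-error calculation against actual effort; a minimal sketch, using the first-project numbers reported in the review:

```python
def pct_error(estimate: float, actual: float) -> float:
    """Estimation error as a percentage of actual effort."""
    return 100.0 * (estimate - actual) / actual

# SLIM on the first project: 3,858 MM estimated vs. 287 MM actual
print(round(pct_error(3858, 287)))  # 1244
```

Note that this metric is unbounded above for overestimates but bottoms out at -100 percent for an estimate of zero, which is why a model that always predicted 0 MM would, as observed below, compare favorably with errors in the 600-1,200 percent range.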

Three versions (basic, intermediate, and detailed) of the second model, COCOMO, were applied to the same 15 projects. The result was that effort was again overestimated in all 45 cases, by an average of 601 percent. (A model that estimates 0 MM in all cases, and therefore has only 100 percent error, would have fared well in this evaluation.) A third model, Function Points, produced negative man-month estimates for two of the projects.

My disappointment lies in the unrealistic use of the models. Admittedly, the models did overestimate the actual effort. But the 15 projects are highly coherent, and intentionally so. They involve business data processing applications, mostly in COBOL, developed by a single company within the span of a few years. As the author notes, these circumstances encourage a high level of consistency in staff quality and in methodology used. So, if the models overestimate one project and there is no learning effect or adjustment of parameters by the user, it is understandable if the overestimation continues, as it did, 60 out of 60 times for the first two models. It does not seem useful to repeat essentially the same test 60 times and say the average overestimate is 600–700 percent.

The 15 projects are not described as being ordered chronologically, but the results would have had some measure of realism if the analysis had considered them to be in order. Then, after the first project was overestimated by 1,244 percent, or certainly after the first few huge overestimates, the model parameters could have been adjusted before the next project was estimated. The author did not take advantage of having data for 15 projects. He should have applied the models to each project in turn, with opportunities for parameter adjustment along the way, in order to reflect the way cost-estimation models are used by organizations. These models are tools that require a contribution from the user in an ongoing process of refinement and sequential decision making.
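The sequential recalibration being called for can be sketched as a simple feedback loop; this is an illustrative sketch of the idea, not a procedure from the paper, and the effort figures below are hypothetical:

```python
def recalibrated_errors(raw_estimates, actuals):
    """Rescale a model's raw output by a multiplicative correction
    factor updated after each completed project (illustrative only)."""
    scale = 1.0
    errors = []
    for est, act in zip(raw_estimates, actuals):
        adjusted = scale * est
        errors.append(100.0 * (adjusted - act) / act)
        # fold the latest actual-to-estimate ratio into the correction
        scale *= act / adjusted
    return errors

# A hypothetical model that overestimates by a constant factor of 8
# is corrected after the very first project:
actuals = [287, 300, 250, 310]
raw = [8 * a for a in actuals]
print([round(e) for e in recalibrated_errors(raw, actuals)])  # [700, 0, 0, 0]
```

Under this regime, a systematic bias such as the 60-for-60 overestimation reported above would be largely removed after the first few projects, which is the reviewer's point about refinement in an ongoing process.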

Reviewer: W. W. Agresti (Review #CR125431)

[1] A descriptive evaluation of automated software cost-estimation models (IDA Paper P-1979, October 1986). Defense Technical Information Center, Defense Logistics Agency, Alexandria, VA.
Classifications: Heuristic Methods (I.2.8); Cost Estimation (D.2.9); Management Techniques (K.6.1); Performance Measures (D.2.8); Software Development (K.6.3); Management (D.2.9)