Computing Reviews
DeepFL: integrating multiple fault diagnosis dimensions for deep fault localization
Li X., Li W., Zhang Y., Zhang L.  ISSTA 2019 (Proceedings of the 28th ACM SIGSOFT International Symposium on Software Testing and Analysis, Beijing, China, Jul 15-19, 2019), 169-180, 2019. Type: Proceedings
Date Reviewed: Nov 21 2019

Testing is a very important activity in software development. With testing comes the need to find the origin of the defects it detects; the same need arises when failures occur in production. Locating the origin of such problems is known as fault localization.

This paper is an empirical study of the effectiveness and efficiency of 13 fault localization techniques belonging to seven families, including spectrum-based fault localization, mutation-based fault localization, and dynamic program slicing. The number of techniques per family ranges from one to three. The techniques are applied to five software packages containing real faults. According to the authors, this is the first study to involve such a large number of fault localization families and techniques. The paper includes a very clear section describing the techniques analyzed.
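To give a flavor of the spectrum-based family (this is a generic sketch, not the paper's implementation), the widely used Ochiai formula scores each statement by how strongly its coverage correlates with failing tests. The Python example below uses invented coverage and test data purely for illustration:

    import math

    def ochiai(coverage, failing, total_failing):
        # coverage: statement -> set of tests that execute it (illustrative data)
        scores = {}
        for stmt, tests in coverage.items():
            ef = len(tests & failing)      # failing tests that execute stmt
            ep = len(tests - failing)      # passing tests that execute stmt
            denom = math.sqrt(total_failing * (ef + ep))
            scores[stmt] = ef / denom if denom else 0.0
        # Rank statements from most to least suspicious.
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    coverage = {"s1": {"t1", "t2"}, "s2": {"t2", "t3"}, "s3": {"t3"}}
    failing = {"t2"}
    print(ochiai(coverage, failing, total_failing=len(failing)))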

In the study, the authors measure both localization effectiveness and efficiency. The main focus is the statement level, but the authors also assess performance at the method level, as in other studies. They study the techniques both individually and in combination; to combine them, they use learning-to-rank, a machine learning approach. The paper discusses the main limitations of using this approach.
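To make the combination idea concrete (this is an illustrative stand-in, not the authors' model or feature set), each statement can be described by one feature per technique, and a learned model then ranks statements by predicted fault probability. The tiny Python sketch below uses a logistic-regression ranker and made-up scores:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Rows are statements; columns are suspiciousness scores from, e.g.,
    # spectrum-, mutation-, and slicing-based techniques (values are made up).
    X_train = np.array([[0.9, 0.7, 1.0],
                        [0.2, 0.1, 0.0],
                        [0.8, 0.9, 1.0],
                        [0.1, 0.3, 0.0]])
    y_train = np.array([1, 0, 1, 0])   # 1 = statement was the actual fault

    ranker = LogisticRegression().fit(X_train, y_train)

    # Rank the statements of a new faulty version by predicted fault probability.
    X_new = np.array([[0.6, 0.2, 1.0], [0.3, 0.8, 0.0]])
    order = np.argsort(-ranker.predict_proba(X_new)[:, 1])
    print("inspect statements in this order:", order)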

The study answers six research questions, resulting in several findings. I want to highlight two: 1) combining the techniques is advantageous over using them individually, at both the method and statement levels; and 2) the techniques from the spectrum-based fault localization family are the most effective. From the results, the authors conclude that techniques that use the same information perform similarly; it is therefore more promising to find new information sources than to optimize existing ones. Finally, the time needed to apply the various techniques varied, with mutation-based techniques taking the most time.

The authors provide the study’s infrastructure, which allows new techniques to be used individually or in combination. What I found somewhat strange was the absence of a section on threats to validity. Additionally, not all measurements are statistically tested, and the authors do not explain why. Table 9 is not thoroughly explained.

Some of the techniques have existed for decades, but their use has been very limited. This area will see major development in the next few years. If you have any interest in this area, take a look at this paper.

Reviewer: Alberto Sampaio    Review #: CR146789 (2005-0111)
Testing And Debugging (D.2.5)
Verification (B.1.4 ...)
General (D.0)
Other reviews under "Testing And Debugging":
Software defect removal
Dunn R., McGraw-Hill, Inc., New York, NY, 1984. Type: Book (9789780070183131). Date: Mar 1 1985
On the optimum checkpoint selection problem
Toueg S., Babaoglu O. SIAM Journal on Computing 13(3): 630-649, 1984. Type: Article. Date: Mar 1 1985
Software testing management
Royer T., Prentice-Hall, Inc., Upper Saddle River, NJ, 1993. Type: Book (9780135329870). Date: Mar 1 1994
