Computing Reviews
Automated Oracle Comparators for Testing Web Applications
Sprenkle S., Pollock L., Esquivel H., Hazelwood B., Ecott S. In Proceedings of the 18th IEEE International Symposium on Software Reliability Engineering (ISSRE '07), Nov. 5-9, 2007, 117-126. Type: Proceedings
Date Reviewed: Jun 11 2008

Web applications can be tested automatically by looking for changes in Hypertext Markup Language (HTML) output, using a previous working version of the application as the oracle. Many properties of HTML code are best ignored, however, when looking for changes indicative of failure. For example, the absence of an explicit closing tag may have no practical consequence. So what makes for the best oracle comparator? The authors sought to answer this question by constructing a suite of 22 automated oracle comparators and performing two experiments on four applications. The first experiment analyzed the effect of nondeterministic and real-time application behavior, gauging the false positives generated by, for example, a change of date. The second experiment involved seeding faults to assess precision and recall in failure detection.
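To make the idea concrete, the following is a minimal sketch, in Python, of the kind of normalizing comparator the paper studies. It is not the authors' implementation; the particular normalizations (lowercased tags, sorted attributes, masked dates, ignored closing tags) are illustrative assumptions only.

import re
from html.parser import HTMLParser

class CanonicalHTML(HTMLParser):
    # Reduce a page to a canonical token list, so inconsequential
    # differences between oracle and new output do not count as failures.
    def __init__(self):
        super().__init__()
        self.tokens = []

    def handle_starttag(self, tag, attrs):
        # Lowercase tag names and sort attributes for a stable form.
        self.tokens.append((tag.lower(), tuple(sorted(attrs, key=lambda kv: kv[0]))))

    # End tags are deliberately not handled, so a missing explicit
    # closing tag cannot by itself signal a failure.

    def handle_data(self, data):
        text = data.strip()
        if text:
            # Mask date-like strings (assumed pattern), a typical source of
            # false positives from nondeterministic/real-time behavior.
            self.tokens.append(re.sub(r"\d{4}-\d{2}-\d{2}", "<DATE>", text))

def differs(oracle_html, new_html):
    # Flag a suspected failure only when the canonical forms differ.
    def canon(html):
        parser = CanonicalHTML()
        parser.feed(html)
        return parser.tokens
    return canon(oracle_html) != canon(new_html)

A stricter comparator would keep more of the structure and content (catching more failures, but also more false positives); a laxer one would discard more (the reverse), which is what induces the partial ordering among comparators discussed next.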

The results of the first experiment were as expected: Figure 4 clearly shows how false positives follow the partial ordering of the oracles, based on the HTML structure and content each considers. The second experiment found that, across all four applications, the Forms-Select comparator performs best, though this is not easy to discern from Figure 5 alone. Different comparators are recommended depending on whether false negatives or false positives are to be minimized.
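For readers unfamiliar with the metrics, this short sketch shows one plausible way precision and recall could be computed in a seeded-fault experiment; the counting scheme is an assumption for illustration, not a description of the authors' procedure.

def precision_recall(reported, seeded):
    # reported: set of pages the comparator flagged as failing
    # seeded: set of pages whose seeded faults should produce failures
    true_positives = len(reported & seeded)
    # By convention here, an empty denominator yields a perfect score.
    precision = true_positives / len(reported) if reported else 1.0
    recall = true_positives / len(seeded) if seeded else 1.0
    return precision, recall

A comparator tuned to minimize false positives favors precision; one tuned to minimize false negatives favors recall.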

The value of this approach to Web application testing in a practical context is, however, far from obvious. New or changing user requirements typically lead to changes in the HTML an application produces, so the false-positive rate is likely to be unacceptably high. Nevertheless, this paper is recommended to the Web engineering community.

Reviewer: Andy Brooks | Review #: CR135703 (0905-0470)
Testing Tools (D.2.5 ... )
 
Other reviews under "Testing Tools":

Automatic generation of random self-checking test cases
Bird D., Munoz C. IBM Systems Journal 22(3): 229-245, 1983. Type: Article. Reviewed: Aug 1 1985

Program testing by specification mutation
Budd T., Gopal A. Information Systems 10(1): 63-73, 1985. Type: Article. Reviewed: Feb 1 1986

SEES--a software testing environment support system
Roussopoulos N., Yeh R. (ed) IEEE Transactions on Software Engineering SE-11(4): 355-366, 1985. Type: Article. Reviewed: Apr 1 1986

more...
