Computing Reviews
Automated usability testing using HUI Analyzer
Baker S., Au F., Dobbie G., Warren I. ASWEC 2008 (Proceedings of the 19th Australian Conference on Software Engineering, Mar 26-28, 2008): 579-588, 2008. Type: Proceedings
Date Reviewed: Jun 25 2009

The HUI Analyzer, based on Microsoft’s .NET Compact Framework for handheld devices, is a proof-of-concept tool for automated usability testing. Details of a user interaction, called an actual action sequence, are gathered unobtrusively by a recorder component. A desktop component provides the investigator with the means to specify expected action sequences. The tool can perform comparison checking of expected and actual action sequences. Assertions can be checked concerning forms (for example, the percentage of free space), action sequences (for example, the amount of resizing), and the results of comparison checking (for example, the maximum deviation allowed between an actual and an expected action sequence). The tool also supports hotspot analyses of actual action sequences to reveal how heavily various graphical user interface (GUI) components are used.
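To make the comparison checking and assertion ideas concrete, here is a minimal Python sketch (all names are hypothetical; this is not the HUI Analyzer’s API, which is .NET based). It treats an action sequence as a list of (component, event) pairs, measures the deviation between expected and actual sequences as an edit distance, asserts a maximum allowed deviation, and tallies component usage in the spirit of a hotspot analysis. Edit distance is chosen purely for illustration and is not necessarily the measure the HUI Analyzer itself uses.

from collections import Counter

def edit_distance(expected, actual):
    # Levenshtein distance between two action sequences.
    m, n = len(expected), len(actual)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if expected[i - 1] == actual[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[m][n]

def assert_max_deviation(expected, actual, max_deviation):
    # Fail if the recorded sequence strays too far from the expected one.
    deviation = edit_distance(expected, actual)
    assert deviation <= max_deviation, (
        "deviation %d exceeds allowed %d" % (deviation, max_deviation))

def hotspots(actual):
    # Count how often each GUI component occurs in the recorded actions.
    return Counter(component for component, _event in actual)

# Each action is a (component, event) pair recorded during a session.
expected = [("txtName", "focus"), ("txtName", "type"), ("btnSave", "click")]
actual = [("txtName", "focus"), ("txtName", "type"),
          ("btnHelp", "click"), ("btnSave", "click")]

assert_max_deviation(expected, actual, max_deviation=2)
print(hotspots(actual))  # Counter({'txtName': 2, 'btnHelp': 1, 'btnSave': 1})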

The evaluation study compares working with the HUI Analyzer to conventional formal user testing. Table 5 shows the usability issues revealed by both approaches, and Baker et al. conclude that the HUI Analyzer approach compares favorably. They also claim that working with the HUI Analyzer takes much less time.

The design and reporting of the evaluation study, however, are weak. Sample sizes of eight are more typical of pilot studies. Subjects’ details are missing. The actual time expended to create expected action sequences and assertions is not stated; there is no effort model.

Despite the shortcomings of the evaluation study, the ideas behind the HUI Analyzer are sound and deserving of a wide audience. I strongly recommend this paper to the software testing community.

Reviewer: Andy Brooks
Review #: CR137012 (1010-1025)
Testing Tools (D.2.5 ...)
Metrics (D.2.8)
User Interfaces (H.5.2)
Other reviews under "Testing Tools":
Automatic generation of random self-checking test cases
Bird D., Munoz C. IBM Systems Journal 22(3): 229-245, 1983. Type: Article (reviewed Aug 1 1985)
Program testing by specification mutation
Budd T., Gopal A. Information Systems 10(1): 63-73, 1985. Type: Article (reviewed Feb 1 1986)
SEES--a software testing environment support system
Roussopoulos N., Yeh R. (ed) IEEE Transactions on Software Engineering SE-11(4): 355-366, 1985. Type: Article (reviewed Apr 1 1986)
