Advances in the understanding of how people use computers have led to major innovations in software development. Brewster describes three experiments that show how structured nonverbal audio messages called “earcons” can provide navigational cues in a nonverbal user interface. Earcons are abstract musical tones that use repetition, variation, and contrast of qualities such as timbre, register, intensity, pitch, and rhythm in structured combinations to create sound messages. Experiment 1 used a file-system hierarchy of 25 nodes on three levels. Participants correctly recalled 81.5 percent of the 14 earcons they heard, showing the viability of an aural interface. Experiment 2 was designed to generalize these results by addressing the influence of sound quality, method of training, and the passage of time on the percentage of earcons recalled. Brewster reports that participants could still recall the earcons after a week, but that the quality of sound and the type of training influenced how well they recalled them. Experiment 3 used compound rather than hierarchical earcons to represent the structure from Experiment 1. The earcons were designed so that participants needed to remember no more than seven rules. The new design improved recall from 81.5 percent to 97 percent. This type of earcon design can represent arbitrarily sized hierarchies, and participants do not need to be retrained for each new structure’s size and shape.
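The design idea behind such earcons, that sibling nodes in a hierarchy share inherited sound qualities and differ in only one, can be sketched informally. The following is a minimal illustration under assumed mappings (the function names, sound-quality labels, and one-quality-per-level scheme below are illustrative assumptions, not Brewster’s actual parameters):

```python
# Illustrative sketch of parameterized earcon construction: a node's
# earcon is derived from its path in the hierarchy, varying one sound
# quality per level (timbre, then rhythm, then pitch). All names and
# quality values here are assumptions for illustration only.

from dataclasses import dataclass


@dataclass(frozen=True)
class Earcon:
    timbre: str  # instrument family, distinguishes level-1 branches
    rhythm: str  # rhythmic pattern, distinguishes level-2 branches
    pitch: str   # register, distinguishes level-3 branches


def earcon_for(path, timbres, rhythms, pitches):
    """Build the earcon for a node from its path of branch indices,
    e.g. (0, 1, 2) = first level-1 branch, second level-2 branch,
    third level-3 branch. Shallow nodes fall back to defaults."""
    timbre = timbres[path[0]] if len(path) >= 1 else "neutral"
    rhythm = rhythms[path[1]] if len(path) >= 2 else "steady"
    pitch = pitches[path[2]] if len(path) >= 3 else "mid"
    return Earcon(timbre, rhythm, pitch)


TIMBRES = ["organ", "brass", "strings"]
RHYTHMS = ["long-short", "short-short-long", "triplet"]
PITCHES = ["low", "mid", "high"]

# Two siblings under the same level-2 node: they share timbre and
# rhythm (inherited from their ancestors) and differ only in pitch.
a = earcon_for((0, 1, 0), TIMBRES, RHYTHMS, PITCHES)
b = earcon_for((0, 1, 2), TIMBRES, RHYTHMS, PITCHES)
```

Because each earcon is generated from a node’s path by a small set of rules rather than memorized individually, the same scheme scales to hierarchies of arbitrary size, which is the property the summary attributes to the Experiment 3 design.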
This paper provides interface designers with valuable information. It will be particularly useful to those responsible for designing complex displays that must clearly present large amounts of changing data in ways compatible with users’ information needs. In addition to the developers of telephone-based interfaces mentioned by Brewster, this work should be of particular interest to designers of multimodal computing environments.