Choropleth maps present geo-referenced data visually: predefined regions are colored to show how a variable differs from region to region. They help users explore data collections for problem solving and decision making (for example, US Census data). Currently, visually impaired users rely on screen readers, which speak the geographic region names linearly and present the data as table records. The problem is that such a linear textual presentation makes it difficult for these users to locate specific data and to understand data patterns in their geographic context.
The research reported on in this paper proposes using sonifications that synchronize visual and auditory presentations by applying two design guidelines: first, conform to the auditory information-seeking principles (AISP), and second, impose minimal requirements for special software and hardware. The proposed AISP consists of gist (a short auditory message presenting the overall trend or pattern of a data collection), navigating (the user flies through the data collection, selecting and listening to portions of it), filtering (filtering out unwanted data items), and details on demand (selecting an item for further detail, perhaps via a spoken presentation).
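To make the gist idea concrete, the sketch below shows one plausible way such a system might map region values to pitches for a quick auditory overview: values are scaled linearly onto a MIDI note range, so higher values sound higher. This is a hypothetical illustration, not the paper's actual implementation; the function names and the choice of a linear pitch mapping are assumptions.

```python
# Hypothetical sketch of an AISP-style "gist": sweep through regions in a
# fixed order, mapping each region's data value to a MIDI pitch.
# (Illustrative only; not the implementation described in the paper.)

def value_to_midi(value, vmin, vmax, low_note=60, high_note=84):
    """Linearly scale a data value onto a MIDI note range (C4..C6)."""
    if vmax == vmin:  # avoid division by zero for constant data
        return low_note
    fraction = (value - vmin) / (vmax - vmin)
    return round(low_note + fraction * (high_note - low_note))

def gist_sweep(region_values):
    """Return (region, midi_note) pairs for a left-to-right auditory sweep."""
    vmin = min(region_values.values())
    vmax = max(region_values.values())
    return [(region, value_to_midi(v, vmin, vmax))
            for region, v in region_values.items()]

# Example with made-up population densities for three states:
notes = gist_sweep({"MD": 594.8, "VA": 202.6, "WV": 77.1})
```

Playing the resulting notes in rapid sequence would convey the overall pattern (which regions are high or low) in a second or two, without requiring the user to step through a table row by row.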
The developed sonification system was tested in a pilot user study and then in a formal experiment; finally, it was tested informally with two visually impaired individuals (the pilot study used nine sighted users, and the formal experiment used 48 sighted subjects). Two navigational interfaces, controlled via keyboard, were tested: state-by-state column and cell-by-cell mosaic. Findings from the pilot study informed the system design, the formal experiment, and the proposed future work.