Usability Testing of Community Data and Mapping Systems
Denice Warren, Chief Information Systems Designer, [email protected]
Joy Bonaguro, Web and Data Production Coordinator, [email protected]
Introduction

Web-based data and mapping systems can be powerful tools for the public to access and visualize complex information. However, the inherent complexity of these web systems can itself create a barrier between the public and the information they need.

Philosophy of user-centered design

A philosophy of user-centered design, coupled with a culture of continuous quality improvement through formal usability testing, can create a web system that meets the public where they are and capitalizes on people's existing strengths in finding and using information. A highly usable web site allows users to focus on the information within the system rather than struggling with how to use the system itself. Usability testing is a natural extension of the community engagement process, and having an easy-to-use site is an essential element of respecting the community audience. IBM has published a "User Bill of Rights" that summarizes the philosophy underlying true user-centered design (IBM); the full list of user rights is published at http://www-3.ibm.com/ibm/easy/eou_ext.nsf/Publish/12.
What is usability testing?

Usability is formally defined as the effective, efficient and satisfying completion of tasks by users (Lee, 1999, p. 38). In a community data and mapping system, task completion might include printing a thematic map of one's neighborhood, downloading data about crime rates within a city, or analyzing statistical phenomena within a specific community. Usability testing is a formal method of watching users interact with a system to complete a task. In such testing, a naïve "typical" user is given realistic tasks to complete on the web site. A variety of qualitative and quantitative data is gathered while the user navigates to complete the task. An analysis of the data from such testing then informs the iterative design of these web systems to better meet the needs of the audience.

Rationale for usability testing
Table 1: Clues to usability problems, with sample research questions and user tasks

| Clue to usability problem | Sample research question | Sample user task |
| --- | --- | --- |
| From the web site server statistics, the "most popular pages" viewed by visitors to the site do not include what you consider to be the most important pages in your site (such as the page that gives the technical definitions for the indicators you publish).* | Is the 'Definitions' link sufficiently visible? | How many blighted houses are in the Holy Cross neighborhood? (In order to answer this question, users must use the 'Definitions' link to learn the difference between blighted and vacant houses.) |
| Functionality that you spent a great deal of resources to create doesn't get used much (such as a feature that allows users to create custom neighborhood boundaries).*, ** | Is the 'Define your neighborhood boundaries' link visible? Once people get to the feature for defining their own neighborhood, is it clear what to do? Is the tool for selecting the custom neighborhood geography intuitive? Are there other barriers to using this feature? | Use the web site to create a map showing homeownership rates in the neighborhood your nonprofit serves. (To answer this question, users must use the 'Define your neighborhood boundaries' feature.) |
| People ask you how to find information that should be easy to find on the site (e.g., "Do y'all have data on teen births?"). | Is data on teen pregnancy in a predictable category? | What is the rate of teen pregnancies in St. Bernard Parish? (To answer this, users must click on the category that contains the teen pregnancy data.) |
| The site seems to generate the same questions in people's heads, and those questions aren't answered by the site (e.g., "What Census tract am I in?"). | Do the maps provide enough geographic detail to help users choose their area(s) of interest? | Download the data profile for the Census tract in which you live. (This task requires users to find their Census tract on the map.) |
| Server statistics show that one of the most common exit pages is a key navigation page (such as a required registration page). | What's happening when people land on the registration page? Is usability a barrier? Is the form too long? Are users suspicious of our intent? | Any user task that requires users to register would also answer this question. (You don't want to directly task people with registering because that lends false motivation; instead you want to see what happens when registering is a means to another end.) |
| User characteristics (domains, browser versions, operating systems, etc.) that show up in the server statistics don't match what you would expect for your target audience. | How do our pages look on the computers of our users? Do our pages download quickly enough? Do the pages print well on their printers? | These questions are answered a little bit by every task. (In order to be complete, conduct some basic "web site calisthenics" on the user's computer.) |
| Design decisions that you debated during the production of the site, or that you have a 'weird feeling' about. | Will people expect to find voter registration information under the "community participation" category? | How many registered voters are there in St. Tammany parish compared to the state as a whole? (This puts the search term 'voters' in the user's head and will test whether they expect to find 'voters' in 'community participation.') |
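Several of the clues in the table above come from web server statistics. If your hosting package does not already produce a log-analysis report, even a small script can surface the "most popular pages" and "most common exit pages" clues. The sketch below is a minimal illustration, not the Data Center's own method; it assumes Apache-style access logs, and the file name, 30-minute session window, and GET-only filter are illustrative assumptions.

```python
# Minimal sketch: derive two usability "clues" from a raw access log --
# most-viewed pages and most common exit pages. Assumes Apache-style log
# lines; the file name and 30-minute session timeout are illustrative.
import re
from collections import Counter
from datetime import datetime, timedelta

LOG_LINE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "GET (\S+) HTTP/[\d.]+"')
SESSION_GAP = timedelta(minutes=30)  # illustrative session timeout

def requests(path="access.log"):
    """Yield (client_ip, timestamp, page) for each GET request in the log."""
    with open(path) as f:
        for line in f:
            m = LOG_LINE.match(line)
            if m:
                ip, ts, page = m.groups()
                yield ip, datetime.strptime(ts.split()[0], "%d/%b/%Y:%H:%M:%S"), page

page_views = Counter()   # clue: "most popular pages"
exit_pages = Counter()   # clue: "most common exit pages"
last_hit = {}            # ip -> (timestamp, page) of that visitor's latest request

for ip, ts, page in sorted(requests(), key=lambda r: r[1]):
    page_views[page] += 1
    previous = last_hit.get(ip)
    if previous and ts - previous[0] > SESSION_GAP:
        exit_pages[previous[1]] += 1   # the earlier visit ended on that page
    last_hit[ip] = (ts, page)

for ts, page in last_hit.values():     # close out each visitor's final visit
    exit_pages[page] += 1

print("Most viewed pages:", page_views.most_common(10))
print("Most common exit pages:", exit_pages.most_common(10))
```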
1. Craft research questions and user tasks

The user tasks are printed and taped onto index cards. These cards help minimize cueing by the interviewer and provide some degree of standardization across testing sessions.
2. Recruit users to test the site, and then conduct the tests
This composite test protocol has several major elements; more detailed testing protocols, sample scripts, and release forms are available for download at www.gnocdc.org/usability/.
3. Analyze the results of testing and make design changes accordingly
Usability testing will generate a combination of qualitative and quantitative data. You'll want to craft the analysis so it answers the initial research questions and leads to recommendations for redesign. For each research question, consider the data gathered for all of the user tasks addressing that question. From the quantitative data generated by user tracking, you can determine the number of clicks it took to complete the task starting from the home page. (This number can be compared to the minimum number of clicks required to complete the task; a greater number of clicks in the user testing may indicate a lack of efficiency in the design.) When you combine that with notes taken from the user thinking aloud as they attempted to complete the task, you may gain insight into where they encountered trouble and whether they were frustrated by specific design features.
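To make the click-count comparison concrete, here is a minimal sketch of that part of the analysis, assuming you recorded the number of clicks each participant needed per task (from screen recordings or an observer's tally). The task names, click counts, and the 1.5x flagging threshold are illustrative assumptions, not real test data.

```python
# Minimal sketch: compare observed clicks per task against the minimum
# required from the home page. All task names and numbers are illustrative.
from statistics import mean

# Observed clicks for each participant, keyed by task.
observed = {
    "Find teen pregnancy rate, St. Bernard Parish": [6, 9, 7],
    "Download Census tract data profile":           [4, 4, 5],
}
# Minimum clicks needed from the home page, from a walkthrough of the site.
minimum = {
    "Find teen pregnancy rate, St. Bernard Parish": 3,
    "Download Census tract data profile":           4,
}

for task, clicks in observed.items():
    avg = mean(clicks)
    ratio = avg / minimum[task]
    # The 1.5x threshold for flagging a task is an assumption for illustration.
    flag = "  <- review think-aloud notes" if ratio > 1.5 else ""
    print(f"{task}: avg {avg:.1f} clicks vs. {minimum[task]} minimum "
          f"(ratio {ratio:.1f}){flag}")
```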
Table 2: Sample analysis of results and design recommendation

| Research question | User task | Analysis of results | Design recommendation |
| --- | --- | --- | --- |
| Will people expect to find voter registration information under the "community participation" category? | How many registered voters are there in St. Tammany parish compared to the state as a whole? | Three users took an average of 7 clicks to find the target data (compared to a minimum of 3 clicks from the home page). All users went to the page with "People & Households" data rather than "Community Participation" data. One user commented, upon finding the data on the "Community Participation" page, "Hmm, I was expecting to find information on neighborhood watch groups and church activities here, not voting." | Create a link especially for "Voting," since nobody expected "Voting" data to be in the vaguely titled category of "Community Participation." |
After redesigning the site based on feedback from usability testing, you'll want to test the site again, using the same tasks but with new, naïve users matched as closely as possible to the first cohort. This way you can determine whether your design changes were indeed improvements. Also, compiling results that show a positive effect from usability testing helps justify its continued role in your organization (and budget).
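A simple before-and-after comparison of the same measure across the two cohorts can show whether a change helped. The sketch below assumes the voter-registration task from Table 2 was re-run after the redesign; all numbers are illustrative placeholders, not real results.

```python
# Minimal sketch: did the redesign reduce clicks for the voter-data task?
# Numbers are illustrative placeholders, not real test results.
from statistics import mean

before = [7, 8, 6]   # clicks per participant, first cohort, original design
after = [3, 4, 3]    # clicks per participant, new naive cohort, redesigned site
minimum = 3          # minimum clicks possible from the home page

print(f"Before redesign: {mean(before):.1f} average clicks (minimum {minimum})")
print(f"After redesign:  {mean(after):.1f} average clicks (minimum {minimum})")
print(f"Improvement: {mean(before) - mean(after):.1f} fewer clicks on average")
```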
Usability testing is an essential technique in the continuous improvement of community data and mapping systems. As described in this document, users benefit because they can access the information they need more efficiently and with less frustration. The organization responsible for the web site benefits from added credibility. As a further benefit, the web site development team gains a greater understanding of the audience, making future development efforts more efficient and better targeted, and allowing the team to make design decisions more confidently, with the right information in front of them. Users involved in usability testing often turn out to be enthusiastic advocates of your site and can accelerate word-of-mouth marketing to reach new visitors.
Return on investment (ROI), the metric typically applied to commercial web sites to judge the worth of a usability initiative, is difficult to measure for a community web site that is a public good. However, the cost of not conducting usability testing is unarguably too high: users are likely to feel disrespected, to distrust your site, and not to come back.
References

Bernard, Michael L., 2001. "Criteria for Optimal Web Design: How can I reduce the major user annoyances on my site?" Software Usability Research Laboratory, Wichita State University. Retrieved 9 June 2003: http://psychology.wichita.edu/optimalweb/annoyances.htm
Cartwright, W., Crampton, J., Gartner, G., Miller, S., Mitchell, K., Siekierska, E., and Wood, J., 2001. "Geospatial Information Visualization User Interface Issues." Cartography and Geographic Information Society, Vol. 28, No. 1, January 2001. Retrieved 9 June 2003: http://www.geovista.psu.edu/sites/icavis/agenda/PDF/Cartwright.pdf
Fogg, B.J., Marshall, J., Laraki, O., Osipovich, A., Varma, C., Fang, N., Paul, J., Rangnekar, A., Shon, J., Swani, P., and Treinen, M., 2001. "What Makes Web Sites Credible? A Report on a Large Quantitative Study." Persuasive Technology Lab, Stanford University. Available at www.webcredibility.org
Fogg, B.J., Kameda, T., Boyd, J., Marshall, J., Sethi, R., Sockol, M., and Trowbridge, T., 2002. "Stanford-Makovsky Web Credibility Study 2002: Investigating what makes Web sites credible today." A Research Report by the Stanford Persuasive Technology Lab & Makovsky & Company, Stanford University. Available at www.webcredibility.org
Haklay, M., and Tobon, C., 2002. "Usability Engineering and PPGIS: Towards a Learning-improving Cycle." Presented at the 1st Annual Public Participation GIS Conference, Rutgers University, New Brunswick, New Jersey, 21st-23rd July. Available at: http://www.casa.ucl.ac.uk/muki/pdf/Haklay-Tobon-URISA-PPGIS.pdf
IBM. "User rights: The customer is always right." IBM Corporation. Retrieved 9 June 2003: http://www-3.ibm.com/ibm/easy/eou_ext.nsf/Publish/12
Lee, Alfred T., 1999. "Web Usability: A Review of the Research," in The SIGCHI Bulletin, Vol. 31, No. 1, January 1999. Edited by Ayman Mukerji. Minneapolis, MN. pp. 38-40. Retrieved 9 June 2003: http://www.acm.org/sigchi/bulletin/1999.1/lee.pdf
Microsoft Corporation, 2000. "UI Guidelines vs. Usability Testing." MSDN Library. Retrieved 9 June 2003: http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnwui/html/uiguide.asp
Nielsen, Jacob, 1997. "The Use and Misuse of Focus Groups." Alertbox. Retrieved 9 June 2003: http://www.useit.com/papers/focusgroups.html
Nielsen, Jacob, 2003. "Employee Directory Search: Resolving Conflicting Usability Guidelines." Alertbox. Retrieved 9 June 2003: http://www.useit.com/alertbox/20030224.html
Selvidge, Paula, 1999. "How Long is Too Long to Wait for a Website to Load?" Usability News, Vol. 1, Issue 2. Retrieved 9 June 2003: http://psychology.wichita.edu/surl/usabilitynews/1s/time_delay.htm
Selvidge, Paula, 2003. "Examining Tolerance for Online Delays." Usability News, Vol. 5, Issue 1. Retrieved 9 June 2003: http://psychology.wichita.edu/surl/usabilitynews/51/delaytime.htm
Slocum, Terry A., Blok, C., Jiang, B., Koussoulakou, A., Montello, D.R., Fuhrmann, S., and Hedley, N.R., 2001. "Cognitive and Usability Issues in Geovisualization." Cartography and Geographic Information Society, Vol. 28, No. 1, January 2001. Retrieved 9 June 2003: http://www.geovista.psu.edu/sites/icavis/agenda/PDF/SlocumLong.pdf
Usability.gov. "Methods for Designing Usable Web Sites." Retrieved 9 June 2003: http://www.usability.gov/methods/data_collection.html