Usability Testing of Community Data and Mapping Systems
Usability testing is the only way to ensure that a web site designed
for the public is truly usable. Devoting resources to such testing pays
off by making a web site more efficient, effective, and credible, and
leaves a trail of satisfied users who come back to the site and recommend
it to their colleagues. This document summarizes research behind why usability
testing is important, especially in web-based GIS, and concludes with
a basic protocol for applying usability testing to community data and
mapping systems. Read this paper (1.5 MB).
A PowerPoint presentation of this work was given at the PPGIS Conference,
July 2003. (Note: The presentation provides an overview; the paper
provides more detail and rationale.)
Documents and templates for testing
Sample Test Plan
This is the big usability testing planning document we used. It includes
all of our research questions, with references to which user tasks corresponded
to each question. It also covers criteria we had for participants, the
plan for recruiting, techniques to be used during testing, a checklist
for items to bring to the test, and an overview of the testing protocol.
We used this sheet as a guide for the initial phone screening of testers.
Screening is used to get testers within the target audience and then,
in the second round of testing, to roughly match testers with those in
the first round.
Script for beginning Usability Testing
This is the script that our primary interviewer used to begin the testing
session once we arrived on site.
Release Form
A release form to be signed by the usability tester. It gives us permission
to use data from the testing session, but assures the tester that
we won't use the data for other purposes. Also included is a clause that
allows us to temporarily track pages visited from their computer, and
a clause about compensation. (Note: You'd want to print this on your
organization's letterhead.)
Questionnaire
These questions expand on information gathered during the initial screening.
Questions were chosen to help us understand the test results better. For
example, if someone lives in a particular neighborhood, then they might
be able to navigate the maps in that area better than someone from another
neighborhood.
While the interviewer is going through the Script, Release Form, and Questionnaire,
the notetaker conducts a series of calisthenics on the user's computer
to see how well the web site displays and prints on the user's system,
as well as getting basic information about the system specs. At this point,
the computer is also logged in to the cookie-based tracking system so
that clicks through the web site can be analyzed later. (The tracking
system is described in this document as well.)
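As a rough illustration of how such a cookie-based tracking system might work (this is a minimal sketch with class and method names of my own invention, not the system actually used), each test computer is issued a session id, stored as a cookie, and every page view is logged against that id for later analysis:

```python
import time
import uuid
from collections import defaultdict

class ClickTracker:
    """Illustrative cookie-based click tracker (hypothetical design).

    Each test computer is issued a session id, which the web site would
    store in a cookie; every page request is then logged against that id
    so the tester's path through the site can be reconstructed later.
    """

    def __init__(self):
        # session_id -> ordered list of (timestamp, path) page views
        self._log = defaultdict(list)

    def start_session(self):
        """Issue a new session id to be set as the tracking cookie."""
        return uuid.uuid4().hex

    def record(self, session_id, path):
        """Log one page view for the given session."""
        self._log[session_id].append((time.time(), path))

    def clickstream(self, session_id):
        """Return the ordered list of pages a session visited."""
        return [path for _, path in self._log[session_id]]
```

After a test session, `clickstream()` yields the tester's route through the site, which can be compared against the route the designers expected for each task.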
Usability Testing Tasks
Both the interviewer and notetaker have copies of this document on a clipboard.
Each user task is printed (one per page) with notes for the interviewer
about possible prompts and space for the notetaker (and interviewer) to
take notes about user behavior on each task. The interviewer also arrives
at the test with a set of index cards, each bearing a single task cut
from the printed sheet and taped to the card.
This document restates the research questions in the Sample Test Plan
and then answers each question, referring to the quantitative click-tracking
data gathered as well as the notes taken as users were thinking aloud.
The report spans two cohorts of usability testing, and describes the site
changes that took place after the first cohort of testers. The second
round of testing was identical to the first, and confirmed the effectiveness
of the initial design changes. Plus, it resulted in new design improvements
and research questions. (Note: This report was written for in-house
use, so it contains a lot of organizational jargon. A report to be given
to an outside web designer, or to a board, would need to be much more
concrete and complete. Writing the report can be the most time-consuming
part of the testing process.)
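To give one hedged illustration of how the quantitative click-tracking data might be summarized (the row format and function name here are assumptions for the sketch, not the report's actual method), a small aggregation can turn raw click rows into per-task click counts that are easy to compare across the two cohorts:

```python
from collections import Counter, defaultdict

def clicks_per_task(rows):
    """Aggregate raw click rows into click counts per session and task.

    Each row is an assumed (session_id, task_id, path) tuple. A task that
    took far more clicks than its designed-for path suggests a usability
    problem worth pairing with the think-aloud notes for that tester.
    """
    counts = defaultdict(Counter)
    for session_id, task_id, _path in rows:
        counts[session_id][task_id] += 1
    return {session: dict(tasks) for session, tasks in counts.items()}
```

Comparing these counts before and after the design changes is one simple way to check whether the changes actually shortened testers' paths.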
Relevant usability articles on the web
Office Politics out of Design
A set of practical tips that includes using results from usability testing
to de-politicize the web design process.
Test Users, Test Hypotheses
"When testing websites or applications, I've found that generating
hypotheses about user behavior helps inform the observation process, structure
data collection and analysis, and organize findings. It also keeps you
honest by being explicit about what you are looking for."
"Myth of the Stupid User" articles
Myth of the Stupid User
"The 'myth of the stupid users' is common within large sections of the software
and web development communities. This might seem a relatively harmless
belief, but it has a significant effect on the quality of Internet sites,
on the extent to which sites can be used by their intended audience, and
on the extent to which sites meet business and marketing requirements."
What Can We Learn From Stories About Stupid Users?
"Is the user
really a stupid person who is completely ignorant of the things in the
world? No, not really, but at least he or she is seemingly regarded as
such. The stupid user is a very popular myth, an illusion which is often
used to defend stupid programmers or system designers. It is very handy
to be able to put the blame on a person who can be pointed to as acting
funny, knowing nothing, and who cannot even complain on problems…"
"This is broken" web site
A new project to
make businesses more aware of their customer experience, and how to fix it.
Usability web sites on the web
Sponsored by the National Cancer Institute, this website explains why usability
is important, reviews different usability methods, provides guidelines
and checklists, and catalogues other online resources. It also covers
accessibility issues and summarizes Internet and search engine research.
This website publishes
recent articles in usability and tracks conferences and events in the field.
A collection of book descriptions on a range of topics from accessibility
to return on investment (ROI) in the field of usability, organized by topic.
Usability Resources contains bibliographies, web references, files to
download and other material relating to usability activities.
Software Usability Research Laboratory's Newsletter
The SURL team specializes in software/website user interface design research,
human-computer interaction research, and usability testing and research.
They publish the results through their online newsletter.
Optimal Web Design
This section of SURL focuses on specific articles about designing and
building usable web sites.
References from "Usability Testing of Community Data and Mapping Systems"
L, 2001. "Criteria for Optimal Web Design: How can I reduce the major
user annoyances on my site?" Software Usability Research Laboratory,
Wichita State University. Retrieved 9 June 2003: http://psychology.wichita.edu/optimalweb/annoyances.htm
Crampton, J., Gartner, G., Miller, S., Mitchell, K., Siekierska, E., and
Wood, J., 2001. "Geospatial Information Visualization User Interface
Issues." Cartography and Geographic Information Science, Vol. 28,
No. 1, January 2001. Retrieved 9 June 2003:
Fogg, B.J., Marshall,
J., Laraki, O., Osipovich, A., Varma, C., Fang, N., Paul, J., Rangnekar,
A., Shon, J., Swani, P., and Treinen, M., 2001. "What Makes Web Sites
Credible? A Report on a Large Quantitative Study." Persuasive Technology
Lab. Stanford University. Available at www.webcredibility.org
Fogg, B.J., Kameda,
T., Boyd, J., Marshall, J., Sethi, R., Sockol, M., and Trowbridge, T.
(2002). "Stanford-Makovsky Web Credibility Study 2002: Investigating
what makes Web sites credible today." A Research Report by the Stanford
Persuasive Technology Lab & Makovsky & Company. Stanford University.
Available at www.webcredibility.org
Haklay, M., and
Tobon, C., 2002, Usability Engineering and PPGIS: Towards a Learning-improving
Cycle, presented at the 1st Annual Public Participation GIS Conference,
Rutgers University, New Brunswick, New Jersey, 21st-23rd July.
"User rights: The customer is always right." IBM Corporation. Retrieved
9 June 2003: http://www-3.ibm.com/ibm/easy/eou_ext.nsf/Publish/12
Lee, Alfred T.,
1999. "Web Usability: A Review of the Research," in The SIGCHI
Bulletin. Vol. 31, No.1, January 1999. Edited by Ayman Mukerji. Minneapolis,
MN. Pp 38-40. Retrieved 9 June 2003: http://www.acm.org/sigchi/bulletin/1999.1/lee.pdf
2000. "UI Guidelines vs. Usability Testing." MSDN Library. Retrieved
9 June 2003: http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnwui/html/uiguide.asp
Nielsen, J., 1997. "The Use and Misuse of Focus Groups." Alertbox. Retrieved
9 June 2003: http://www.useit.com/papers/focusgroups.html
Nielsen, J., 2003. "Employee Directory Search: Resolving Conflicting Usability
Guidelines." Alertbox. Retrieved 9 June 2003: http://www.useit.com/alertbox/20030224.html
1999. "How Long is Too Long to Wait for a Website to Load?"
Usability News Vol. 1, Issue 2. Retrieved 9 June 2003: http://psychology.wichita.edu/surl/usabilitynews/1s/time_delay.htm
2003. "Examining Tolerance for Online Delays." Usability News
Vol. 5, Issue 1. Retrieved 9 June 2003: http://psychology.wichita.edu/surl/usabilitynews/51/delaytime.htm
Slocum, T.A., Blok, C., Jiang, B., Koussoulakou, A., Montello, D.R., Fuhrmann, S.,
and Hedley, N.R., 2001. "Cognitive and Usability Issues in Geovisualization."
Cartography and Geographic Information Science, Vol. 28, No. 1, January
2001. Retrieved 9 June 2003: http://www.geovista.psu.edu/sites/icavis/agenda/PDF/SlocumLong.pdf
"Methods for Designing Usable Web Sites." Retrieved 9 June 2003:
Greater New Orleans Community Data Center
© 2000-2002 Greater New Orleans Community Data Center. All Rights Reserved.
April 3, 2006