Usability testing made simple

Developed by Dana Chisnell, Founder, UsabilityWorks

Frame a simple but not-too-broad question you want to answer about how the design works for people.
Find a customer or a prospective user.
Prepare an existing design, product, or prototype for the person to interact with.
Set up a place that provides an appropriate context and environment for the interaction.
Open your mind. Watch and listen without judgment.

Benefits of usability testing

Informs design solutions
Creates satisfying (and even delightful) products.
Uncovers and reduces design problems and frustrations.
Creates a historical record of usability benchmarks for future releases.
Helps development teams that employ usability methods get to market faster.

Puts the customer at the center of the process
Increases customer satisfaction.
Creates products that are useful and easy to use.
Makes features more likely to be popular among users.

Improves profitability
Reduces development costs over the life of a product.
Increases sales and the probability of repeat sales.
Minimizes risk and complaints.

Usability testing process

Develop test plan

Review testing goals.
Communicate research questions.
Summarize participant characteristics.
Describe the method.
List the tasks.
Describe the test environment, equipment, and logistics.
Explain the moderator's role.
List the data you will collect.
Describe how the results will be reported.

Set up environment

Decide on location and space.
Gather and check equipment, artifacts, and tools.
Identify co-researchers, assistants, and observers.
Determine documentation techniques.

Find + select participants

Characterize users.
Define behavior- and motivation-based selection criteria for each user group.
Determine the number of participants to test.
Screen and select participants.
Schedule and confirm participants.

Prepare test materials

Develop a script for the moderator.
Develop task scenarios for participants to perform.
Develop a background questionnaire to collect demographic data.
Develop pre-test questionnaires and interviews.
Develop a post-test questionnaire about the experience.

Conduct test sessions

Moderate the session impartially.
Probe and interact with the participant as appropriate.
Don’t “rescue” participants when they struggle.
Have participants fill out pre-test questionnaires.
Have participants fill out post-test questionnaires.
Debrief participants.
Debrief observers.

Analyze data + observations

Summarize performance data (see the example sketch after this list).
Summarize preference data.
Summarize scores by group or version.
Identify what causes errors and frustrations.
Conduct a source-of-error analysis.
Prioritize problems.
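
Performance summaries are usually simple tabulations. As an illustration only (not part of the original handout), here is a minimal Python sketch that computes task completion rate and median time on task; the participants, tasks, and timings shown are hypothetical placeholders.

```python
# Minimal sketch: summarize hypothetical usability-test performance data.
from statistics import median

# Each record: (participant, task, completed?, seconds to finish or give up)
observations = [
    ("P1", "create account", True, 142),
    ("P2", "create account", False, 305),
    ("P3", "create account", True, 98),
    ("P1", "find pricing", True, 61),
    ("P2", "find pricing", True, 74),
    ("P3", "find pricing", False, 210),
]

for task in sorted({task for _, task, _, _ in observations}):
    rows = [r for r in observations if r[1] == task]
    completed = [r for r in rows if r[2]]
    rate = len(completed) / len(rows)              # task completion rate
    times = [r[3] for r in completed]
    med = median(times) if times else None         # median time on task
    print(f"{task}: {rate:.0%} completion, median time {med}s")
```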

Report findings + recommendations

Focus on solutions that will have the widest impact.
Provide short- and long-term recommendations.
Take business and technology constraints into account.
Indicate areas where further research is required.
Create a highlights video.
Present findings.