Assignment 1: Heuristic Usability Evaluation
Assignment
In this assignment, students will use relatively simple heuristic guidelines to conduct a usability evaluation of computer-based systems, including ubiquitous or embedded computer systems where the user may not be aware of the presence of a computing device. The heuristic evaluation is accompanied by a simple experiment in which a small number of users perform the same task as in the heuristic evaluation. It is an individual or pair assignment. You can coordinate this assignment with your teammates (e.g., by using a similar evaluation plan), but the write-up and submission must be done as individuals or pairs of students.
Goals and Objectives
The goal of this assignment is to allow students to become familiar with some of the advantages and limitations of heuristic usability evaluations, while also gaining experience in conducting simple, informal usability evaluation experiments.
Description
Your task in this assignment is to perform a heuristic usability evaluation of an existing system. This evaluation will rely on Nielsen’s heuristic usability evaluation framework, or something similar. Such “quick and dirty” evaluations are used to identify the most problematic issues of a design early on and with low overhead.
Choice of System to be Evaluated
Students can select their own system or device to be evaluated. However, it should satisfy the following requirements:
- the application, system, or device should be related to the topic your team selected as class project
- there must be a meaningful set of interactions between the user and the system; for example, a digital set of dice where the only meaningful interaction is to press a button for a new pair of values is not sufficient
- the selected product must be available for evaluation, testing and demonstration purposes
Tasks or Scenarios
For your evaluation, you need to define a specific task or scenario performed by a user with the system. The task should be complex enough for users to encounter difficulties, but not overly challenging or lengthy. You should try to formulate a task that takes about 5 minutes to perform and that a typical user has a good chance of completing. You are expected to produce a document, the evaluation plan, to be presented to participants in an experiment and to be used as a guideline for your own heuristic evaluation. Use the following points as guidelines when writing the evaluation plan:
- System Identification: What system are you evaluating? What is its main purpose, and what are the target users?
- Tasks and Activities: What do you want the evaluation participants to do with the system? Be sure to identify at least three to five primary ones, and as many additional ones as you feel are appropriate. Even if you're doing something like a heuristic or expert evaluation (where there are no actual participants), you still need to identify the tasks and activities.
- Metrics: Try to develop measurable criteria. If you feel that subjective impressions are critical, provide guidance to the user on how to express their impressions (e.g. on some scale such as from 0-5, or by providing a set of terms to choose from).
- Evaluation form: Create an evaluation form from your metrics that the participants will use during the actual heuristic evaluation of the system. Even though your own team members may be the participants, try to formulate the questions and instructions in such a way that the intended group(s) will be able to understand them. Consider the information you would like to get from the participants, but avoid biased ("leading") questions or statements. You can use an online questionnaire tool such as Google Forms or SurveyMonkey, but be aware of their limitations (such as limits on the number of submissions, or the need for computer and internet access).
- Assessment: Based on the individual suggestions and recommendations, put together a common set of recommendations supported by the whole team. (Note: This should not simply be a "cut and paste" from the individual recommendations. You also need to consider potential conflicts in recommendations, and determine how to deal with such conflicts.)
Heuristic Evaluation
Based on the evaluation plan, use Nielsen’s heuristic usability evaluation framework, or something similar, to examine the interaction between a user and the device. For heuristic evaluations, the assumption is that the evaluator is familiar both with the expected behavior of the user (the “user model”), and with the problem area for the task (the “domain model”). In practice, evaluators are also often familiar with the system design and the functionality of the device. There are two common ways of conducting such an evaluation: One is task-oriented, where you perform the task according to your evaluation plan, and take notes about actual and potential problems you or a typical user may encounter. The other is to take Nielsen’s framework as a checklist, and examine the interaction between user and device according to the issues specified in the framework.
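Whichever of the two approaches you use, it helps to record each problem together with the heuristic it violates and a severity rating (Nielsen commonly uses a 0–4 scale). The sketch below, with entirely hypothetical findings, shows one way to tally such notes so that the most severe problems surface first; it is an illustrative aid, not a required part of the method.

```python
from collections import defaultdict

# Hypothetical findings from a heuristic evaluation:
# (heuristic violated, severity 0-4, short note)
findings = [
    ("Visibility of system status", 3, "No feedback after pressing 'sync'"),
    ("Error prevention", 2, "Delete has no confirmation dialog"),
    ("Visibility of system status", 1, "Progress bar stalls at 99%"),
    ("Consistency and standards", 4, "'Back' sometimes discards input"),
]

# Group the findings by heuristic.
by_heuristic = defaultdict(list)
for heuristic, severity, note in findings:
    by_heuristic[heuristic].append((severity, note))

# Report heuristics ordered by their worst (highest) severity,
# and issues within each heuristic from most to least severe.
for heuristic, issues in sorted(by_heuristic.items(),
                                key=lambda kv: -max(s for s, _ in kv[1])):
    worst = max(s for s, _ in issues)
    print(f"{heuristic} (worst severity {worst}):")
    for severity, note in sorted(issues, reverse=True):
        print(f"  [{severity}] {note}")
```

A tally like this makes it easy to compare notes with a partner and to prioritize the recommendations in your assessment.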
Usability Experiment
Using the same evaluation plan, conduct an experiment with at least five users other than yourself. Ideally the users should be representative of the “typical” users of the system or device. If it is impractical to find such participants for the experiment, you can recruit classmates, roommates, friends, relatives, or students in public areas.
Before you actually conduct the experiment, you need to inform the participants about the purpose and tasks of the experiment. It is common practice to use "informed consent" forms that participants sign at the beginning. You can find examples on Cal Poly’s Human Subjects Committee Web page, in particular the “Sample Informed Consent Form.” As long as such experiments are conducted as regular class activities, they do not require approval by the Human Subjects Committee. If you are planning to publish this work in any form (including Master’s theses or senior projects submitted to the library), Cal Poly’s policies require approval of the experiment by the committee.
During the experiment, you should observe the participant, and take notes of interesting events, such as difficulties encountered, frustration and annoyance, or surprise. For complicated experiments it is advisable to have an observer and a facilitator; the role of the latter is to assist the participant if necessary.
After the experiment, take some time to write down observations about the behavior of the participant and properties of the interaction between participant and device. You can also ask the participants questions about their experience, or give them a brief survey.
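If your metrics include quantitative data such as task completion, time on task, or a 0–5 satisfaction rating, a few summary numbers make the results easier to discuss in your assessment. A minimal sketch, using hypothetical per-participant results:

```python
# Hypothetical per-participant results:
# (task completed?, minutes on task, satisfaction rating 0-5)
results = [
    (True, 4.5, 4),
    (True, 6.0, 3),
    (False, 8.0, 1),
    (True, 5.2, 4),
    (True, 4.8, 5),
]

# Simple descriptive statistics across participants.
completion_rate = sum(done for done, _, _ in results) / len(results)
mean_time = sum(t for _, t, _ in results) / len(results)
mean_satisfaction = sum(s for _, _, s in results) / len(results)

print(f"Completion rate: {completion_rate:.0%}")          # 80%
print(f"Mean time on task: {mean_time:.1f} min")          # 5.7 min
print(f"Mean satisfaction (0-5): {mean_satisfaction:.1f}")  # 3.4
```

With only five participants these numbers are indicative at best, so report them alongside your qualitative observations rather than in place of them.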
Background Material
There is an abundance of material available on this topic:
- a good start is Nielsen’s Web page with his writings on heuristic evaluation
- the Heuristic evaluation Wikipedia entry
- the heuristic evaluation entry in the Usability Body of Knowledge
- Chapters 24, 25 and 33 of Human-Computer Interaction by Preece et al.
- Readings on Guidelines and Principles:
- Ben Shneiderman's Designing the User Interface
- Chapter 1 - Guidelines documents bibliography
- Chapter 2 - Principles, Golden Rules and Guidelines
- http://www.usability.gov/guidelines/
- Rocket Surgery Made Easy - The Do-It-Yourself Guide to Finding and Fixing Usability Problems. Steve Krug / New Riders / 2009 / 168 pages. The Web site contains a few sample documents, and a demo video of a usability test.
- Website Usability Evaluation:
- Finding Usability Bugs with Automated Tests by Julian Hartley, ACM Queue vol 9, no 1.
- Heuristic Evaluation:
- examples of good/bad interfaces:
Deliverables
You need to submit the following items:
- an evaluation plan for the heuristic evaluation and the experiment
- description of the insights gained from the heuristic evaluation
- discussion of the practical aspects of the experiment (e.g., environmental factors such as noise; modifications of the tasks or procedures used; time required)
- an evaluation and assessment of the results, including the heuristic evaluation and the experiment
- recommendations to improve the usability of the system or device
- supporting material, in particular an informed consent form
Submission and Deadline
Most likely, we will use either PolyLearn or a Wiki repository for this assignment; details will be discussed in class.
The deadline for this assignment is the end of the day (midnight) as listed in the schedule.
Grading Criteria
- Usability method selected (difficulty, appropriateness)
- Data used for the evaluation
- Evaluation of the data
- Assessment
- Presentation of the results
- Recommendations
- Supporting material (questionnaire, consent form)