CPE/CSC 484
User-Centered Interface Design and Development
Winter 2010
| Status | Draft |
| Points | 10 |
| Deadline | Tuesday, March 3, 2010 |
Assignment 4: Usability Evaluation
Assignment
This assignment consists of usability evaluations of Android mobile phone applications developed in Dr. Janzen's Android class. It is to be performed in teams of about 3-5 people, preferably the same teams as in previous assignments.
Goals and Objectives
- The goal of this assignment is to give students practical experience with the design, planning, conduct, and aftermath of a usability evaluation for an existing system.
Description
Your task in this assignment is to perform a usability evaluation of an existing system. Due to time and resource constraints, your evaluation will most likely be of the "quick and dirty" type, trying to identify the most problematic issues with low overhead.
This quarter, we are collaborating with Dr. Janzen's Android class, and will evaluate applications developed by small teams in his class. There are about 15 applications to be evaluated, so each team in our class will evaluate three Android apps.
The Android teams have brief descriptions of their apps available on the course Web site at https://sites.google.com/site/androidhowto/app-ads.
They will complete an alpha version of their apps on 2/26 (see https://sites.google.com/site/androidappcourse/assignments/alpha).
Allocation of Evaluation Teams to Android Projects
On Thu, Feb 25, we will determine which teams will evaluate which Android apps. Please have a look at the available ones beforehand, and identify your favorites. If possible, avoid applications that were developed by other members of your 484 team.
Use the following points as guidelines when writing the evaluation plan:
- System Identification: What system are you evaluating? What is its main purpose, and what are the target users?
- Tasks and Activities: What do you want the evaluation participants to do with the system? Identify three to five primary tasks, and as many additional ones as you feel are appropriate. Even if you're doing something like an expert evaluation (where there are no actual participants), you still need to identify the tasks and activities.
- Metrics: Try to develop measurable criteria. If you feel that subjective impressions are critical, provide guidance to the user on how to express their impressions (e.g., on a scale such as 0 to 5, or by providing a set of terms to choose from).
- Evaluation form: Create an evaluation form from your metrics
that the participants will use during the actual heuristic evaluation
of the system. Even though your own team members may be the participants,
try to formulate the questions and instructions in such a way that the intended group(s)
will be able to understand them.
Consider the information you would like to get from the participants,
but avoid biased ("leading") questions or statements.
You can use an online questionnaire tool such as Google Forms or SurveyMonkey, but be aware of their limitations (e.g., limits on the number of submissions, or participants' computer and Internet access).
- Assessment: Based on the individual suggestions and recommendations, put together a common set of recommendations supported by the whole team. (Note: This should not simply be a "cut and paste" from the individual recommendations. You also need to consider potential conflicts in recommendations, and determine how to deal with such conflicts.)
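One simple way to move from individual ratings to a team assessment is to tabulate the scores and look at how much evaluators disagree. The sketch below is only an illustration, not part of the assignment requirements: the task names and scores are hypothetical, and it assumes the 0-5 scale suggested under Metrics.

```python
from statistics import mean, stdev

# Hypothetical ratings on the 0-5 scale suggested above:
# one list per task, one score per evaluator.
ratings = {
    "Task 1: create an entry": [4, 5, 4],
    "Task 2: search":          [2, 1, 4],
    "Task 3: share results":   [3, 3, 2],
}

def summarize(scores):
    """Return (mean, standard deviation) for one task's scores."""
    return mean(scores), stdev(scores) if len(scores) > 1 else 0.0

for task, scores in ratings.items():
    avg, spread = summarize(scores)
    # A large spread means the evaluators disagree -- discuss those
    # tasks before merging individual recommendations into the
    # team's common set, rather than just averaging them away.
    flag = "  <-- discuss" if spread > 1.0 else ""
    print(f"{task}: mean={avg:.1f}, stdev={spread:.2f}{flag}")
```

Tasks with a high standard deviation are exactly the "conflicts in recommendations" mentioned above; the numbers tell you where the team discussion should focus.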
Background Material
There is an abundance of material available on this topic; a list of references
is given below. You can also refer to the material used by other students and teams in previous quarters,
such as Ngan Phan and Hoang Bao's tablet PC questionnaires, the Internet2 high-bandwidth file transfer
done by Gigi Choy and Rachelle Hom, or Melissa Toy's usability evaluation
for the Amgen Web site from Winter 2005. The material can be found on
the 484 Blackboard discussion forums under the W06 and W05 quarters.
- "Interaction Design" Textbook Chapters 10-14
- "Resonant Interface" preprint Chapter 8
- Chapter 8 of User-Centered Website Development by McCracken & Wolfe
- Chapters 24, 25 and 33 of Human-Computer Interaction by Preece et al.
- Readings on Guidelines and Principles:
- Ben Shneiderman's Designing the User Interface
- Chapter 1 - Guidelines documents bibliography
- Chapter 2 - Principles, Golden Rules and Guidelines
- http://www.usability.gov/guidelines/
- Website Usability Evaluation:
- Heuristic Evaluation:
- examples of good/bad interfaces:
Submission and Deadline
Please post your evaluations on the TRAC Wiki. Create an overview page for your team's evaluation materials,
and add a link to the table with the teams and projects on the Wiki.
Your entry page should have a meaningful title ("Team 4 Evaluation" is not sufficient), a brief description of the topic and your approach,
and a list of team members.
You can post the materials as one large document or as several separate documents;
use the guidelines above for the overall structure.
If the structure for your materials is different, it might be useful to explain the structure you're using.
The deadline is Tuesday, March 3, end of the day (midnight).
Grading Criteria
- Usability method selected (difficulty, appropriateness)
- Supporting material (questionnaire, consent form)
- Data used for the evaluation
- Evaluation of the data
- Assessment
- Presentation of the results
- Recommendations