Ellen A. Isaacs
SunSoft, Inc.
2550 Garcia Ave.
Mountain View, CA 94043-1100
USA
ellen.isaacs@sun.com
+1 415 336 1167
For many people - engineers, tech writers, managers, and human interface professionals alike - interviewing customers is an unfamiliar and challenging task. Without training in the mechanics of interviewing, the experience can be uncomfortable, uninformative, or both. Even those who speak to customers frequently may find that they are not getting as much useful information as they would like. If they have not been trained in techniques to uncover customers' underlying and future needs, they are likely to return with a list of bugs and short-term feature enhancement requests, many of which do not fall within their area of interest. Worse yet, they can easily come away with information that misleads them about how to make their customers happy.
This tutorial is designed to teach people how to interview customers to learn about their underlying technical needs. It does so in three main ways: (a) by teaching people what information they should (and should not) try to collect through interviews, (b) by training them in the mechanics of interviewing, and (c) by showing them how to analyze the large volume of data they will collect during the interviews.
It is worth noting that this tutorial does not arise out of any specific CHI tradition for gathering user requirements (e.g. contextual inquiry, ethnomethodology, participatory design), but rather focuses on skills for gathering customer information that may be useful from any of those perspectives. To the extent that any of those techniques involve asking users questions, the skills taught in this tutorial would be useful.
Also note that the term "customer" is used broadly. The tutorial is equally relevant to researchers who are designing prototypes for users, to in-house developers who are designing applications for their employees, and to engineers who are designing products for the market.
The following sections describe each of these topics and discuss the means by which the information is taught in the tutorial.
The main point of this section is that interviews rely on people telling you about themselves, and people have significant limitations on the kinds of information they can accurately report about themselves. In particular, people are not good at predicting what they will like or will want. They are also not good at estimating how much they like a single option (as opposed to comparing two options) [Glass and Holyoak, 1986; Anderson, 1985]. Yet the most common questions asked by many interviewers are "What features would you like?" and "What do you think of this feature/product we're thinking of building?"
Rather than asking people to consider hypothetical scenarios, interviewers should ask people to tell them about their current practices: what they are currently doing, how they do it, what they are trying to accomplish, what problems they face, how they handle those problems, etc. From this information, the interviewer can learn not only about customers' current needs, but also about opportunities for satisfying future needs. They can learn about limitations the customer is accepting without noticing, which are especially good indicators of useful products and features.
This section includes numerous demonstrations and an exercise to make clear just how poorly people answer "what features would you like" questions and how much better they respond to "what are you currently doing" questions.
An exercise is conducted in which participants prepare questions for an interview, which they will conduct later.
Many good and bad examples are provided from recordings of real interviews. The participants are also given the opportunity to conduct an interview, using the questions they prepared earlier.
During this section, the class participates in analyzing the data they collected from each other.
© Copyright on this material is held by the authors.