
13 Reviewing, Evaluating, and Testing Documents and Websites

Understanding Reviewing, Evaluating, and Testing 341

Reviewing Documents and Websites 342
  Revising 342
  Editing 343
    Guidelines: Editing the Draft 343
  Proofreading 345

Conducting Usability Evaluations 345

Conducting Usability Tests 348
  The Basic Principles of Usability Testing 349
  Preparing for a Usability Test 349
  Conducting a Usability Test 351
    Ethics Note: Understanding the Ethics of Informed Consent 352
  Interpreting and Reporting the Data from a Usability Test 353
    Document Analysis Activity: Obtaining Informed Consent 354

Writer's Checklist 355

Exercises 356

Case 13: Revising a Document for a New Audience 356


THIS CHAPTER FOCUSES ON techniques for improving the usability of documents and websites. In technical communication, usability refers to how easily a person can use a document, site, or software program to carry out a task. In other words, usability measures how successfully a document achieves its purpose and meets its audience's needs. More specifically, usability refers to five factors related to a person's use of the item (Nielsen, 2012):

• Ease of learning: the time it takes a person to learn to use the item

• Efficiency of use: the time it takes a person to carry out a task after learning how to do it

• Memorability: a person's ability to remember how to carry out a task

• Error frequency, severity, and recovery: the number and severity of errors a person makes in carrying out a task, and the ease with which a person recovers from these errors

• Subjective satisfaction: how much a person likes (or dislikes) carrying out the task

Understanding Reviewing, Evaluating, and Testing

As a writer, you can improve the usability of documents and websites by reviewing, evaluating, and testing them.

• Reviewing refers to three techniques (revising, editing, and proofreading) for studying and changing your draft in order to make it easier to use. You have used these techniques in this writing course and in previous courses.

• Evaluating refers to having other people help you by reading the draft and communicating with you about its strengths and weaknesses. You probably have had people help you evaluate some of your drafts in the past.

• Testing refers to formal techniques of observing people and analyzing their actions as they try to use your draft to carry out tasks. You likely have not used testing before.

Figure 13.1 (on page 343) shows the relationships among reviewing, evaluating, and testing.

How do you know whether you should go straight from reviewing to publication or whether you need to have the draft evaluated and perhaps tested? Typically, you consider three factors:

• Importance. If a document or site is important, evaluate and test as much as you can. For instance, an annual report is so important that you want to do everything you can to make it perfect. Your company's website also is crucial. You keep evaluating and testing it even after it is launched. A routine memo describing a workaround for a technical problem is not as important. Review it yourself, and then send it out.


• Time. Almost every document has a deadline, and almost every deadline comes too quickly. If the document is even moderately important and you have the hours, days, or weeks to evaluate and test it, do so.

• Money. It costs money to evaluate and test drafts, including employee time and fees for test participants. If there is no good reason to spend the money, don't.

Reviewing Documents and Websites

Reviewing a document or website is the process of studying and changing a draft to make it easier to use. Reviewing a document consists of three tasks: revising, editing, and proofreading. In carrying out these tasks, you will likely work from larger issues to smaller issues. You will first review the document as a whole (for scope, organization, and development), saving the smaller issues (such as sentence-level concerns) for later. That way, you don't waste time on awkward paragraphs or sentences that you might eventually throw out.


FIGURE 13.1 Relationships Among Reviewing, Evaluating, and Testing

[Flowchart: (a) the writer reviews the draft; (b) the writer and others evaluate the draft; (c) the testing team tests the draft; (d) the writer (or others) publish the document or the site; (e) the published document or site returns for further review.]

The solid lines represent the publication process. At point (a), the writer reviews the draft and then decides either to publish it as is (d) or to have it evaluated (b). If the draft is evaluated (b), it is next either published (d) or tested (c). After the draft is tested, it is published (d). The broken lines represent instances in which the draft might be sent back for further work. At point (e), a published document or website might be reviewed (a) and revised, partially or completely, to make it more usable.


REVISING

Revising is the process of looking again at your draft to see if your initial assumptions about your audience, purpose, and subject still pertain, and then making any necessary changes. These changes can range from minor, such as adding one or two minor topics, to major, such as adding whole new sections and deleting others.

For more about audience and purpose, see Ch. 5. For more about revising, see Ch. 3, pp. 52–54.

For example, imagine you are revising a set of instructions to help new sales associates at your company understand how to return unsold merchandise to the supplier for credit. Since you started working on the instructions last month, your company has instituted a new policy: sales associates must write statements to management analyzing the costs and benefits of returning the unsold merchandise versus discounting it heavily and trying to sell it. Now you need to do some additional research to be sure you understand the new policy, gather or create some examples of the kinds of statements sales associates will be expected to submit, write new instructions, and integrate them into your draft. You thought you were almost done, but you aren’t. It happens.

EDITING

Having revised the draft, you think it is in good shape. It meets the needs of its readers, it fulfills your purpose or purposes, and it covers the subject effectively, presenting the right information. Now it's time for editing: going a little deeper into the draft.



GUIDELINES: Editing the Draft

After you finish your draft, look through it to make sure the writing is clear and effective. Start with the big picture by answering these four questions:

• Is the design effective? Documents and sites should look professional and attractive, and they should be easy to navigate. Will your readers find it easy to locate the information they want? For more on design, see Chapter 11.

• Does your draft meet your readers' expectations? If, for instance, the readers of a report expect a transmittal letter, they might be distracted if they don't see one. Check to make sure that your draft includes the information they expect and looks the way they expect. Be especially careful if your document or site will be used by people from other cultures, who might have different expectations. For more on writing for multicultural readers, see Chapter 5, page 99.

• Is your draft honest, and does it adhere to appropriate legal standards? Have you presented your information honestly, without being misleading and without omitting information that might counter your argument? Have you adhered to appropriate legal standards of intellectual property, such as copyright law? For more on ethical and legal issues, see Chapter 2.

• Do you come across as reliable, honest, and helpful? Check to see that your persona is fully professional: modest, understated, and cooperative. For more on persona, see Chapter 8, page 184.

Next, answer these four questions related to the organization and development of the draft:

• Have you left out anything in turning your outline into a draft? Check your outline to see that all the topics are included in the document itself. Or switch to the outline view in your word processor so that you can focus on the headings. Is anything missing? For more on the outline view, see Chapter 3, page 46.

• Is the organization logical? Your draft is likely to reflect several different organizational patterns. For instance, the overall pattern might be chronological. Within that pattern, sections might be organized from more important to less important. Looking at the headings in the outline view, can you see the patterns you used, and do those patterns seem to work well? For more on organizational patterns, see Chapter 7.

• Is the emphasis appropriate throughout the draft? If a major point is treated only briefly, mark it for possible expansion. If a minor topic is treated at great length, mark it for possible condensing.

• Are your arguments well developed? Have you presented your claims clearly and emphatically? Have you done sufficient and appropriate research to gather the right evidence to support your claims effectively? Is your reasoning valid and persuasive? For more on conducting research, see Chapter 6. For more on using evidence effectively, see Chapter 8, page 176.

Finally, answer these four questions related to the verbal and visual elements of the draft:

• Are all the elements presented consistently? Check to see that parallel items are presented consistently. For example, are all your headings on the same level structured the same way (for example, as noun phrases or as gerunds, ending in -ing)? And check for grammatical parallelism, particularly in lists, but also in traditional sentences. For more on parallelism, see Chapter 10, page 227.

• Are your paragraphs well developed? Does each paragraph begin with a clear topic sentence that previews or summarizes the main point? Have you included appropriate and sufficient support to validate your claims? For more on paragraph development, see Chapter 9.

• Are your sentences clear and correct? Review the draft to make sure each sentence is easy to understand, is grammatically correct, and is structured to emphasize the appropriate information. For more on writing effective sentences, see Chapter 10.

• Have you used graphics appropriately? Do you see more opportunities to translate verbal information into graphics to make your communication easier to understand and more emphatic? Have you chosen the proper types of graphics, created effective ones, and linked them to your text? For more on graphics, see Chapter 12.

Editing your draft thoroughly requires a lot of work. Naturally, you hope that once you're done editing, you will never need to go back and retrieve that earlier draft. But experienced writers know that things don't always go that smoothly. Half the time, when you throw out a sentence, paragraph, or section that you absolutely know you will never need again, you soon realize you need it again. For this reason, it's smart to use your word processor's change-tracking function and archive all the drafts of everything you write.


The easiest way to do this is to use a version number at the end of the file name. For example, the first draft of a Lab Renovation proposal is LabRenPropV1. When it comes time to edit that draft, open that file and immediately rename it LabRenPropV2.
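That renaming step is easy to automate. Below is a minimal sketch in Python that bumps the version number in a file name, following the chapter's LabRenProp example; the .docx extension and the idea of copying the file rather than renaming it by hand are illustrative assumptions, not part of the chapter's procedure.

```python
import re
from pathlib import Path
from shutil import copyfile  # needed only if you uncomment the archiving step

def next_version(path):
    """Given a draft named like LabRenPropV1.docx, return LabRenPropV2.docx."""
    p = Path(path)
    m = re.fullmatch(r"(.*V)(\d+)", p.stem)
    if m is None:
        raise ValueError(f"{p.name} has no V<number> suffix")
    return p.with_name(f"{m.group(1)}{int(m.group(2)) + 1}{p.suffix}")

current = "LabRenPropV1.docx"  # hypothetical file name, per the chapter's example
print(next_version(current))   # LabRenPropV2.docx
# copyfile(current, next_version(current))  # copy to V2 and edit the copy; V1 stays archived
```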

PROOFREADING

Proofreading is the long, slow process of reading through your draft one last time to make sure you have written what you intended to write. You are looking for minor problems caused by carelessness or haste. For instance, have you written filename on one page and file name on another? Have you been consistent in expressing quantities as numerals or as words? Have you been consistent in punctuating citations in your list of works cited? Although your software can help you with some of these chores, it isn't sophisticated enough to do it all. You need time, and willpower.

Look particularly for problems in word endings. For instance, a phrase such as “we studying the records from the last quarter” is a careless error left over from an earlier draft of the sentence. Change it to “we studied the records from the last quarter.” Also look for missing and repeated words: “We studied the from the last quarter”; “We studied the the records from the last quarter.”

How do you reduce your chances of missing these slips? Read the draft slowly, out loud, listening to what you have written and marking things that look wrong. After you fix those problems, go through the draft one more time, one line at a time, looking for more problems. Some instructors suggest reading the document backward (last page first, last line first, right to left) so you can focus on the individual words. If you can stand doing it, do it. You might also consider mixing up the pages of your document (ensuring that you've numbered them first) and reading them out of sequence so that you can focus on words and sentences without getting lost in the argument's flow.
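Some of these mechanical slips can also be caught with a short script. The sketch below, in Python, automates two of the checks just described: flagging immediately repeated words ("the the") and counting competing variants such as filename versus file name. The sample draft string is invented for illustration; a script like this supplements the slow read-aloud pass, not replaces it.

```python
import re

def repeated_words(text):
    """Flag immediately repeated words, such as 'the the'."""
    return re.findall(r"\b(\w+)\s+\1\b", text, flags=re.IGNORECASE)

def variant_counts(text, variants):
    """Count competing spellings of a term, e.g., 'filename' vs. 'file name'."""
    return {v: len(re.findall(re.escape(v), text, flags=re.IGNORECASE))
            for v in variants}

draft = "We studied the the records. Name the file name; then check each filename."

print(repeated_words(draft))                             # ['the']
print(variant_counts(draft, ["file name", "filename"]))  # {'file name': 1, 'filename': 1}
```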

Conducting Usability Evaluations

What is a usability evaluation? To evaluate the usability of a draft, you ask someone to study the draft, looking for ways to improve its usability. That person then communicates his or her impressions and suggestions, either in writing or in an interview.

You can perform usability evaluations of existing or prototype documents or sites. A prototype is a model that is built to simulate the look and feel of an item before it is produced commercially. In technical communication, a prototype is typically an early draft of a document, website, or software program. A prototype can range in sophistication from a simple drawing of a computer screen to a fully functioning system that looks exactly like a commercial product. Figure 13.2 (on page 346) shows an array of free blank templates that can be revised and used to create a home page of a website.

FIGURE 13.2 A Website Showing Sample Templates

Wix provides hundreds of different web templates, complete with images, geared to particular subjects, such as businesses, restaurants, and photography. This figure shows a few of the blank templates offered by the company. The user would download a blank template and plug information into it, creating a working prototype. With this prototype, the user could then evaluate how well the design works. Reprinted by permission of Wix.com.

Most types of formal usability evaluations involve three categories of people in addition to the writer:

• Users. In technical communication, users are people who use a document, site, or program, usually as part of their jobs. They’re your primary audience, so they are an important source of feedback. They can be current or future users; they can be novice, experienced, or expert users. They are probably not people who work with you for the company that makes the product, because such people are likely to have specialized knowledge that would make them atypical.

• Subject-matter experts (SMEs). An expert in the subject of the document, website, or software can be very useful in evaluating a draft. For instance, a database engineer is presumably an SME in database software programs. This person probably could see more, and different, potential problems in a new database program than a typical user could. He or she might also be the person in charge of carrying out the usability evaluation.

• Usability experts. An expert in ergonomics, human-computer interaction, usability engineering, or cognitive psychology typically designs the usability evaluation. That is, he or she determines which questions to ask about the draft and how to most effectively and efficiently obtain answers. He or she might also carry out the evaluation himself or herself.

Although there are many varieties, usability evaluations usually take one of five major forms:

• Surveying or interviewing users. Evaluators survey or interview users to learn about the strengths and weaknesses of a document or site. These techniques sometimes reveal problems that can be fixed; for instance, you might learn that your users would really like to have a printed list of keyboard shortcuts to tape to the office wall. More often, however, these techniques provide attitudinal information; that is, they reveal users' attitudes about aspects of using the draft.

• Observing users. To understand how people use an existing document or site, evaluators go to their workplaces and observe them as they work. Observations can reveal, for example, that typical users are unaware of a feature that you assumed they used. This insight can help you see that you need to make that feature easier to find and use. Arrange the visit beforehand, and bring food to establish good will.

• Interviewing SMEs and usability experts. An evaluator might ask an expert to study the draft for usability and then interview that person, asking general questions about the strengths and weaknesses of the draft or focused questions about particular aspects of the draft. One well-known version of an expert evaluation is called a cognitive walk-through, in which the evaluator asks an expert to carry out a set of tasks, such as signing up for RSS (rich site summary or really simple syndication) on a blog, a prototype, or an existing site. The evaluator watches and notes the expert's actions and comments. Another version of an expert evaluation is called a heuristic evaluation. A heuristic is a guideline or desirable characteristic, such as that every page of a website should include an easy-to-find link to the home page. A heuristic evaluation, then, is an assessment of how well a draft adheres to a set of guidelines. After an expert conducts a cognitive walk-through or a heuristic evaluation, the evaluator interviews the expert.

• Conducting focus groups. A focus group is a meeting at which a group of people discuss an idea or product. Typically, the people are current or prospective users. Let's say your company sells a software program called FloorTraxx, which helps people design custom floors. A focus group might consist of FloorTraxx customers and perhaps other people who have indicated an interest in designing custom floors for houses. The moderator would lead a discussion that focused on what the customers liked and disliked about the product, whether they were satisfied with the results, and what changes they would recommend in an updated version. The moderator would also seek to learn what information the prospective customers would need before deciding to purchase the product.

• Using a commercial usability service. Companies such as UserTesting.com offer usability testing of websites. You specify how many "users" you wish to have evaluate your site, their demographics (such as age, sex, web experience, and nationality), the context in which they are to use the site, a set of simple tasks they are to carry out, and a set of questions (such as "What do you like best about the site?"). You then receive a brief report from each person who evaluated your site and a video of the person thinking aloud while trying to carry out the tasks. Although such usability services claim that they are performing usability testing, in fact they are performing basic evaluations; real usability testing always involves real users. Real usability testing, as described in the next section, provides more detailed information because the testing team conducts the test in a controlled laboratory environment and can interact more extensively with test participants.

For more about interviewing and about writing questionnaires, see Ch. 6, pp. 136 and 138.

If your users include people from other cultures, be sure to include people from these cultures in your interviews and focus groups. If possible, use interviewers from the culture of the people you are interviewing. Vatrapu and Pérez-Quiñones (2006) have shown that people from other cultures are sometimes reluctant to criticize a draft for fear of embarrassing the interviewer. When the interviewer is from the same culture, however, people are more forthcoming.

After completing any usability evaluation, you need to gather the important information that you learned and share it with others in your company through a presentation, a website, or a collection of documents on the company intranet.

Conducting Usability Tests

Usability testing draws on many of the same principles as usability evaluations. For example, in a test, you start by determining what you want to learn. You choose test participants carefully, and you repeat the test with many participants. You change the draft and retest with still more participants. You record what you have learned.

The big differences between usability evaluation and usability testing are that testing always involves real users (or people who match the characteristics of real users) carrying out real tasks, often takes place in a specialized lab, is recorded using more sophisticated media, and is documented in more formal reports that are distributed to more people.

This section covers four topics:

• the basic principles of usability testing

• preparing for a usability test

• conducting a usability test

• interpreting and reporting the data from a usability test

THE BASIC PRINCIPLES OF USABILITY TESTING

Three basic principles underlie usability testing:

• Usability testing permeates product development. Usability testing involves testing the document, site, or software rigorously and often to make sure it works and is easy to use. Prototypes, newly completed products, and products that have been in use for a while are all tested.

• Usability testing involves studying real users as they use the product. Unlike usability evaluations, which often involve experts, testing is done by real users, who can provide important information that experts cannot. Real users make mistakes that experts don’t make. One well-known example relates to computer software that included an error-recovery message that said, “Press Any Key to Continue.” The manufacturer received hundreds of calls from users who couldn’t find the “Any” key.

• Usability testing involves setting measurable goals and determining whether the product meets them. Usability testing involves determining, first, what the user is supposed to be able to do. For instance, in testing a wiki, the testers might decide that the user should be able to find the “Edit” function and then edit and save a sentence successfully in less than 30 seconds.

PREPARING FOR A USABILITY TEST

Usability testing requires careful planning. According to usability specialist Laurie Kantner (1994), planning accounts for one-half to three-quarters of the time devoted to testing. In planning a usability test, you must complete eight main tasks:

• Understand users' needs. To understand users' needs, companies conduct focus groups, test existing products, have experts review the product, and conduct on-site interviews and observations of real users in the workplace. For more about your audience's needs, see Ch. 5.

• Determine the purpose of the test. Testers can test an idea even before the product is designed, to see if people understand it and like it. Or they can test a prototype to see if it is easy to use, or a finished product to see if it needs any last-minute improvements.



• Staff the test team. Extensive programs in usability testing involve many specialists, each doing one job. Smaller programs involve only a handful of people, each doing many jobs. For instance, a testing team might include an SME on the product, who can suggest workarounds if necessary; a test administrator, who administers the test to participants; a note taker, who fills out the evaluation forms and records important comments users make; and a videographer, who operates the recording equipment.

• Set up the test environment. A basic environment includes a room for the test participant and another room for the test observers. Figure 13.3 presents a photograph of a usability lab.

• Develop a test plan. A test plan is a proposal requesting resources; it describes and justifies what the testers plan to do. For information about proposals, see Ch. 16.

• Select participants. Testers recruit participants who match the profile of the intended users. Generally, it is best not to use company employees, who might know more about the product than a real user would.

• Prepare the test materials. Materials for most tests include legal forms for the user to complete, an orientation script to help the participant understand the purpose of the test, background questionnaires, instructions for the participant to follow, and a log in which testers record data during the test.

• Conduct a pilot test. A pilot test is a usability test for the usability test. A pilot test can uncover problems with the equipment; the document, site, or software being tested; the test materials; or the test design.


FIGURE 13.3 A Usability Lab

The two people in the foreground are in the observation room, where they are monitoring the woman performing the usability test in the testing room. From Gwinnett Business Journal, by permission of Tillman, Allen, Greer.


CONDUCTING A USABILITY TEST

The testing team has to plan the test carefully and stay organized. Typically, the team creates a checklist and a schedule for the test day, specifying every task that every person, including the test participant, is to carry out. Conducting the test includes interacting with the test participant both during the formal test and later, during a debriefing session.

Interacting with the Test Participant

Among the most popular techniques for eliciting information from a test participant is the think-aloud test, in which the participant says aloud what he or she is thinking while using a document or a website. Consider the earlier example of FloorTraxx software for designing custom floors. In planning to test the software, you would first create a set of tasks for the participant to carry out:

• Calculate the area of a floor.

• Calculate the number of tiles needed for a project.

• Estimate the amount of adhesive needed for a project.

• Generate the bill of materials needed for a project.

• Calculate the cost of materials and number of hours of labor for a project.

As the participant carries out each task, he or she thinks aloud about the process. Because this process might make the test participant feel awkward, the test administrator might demonstrate the process at the beginning of the session by thinking aloud while using one of the features on a cell phone or finding and using an app on a tablet.

While the test participant thinks aloud, a note taker records anything that is confusing and any point at which the test participant is not sure about what to do. If the test participant gets stuck, the administrator asks a leading question, such as "Where do you think that function might be located?" or "What did you expect to see when you clicked that link?" Questions should not take the user's knowledge for granted or embarrass the test participant for failing a task. For example, "Why didn't you click the Calculate button?" assumes that the user should have seen the button and should have known how to use it.

In addition, questions should not bias the test participant. When testers ask a participant a question, they should try not to reveal the answer they want. They should not say, "Well, that part of the test was pretty easy, wasn't it?" Regardless of whether the participant thought it was simple or difficult, his or her impulse will be to answer yes. Usability specialists Joseph S. Dumas and Janice Redish (1999) recommend using neutral phrasing, such as "How was it performing that procedure?" or "Did you find that procedure easy or difficult?" In responding to questions, testers should be indirect. If the participant asks, "Should I press 'Enter' now?" they might respond, "Do you think you should?" or "I'd like to see you decide."


To ensure that the test stays on schedule and is completed on time, the test administrator should set a time limit for each task. If the test participant cannot complete the task in the allotted time, the administrator should move on to the next task.

ETHICS NOTE

UNDERSTANDING THE ETHICS OF INFORMED CONSENT

For legal and ethical reasons, organizations that conduct usability testing, especially tests that involve recording the test participant's behavior, abide by the principle of informed consent. Informed consent means that the organization fully informs the participant of the conditions under which the test will be held, as well as how the results of the test will be used. Only if the participant gives his or her consent, in writing, will the test occur.

When you obtain informed consent for tests that involve recording, be sure to do the following six things:

• Explain that the test participant can leave at any time and can report any discomfort to the testing team at any time, at which point the team will stop the test.

• Explain that a video camera will be used and, before the recording begins, ask for permission to record the test participant.

• Explain the purpose of the recording and the uses to which it will be put. If, for example, the recording might be used later in advertising, the test participant must be informed of this.

• Explain who will have access to the recording and where it might be shown. A participant might object to having the recording shown at a professional conference, for example.

• Explain how the test participant’s identity will be disguised—if at all—if the recording is shown publicly.

• Explain that the test participant will have the opportunity to hear or view the recording and then change his or her mind about how it might be used.

Debriefing the Test Participant

After the test, testers usually have questions about the test participant's actions. For this reason, they debrief the participant in an interview. The debriefing is critically important, for once the participant walks out the door, it is difficult and expensive to ask any further questions, and the participant likely will have forgotten the details. Consequently, the debriefing can take as long as the test itself did.

While the participant fills out a posttest questionnaire, the test team quickly looks through the data log and notes the most important areas to investigate. Their purpose in debriefing is to obtain as much information as possible about what occurred during the test; their purpose is not to think of ways of redesigning the product to prevent future problems. Usability specialists Jeffrey Rubin and Dana Chisnell (2008) suggest beginning the debriefing with a neutral question, such as "So, what did you think?" This kind of question encourages the participant to start off with an important suggestion or impression. During the debriefing session, testers probe high-level concerns before getting to the smaller details. They try not to get sidetracked by a minor problem.


INTERPRETING AND REPORTING THE DATA FROM A USABILITY TEST

After a usability test, testers have a great deal of data, including notes, questionnaires, and videos. Turning that data into useful information involves three steps:

• Tabulate the information. Testers gather all the information from the test, including performance measures, such as how long it took a participant to complete a task, and attitude measures, such as how easy the participant found the task. (A brief sketch of such tabulation appears after this list.)

• Analyze the information. Testers analyze the information, concentrating on the most important problems revealed in the test and trying to determine the severity and the frequency of each one.

• Report the information. Writing a clear, comprehensive report often leads the testers to insights they might not have achieved otherwise.
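Tabulating the data need not be elaborate; a spreadsheet or a few lines of code will do. The following minimal sketch in Python shows one way to tabulate performance measures (completion time and success) and an attitude measure (an ease rating) against the kind of measurable goal described earlier in this chapter, such as completing a task in less than 30 seconds. All of the data, and the 1-to-5 rating scale, are invented for illustration.

```python
from statistics import mean

# Hypothetical results for one task, one tuple per participant:
# (seconds to complete, completed successfully?, ease rating on a 1-5 scale)
results = [
    (24, True, 4),
    (41, False, 2),
    (28, True, 5),
    (33, True, 3),
]

GOAL_SECONDS = 30  # the measurable goal set before testing

times = [secs for secs, success, _ in results if success]
successes = sum(1 for _, success, _ in results if success)
met_goal = sum(1 for secs, success, _ in results if success and secs <= GOAL_SECONDS)
ratings = [rating for _, _, rating in results]

print(f"Success rate: {successes / len(results):.0%}")            # performance measure
print(f"Mean time on successful attempts: {mean(times):.1f} s")   # performance measure
print(f"Met the {GOAL_SECONDS}-second goal: {met_goal} of {len(results)}")
print(f"Mean ease rating: {mean(ratings):.1f} / 5")               # attitude measure
```

Judgments about severity and frequency, of course, still come from the analysis step, not from the arithmetic.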

Although usability testing might seem extremely expensive and difficult, testers who are methodical, open-minded, and curious about how people use their documents or websites find that it is the least expensive and most effective way to improve quality.


DOCUMENT ANALYSIS ACTIVITY: Obtaining Informed Consent

NASA, the U.S. space agency, uses this consent form in its usability testing. The questions below ask you to examine this document in light of the guidelines for informed consent (p. 352).

1. Which concepts of an effective informed-consent form, as described in the Ethics Note on page 352, does this form include?

2. Which concepts of an effective informed-consent form does this form not include?

3. Are any provisions in this form potentially unclear?


Understanding Your Participation

Please read this page carefully.

You have agreed to participate in a usability study that will evaluate [system]. By participating in this study, you will help NASA to improve [system] in future redesigns. Our team will observe you and record information about how you work with the [system]. We will also ask you to fill out questionnaires about your experience and answer follow-up questions. We will record your comments and actions using written notes and video cameras.

Our team will use the data from your study session, including videotapes, solely for the purposes of evaluating the [system] and sharing results of these evaluations with [the study sponsor]. Your full name will not be used during any presentation or in the results of this study.

By signing this form, you give permission for NASA to use:

• Your recorded voice: Yes No

• Your verbal statements: Yes No

• The videotape of your session: Yes No

If you need a break at any time, please inform the study facilitator immediately. If you have questions about how the session will proceed, you may ask them at any time. You may withdraw from your study session at any time.

Receipt for [Incentive]

Please acknowledge that you have received from [NASA, or study sponsor] [the exact amount of money, or describe the nonmonetary incentive, if it is merchandise] for your participation by signing below. Your acceptance of this [incentive] does not constitute employment by [NASA, or study sponsor].

I have received my [incentive].

If you agree with these terms, please indicate your agreement by signing below:

Signature: __________________________________

Print Name: __________________________________

Date: ____________


Source: National Aeronautics and Space Administration, 2010: www.hq.nasa.gov/pao/portal/usability/process/utPlanning.htm.


WRITER'S CHECKLIST

Revising, Editing, and Proofreading

Did you think about how your audience, purpose, and subject might have changed since you planned and drafted your document or site? (p. 342)

In editing your draft, did you check to see that

the design is effective? (p. 343)

the draft meets your readers’ expectations? (p. 343)

the draft is honest and adheres to appropriate legal standards? (p. 343)

you come across as reliable, honest, and helpful? (p. 343)

you have not omitted anything listed on your original outline? (p. 343)

the organization of the draft is logical? (p. 344)

the emphasis is appropriate throughout the draft? (p. 344)

the arguments are well developed? (p. 344)

the elements of the draft are presented consistently? (p. 344)

the paragraphs are well developed? (p. 344)

the sentences are clear and correct? (p. 344)

the graphics are used appropriately? (p. 344)

you archived your earlier draft before you started, using a logical file-naming system? (p. 344)

Did you proofread your draft carefully, looking for minor problems such as

inconsistent spelling and punctuation? (p. 345)

incorrect word endings? (p. 345)

repeated or missing words? (p. 345)

Usability Evaluations

Did you, if appropriate,

survey or interview users? (p. 347)

observe users using your existing document or site? (p. 347)

interview SMEs and usability experts? (p. 347)

conduct focus groups? (p. 347)

use a commercial usability service? (p. 348)

Usability Tests

Did you prepare for the usability test by

making efforts to understand your users’ needs? (p. 349)

determining the purpose of the test? (p. 349)

staffing the test team? (p. 350)

setting up the test environment? (p. 350)

developing a test plan? (p. 350)

carefully selecting participants? (p. 350)

preparing the test materials? (p. 350)

conducting a pilot test? (p. 350)

Did you conduct the usability test effectively by

interacting appropriately with the participant? (p. 351)

obtaining informed consent? (p. 352)

debriefing the participant? (p. 352)

Did you interpret and report the test data by

tabulating the information? (p. 353)

analyzing the information? (p. 353)

reporting the information? (p. 353)


CASE 13: Revising a Document for a New Audience

You work at your university's health center, where your supervisor is looking for ways to provide students with more information on obesity, a topic they have been inquiring about increasingly. She has located a fact sheet from the National Institutes of Health that contains information concerning a specific diet about which students have expressed interest. The fact sheet, however, was not developed for a college-student audience, and your supervisor would therefore like you to try revising it with your peers in mind. To get to work, go to "Cases" under "Additional Resources" in Ch. 13: macmillanhighered.com/launchpad/techcomm11e.

EXERCISES

For more about memos, see Ch. 14, p. 372.

1. Edit and proofread the following passage. Be prepared to share your work with the class.

Here are a list of the questions you shouldn’t be asked by a perspective employer: What is or was your spouse’s name or job? Has a Workers Compensation claim been filed on your behalf? Were you ever injured on the job. Do you have any physical empairments that prevents you from performing the job for which your applying? Have you ever been arrested? If yes, what for? What is your hair/eye color? Your height/weight? Have you ever been hospitalized? If so, why. Have you ever been treated by a psyhiatrist or psychologist? If so, for what condition? Is there any health-related reasons you may not be able to preform the job for which you’re applying? How many days were you absent from work because of illness? Are you now taking any drugs? Have you ever had a problem with for drug adiction or alcoholism?

2. Contact local manufacturing companies and computer-hardware and -software companies to see whether any of them perform usability testing. Interview a person who performs usability testing at one of these organizations. Then write a 1,000-word memo to your instructor describing how the process of conducting usability testing at this organization differs from that described in this chapter.

3. If a local company conducts usability testing, see whether you can become a test participant. After the test, write a 1,000-word memo to your instructor describing the experience, focusing on what you learned about usability testing.

4. TEAM EXERCISE Form a group of four or five students, and conduct an informal usability test for assembling or using one of the following products:

a. a piece of computer hardware, such as a printer

b. a piece of software (or a portion of a piece of software)

c. a document that accompanies a piece of software (or a portion of one)

d. a piece of equipment used in your major field of study

e. a smartphone, Bluetooth headset, or similar product

Submit a brief usability report to your instructor.
