Project Description
In the Spring of 2019, I worked with Dr. Tharon Howard and Kimberly Jenerette on a usability study commissioned by Cengage for their MindTap Help system. Our goal was to determine whether there was a better way to organize the contents of the site, which required researching how users were already finding information on the current site and using that information to complete tasks. The research also collected data on users' attitudes toward the use of screen captures in the documentation.
Audience
The audience for our report was the team at Cengage who would work on improvements to the help system. In speaking to them at the beginning of the semester, we determined that their main goal in having us complete this research was to settle an internal dispute about whether screen captures and visuals were worth investing in. We worked to solve this problem for them by gathering data and by presenting a literature review showing that visuals are necessary to a good user help portal.
Constraints & Goals
We decided on our research questions at the beginning of the semester with help from the team at Cengage. They were:
- How can we improve the organization of the content within a Help System?
- Is there a way to make the table of contents easier to navigate?
- Can we create top-level headings that allow users to infer what will be underneath them?
- How do users respond to the use of screen captures in instructional documentation?
Our biggest constraint turned out to be time. Clemson requires Institutional Review Board (IRB) approval for all student research involving human subjects, and since we could not begin the study until the IRB approved our application, the study had to be completed quickly to stay within the semester's time limit.
Project Overview
- Client: Cengage Textbooks
- Role: UX Researcher
- Duration: January 2019 - May 2019
Tools
- UX Methodologies
- Morae
- Premiere Pro
- Google Drive
- Card Sorting
- Think Aloud Protocol
Competencies
- Scholarship and empirical research design methods
- Technological and media production literacies
- Professional communication processes, procedures, and practices
Credits
- Kimberly Jenerette: Co-researcher
- Dr. Tharon Howard: Advisor
Process and Scholarship
Once approval for this study had been granted by Cengage and Clemson University's IRB, we began contacting potential participants. The test design was predominantly based on a card sort study to determine the most intuitive menu organization for the categories in the table of contents. Card sorting is a method used to help design or test the information architecture of a site: participants organize topics into categories that make sense to them and then label those groups. To conduct a card sort, researchers place note cards with menu options written on them in front of the user. Participants think aloud while sorting, giving a clearer picture of their reactions and thought processes.
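One common way to analyze card-sort results like these is a co-occurrence count: how often each pair of cards landed in the same group across participants. The sketch below illustrates the idea; the card names and groupings are invented for illustration and are not data from the study.

```python
from collections import Counter
from itertools import combinations

def cooccurrence(sorts):
    """Count how often each pair of cards is placed in the same group,
    summed across all participants' sorts."""
    pairs = Counter()
    for groups in sorts:              # one participant's sort = a list of groups
        for group in groups:          # each group is a list of card names
            for a, b in combinations(sorted(group), 2):
                pairs[(a, b)] += 1
    return pairs

# Hypothetical sorts from two participants (card names are invented)
sorts = [
    [["Login help", "Password reset"], ["System requirements", "Enable Flash"]],
    [["Login help", "Password reset", "Enable Flash"], ["System requirements"]],
]
matrix = cooccurrence(sorts)
# Both participants grouped these two cards together:
print(matrix[("Login help", "Password reset")])  # → 2
```

Pairs with high co-occurrence counts are strong candidates to sit under the same top-level heading in the table of contents.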
The test design was a collaborative effort among the test team and the Cengage development team. The following is from the report of the development of our test plan and method:
Participant Profiles
According to R.A. Virzi (1992), 80% of usability problems in a product are detected by four to five participants and 90% are detected by ten. Citing the research of Nielsen and Molich (1990) and Virzi, Dumas and Redish recommend using three to five participants in every research subgroup. Six individuals took part in our usability tests, and the following considerations guided their selection.
- Availability and willingness. We first asked acquaintances of Kim and Caylin whether they would be interested in and willing to assist with our usability study. Many of those we reached out to graciously accepted.
- Students of Clemson University. Participants were required to be currently enrolled at Clemson and taking a minimum of one class during the Spring 2019 semester.
- Demographic diversity. To represent the needs and skills of a range of students, we required that at least three majors be represented among our participants, that at least two participants be male and two be female, and that one to two participants represent a racial minority.
We conducted the test over two weeks from April 1, 2019, to April 11, 2019. After clearing task revisions with the Cengage Team, usability tests began with Caylin and Kimberly as co-examiners in each study.
We recruited six subjects for this test. All six had to meet the following criteria:
- Be a Clemson undergraduate or graduate student
- Be familiar with distance learning tools
Because participants included graduate and undergraduate students from a variety of disciplines, prior use of MindTap or the MindTap help system was not a criterion. Even regular users of the MindTap system may never have used its help system, so that experience was not required of the test participants.
Subject Demographics
The following table summarizes the characteristics of the actual users who tested the MindTap Student Help System:

| Characteristic | Summary |
| --- | --- |
| Grade level | 2 Freshmen, 4 Graduate students |
| Average age | 21 |
| Gender | 4 Female, 2 Male |
| Majors represented | Writing Rhetoric and Media, Biochemistry, History, English, Environmental Engineering |
| Experience with distance education | Average of 2.3 experiences |
Note that users did not all come from a single department, and they considered themselves proficient users of distance learning tools. During testing, they came across as confident users of the help system. Therefore, any difficulties users had navigating the help system interface were not due to inexperience with such interfaces or with help systems in general. However, none of our users claimed to have experience with MindTap specifically.
Facility and Computer Specifications
Test participants completed their tests in the Usability Testing Facility (UTF) where their user screen and audio were captured. We included clips from the video screen captures in a file attachment to accompany this report. All participants completed their tests on a desktop computer in the facility.
Usability Study Methodology
We asked each participant to listen as I read the following instructions at the corresponding point in the study:
We would like to thank you for agreeing to take part in this study. We are consultants who have been asked to assess the usability of a web application. We are studying how people navigate user help pages. The application that we're going to be using today is being revised, and the next version is in the early stages. With your help, we hope to offer revision recommendations to the people who are working on these web-based tools. It is important that you understand that we are NOT testing you. Although it may seem like it, there are actually no right or wrong answers in this study. We just want to see how you approach the tasks using the tool. Because we are trying to understand what works well and what doesn't work well in the tool we're going to be examining, it is important that you voice your thoughts as you try to perform these tasks. Harsh language and gestures do not offend us; in fact, they can offer a great deal of information about your experience. All that we request is that you speak out loud about what you are thinking and feeling as you are using the tool. As the study progresses, I may occasionally ask questions about what you are doing and thinking. In return, you may ask me questions before, during, or after the study, and I will try my best to answer them; however, be aware that I may be unable to answer some questions, since my answers could possibly influence your opinions and perceptions of the web application. I will answer those questions at the end of the study.
Card Sort Task
For the first portion of the study, we are determining the best way to organize the navigation menu of the website. Please assign each of the cards in this stack into five to seven categories. Please provide names for each folder group you create. Name each group with a word or words that describe the set of items it contains. Just stack the items into the groups. Think of where you expect to find these items on a website. There is no right number of items in each group, but make sure you think about how the items relate to each other. You’ll only have 10 minutes.
Think-Aloud Protocol Tasks
For the next part of the study, assume you are a student taking a course on MindTap, a distance learning tool. You have come to the help page because you are having trouble with some part of the site. Once again, as you’re using the system, it’s very important that you talk out loud and say what you’re reading on the screen and, more importantly, what you’re thinking about as you use the system. There is no need to write the answer since we are recording the screen, but there is a sheet of questions on the table for your reference.
Task 1: Where would you find the page that describes metrics of course engagement such as the number of logins?
Task 2: Where would you find out what to do if you have purchased access but MindTap still says you haven't made the purchase?
Task 3: Where would you find out if your computer’s system is compatible with MindTap?
Task 4: Where would you find out what the different icons for an Audio Clip and an Audio Recording are? What is the difference between the two?
Task 5: Where would you find how to enable Flash on Internet Explorer?
Task 6: It has been two weeks and you haven't received your textbook, despite receiving confirmation that it had shipped. Where would you find out how to contact support?
Data Collection System
We collected six types of data for this usability test using think-aloud protocol analysis, a data collection tool (Morae), and pre- and post-study questionnaires:
- Average time spent on each task
Average time spent on each task was collected by Morae software and recorded by the examiner who was using the Morae software during the test.
- Average ease of use for each task
Following the declared completion of each task by the user, they were asked by the examiner proctoring the test how they would rate the ease of use for each task on a scale of Very Easy / Easy / Somewhat Easy / Somewhat Difficult / Difficult / Very Difficult. The examiner recorded the answer.
- Average confidence level in making correct navigational choices
We gauged confidence levels using a series of questions after each task and through the think-aloud process.
- Think-aloud comments of most importance
Think-aloud comments were recorded using Morae recording software. The above data was collected and analyzed to uncover specific problems that people were having navigating the MindTap Help System interface to perform the indicated tasks. Other problems with the interface were also identified during the test, but we mention these only briefly in this report since they were not the primary area of research for this study.
- Card Sort Cards and Naming suggestions
Cards were written out by examiners and collected after participants had finished, with their ordering maintained. We then entered the data into a Google Spreadsheet for ease of access and comparison between participants.
- Average number of clicks per task
This too was captured using Morae as a means of analyzing the number of steps people were using to find the information that was needed for each task.
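Once exported from Morae and the questionnaires, per-task metrics like these reduce to simple averages across participants. The sketch below shows the shape of that aggregation; the task numbers match the study, but the observation values are invented placeholders, not the study's actual data.

```python
import statistics

# Hypothetical per-participant observations for two tasks:
# (time in seconds, click count, ease rating on a 1-6 scale).
# These stand in for data exported from Morae and the post-task questions.
observations = {
    "Task 1": [(95, 7, 5), (120, 9, 4), (80, 6, 6)],
    "Task 2": [(200, 14, 2), (160, 11, 3), (185, 12, 3)],
}

for task, rows in observations.items():
    times, clicks, ease = zip(*rows)   # split tuples into parallel columns
    print(task,
          f"avg time {statistics.mean(times):.1f}s",
          f"avg clicks {statistics.mean(clicks):.1f}",
          f"avg ease {statistics.mean(ease):.1f}")
```

Comparing averages task by task makes it easy to spot which help-system pages cost users the most time and navigation effort.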
We delivered a final report to Cengage at the end of the semester containing all of our methods, findings, and analysis in written form.
Reflection
Working with Cengage was my first client project of the MAWRM program. In working with them, I was happy to see how a real company operated and what a usability study entailed. Throughout the process, I learned valuable information about usability and specifically how to make help portals more usable. This information was vital to the creation of the help portal for my merce Client project. In reflecting on the project, I believe we could have tried harder to recruit more participants. Although the trend was clear, I wish we could have met the goal we set at the beginning of the semester of finding and testing the portal with 15 students. Although our timeline was rushed, I am happy with the time it allowed us to gather scholarship and create a solid research plan.