This past summer, I had a fantastic time working at Western Digital as a UX Design & Research Intern on the My Cloud Home team. The My Cloud Home is a hard drive that stays connected to a router at home, so that its contents can be accessed from anywhere using a desktop, laptop, or smartphone. Using the mobile app, users can access, upload, and share their photos, videos, and documents on-the-go.
I worked closely with UX designers, product managers, engineers, marketers, and even executives to create a world-class user experience for the newest generation of the My Cloud Home mobile app.
After purchasing their My Cloud Home device, customers complete setup by following the included Quick Install Guide (QIG). After they set up the hardware, they are invited to download the mobile app to access their data on-the-go.
Our product managers had the following assumptions about the existing onboarding experience:
- People will think the onboarding takes too long or requires too many steps.
- People will be confused by the Quick Install Guide (QIG) because it has no labels or written directions.
- People won't know what to do after they finish setup or when the onboarding process is complete. ("What now?")
My challenge was to test those assumptions, offer viable solutions in the form of design deliverables, and support my decisions with qualitative and quantitative data. My responsibilities included:
- Create and run remote usability studies
- Brainstorm and design new iterations for both screen and physical mediums
- Collaborate with PMs and engineers to align on product goals and vision
- Watch and annotate usability test videos
- Analyze qualitative and quantitative data
- Present findings and make product recommendations
- Generate a report of the study and project
I pursued two different types of research to inform my design decisions during this project.
The first phase was exploratory research, where I gathered preliminary data to confirm or disconfirm the assumptions above. The tests were in-person and moderated at my cubicle, using everything that comes in the box of a My Cloud Home Duo device, a Netgear WiFi router, and an iPhone 6. I sat beside the participant while one of the UX designers (my mentor over the summer) took notes. I scheduled five test sessions and conducted four of them, which is sufficient to provide initial insight. Participants were a convenience sample recruited from the cafeteria of the Western Digital Milpitas office.
The second phase was a usability study in which I compared the effectiveness of the existing flow against the new flow that I designed. These were remote, unmoderated usability tests conducted on UserTesting.com, with 10 participants per flow for 20 participants in total.
This exploratory research was conducted to assess the following:
- Can people successfully set up their My Cloud device?
- Where in the flow do people experience confusion or uncertainty?
- Do people understand the QIG?
- What will people do after they finish setting up their device?
QIG — Front
QIG — Back
Phase 1 Research Findings
After each moderated session, I asked participants to rate their experience based on several criteria on a 5-point scale. Here's what I found:
| Criteria | Participant 1 | Participant 2 | Participant 3 | Participant 4 |
| --- | --- | --- | --- | --- |
| Difficulty of setting up the device [1 = Very Difficult, 5 = Very Easy] | 4 | 3 | n/a* | 4 |
| Confidence in setting up the device [1 = Not at all confident, 5 = Very Confident] | 5 | 4.5 | n/a* | 5 |
| Length of time to set up the device [1 = It took too long, 5 = It took the perfect amount of time] | 5 | 5 | n/a* | 5 |
*Participant 3 could not proceed past the “Get Mobile App” screen because of a bug that prevented them from logging in.
My mentor and I outlined the qualitative data from the study, including what participants liked, disliked, and other areas of concern. We also brainstormed and created a diagram for an improved flow that addresses the flaws I found.
A key finding of the moderated tests was the trouble caused by the sheet of paper titled “Instruction Manual Important Safety Instructions,” which participants either ignored or expressed annoyance with because of its small, unreadable text. One participant incorrectly assumed that the QIG was a visualization of this sheet. Most participants confused the Safety Instructions with the QIG (which actually helps them set up their device) because the Safety Instructions sheet has the words "Instruction Manual" in its title, which is inaccurate and misleading.
In all but one test, participants said that the steps were clear and easy to understand until step 4 (navigating to the website and signing up for a My Cloud account), which made little sense to them. They fell into one of two camps: either they went to the app store to search for the My Cloud app (not the worst error, but users cannot sign up through the native app at this time), or they said they would visit the support website (which means the QIG has failed).
These findings served as preliminary insights that informed the changes I made to the QIG and the onboarding flow. In the remote testing phase, I used the existing flow for the control test and the new flow for the experimental test.
The research question: when using the new QIG and the new flow instead of the existing ones, will people better understand how to set up their device, how to download the app, and what to do after they sign in?
The existing QIG walks customers through setting up the hardware and invites them to download the mobile app to back up and manage their data on-the-go. It displays the steps using pictures only and dedicates a large space to the support website for customers who need help setting up their device.
The new QIG includes text labels for each step to eliminate any doubt that imagery by itself may create. I modified the smartphone illustration, which some participants mentioned looked more like a tablet than a phone, and added elements to make the browser more obvious. There is also less real estate dedicated to the support website and legal text in order to deemphasize the help option. If we have done our job correctly, people should not have to contact support after viewing these instructions.
Because we were so close to the My Cloud Home release date, the design for the QIG had already been finalized. However, it was still possible for me to make a case for improvements in the QIG for the next generation of the device, which is what I did.
After signing up for a My Cloud Home account, the browser-based setup begins searching for the user's device on the Wi-Fi network. When the device is successfully found, this is the flow they go through:
When a device cannot be found on the Wi-Fi network, users go through a fallback flow that troubleshoots the problem and attempts the search again. If the device still cannot be found, users may enter the device code manually.
Existing Flow — Negative Fallback Flow
Participants who went through the existing flow were presented with negatively phrased copy and words that highlight the error in subtle ways.
New Flow — Positive Fallback Flow
Because users only experience this flow when there has been a failure, I wanted to deemphasize the failure and frame the situation positively by using encouraging words. I also made the troubleshooting steps into questions that users can answer by tapping through the buttons.
In addition, I added the secondary option labeled "The light is not on" on the final troubleshooting screen because if this is ever the case, the problem can only be resolved with a call to customer support.
The following changes were made to the copy:
| Element | Existing Copy | New Copy | Rationale |
| --- | --- | --- | --- |
| Page Title | We Couldn’t Find Your Device | Let’s Find Your Device | Deemphasizes the error/the fact that this is a fallback option |
| Page Title | Checking Your Device | Let’s Find Your Device | No need for a title change between these two pages |
| Body | Since We’re Having Trouble Finding Your Device… | To Help Us Find Your Device… | Deemphasizes the error/the fact that this is a fallback option |
| Body | …make sure your device… | …is your device…? | Phrases the instruction as a question so people can answer by tapping the button (see next row) |
| Button | Next | Yes / Yes, Find My Device (secondary link: The light is not on) | Buttons become responses to the questions presented. Option now available when the light is not on. |
Phase 2 Research Findings
Compared to the control group, participants in the experimental group left more favorable ratings across a number of criteria, such as easiness and confidence. See the chart for all ratings (each score is out of 5).
| Criteria | Mean Rating, Existing Flow | Mean Rating, New Flow | Change |
| --- | --- | --- | --- |
| Confidence in knowing how to set up the device after looking at the card [Not at all confident to Very Confident] | 3.8 | 4.7 | +0.9 |
| Helpfulness of the card in understanding how to set up the device [Not at all helpful to Very Helpful] | 3.8 | 4.8 | +1.0 |
| Easiness of setting up the device [Very Difficult to Very Easy] | 4.3 | 4.8 | +0.5 |
| How closely the process matched expectations [Not at all what I expected to Exactly what I expected] | 4.1 | 4.8 | +0.7 |
| The amount of time it took to create an account and download the mobile app [It took too long to It took the right amount of time] | 4.3 | 4.9 | +0.6 |
Compared to those using the existing flow, participants who used the new flow found it to be 23.68% easier to complete and thought that the QIG was 26.32% more helpful in setting up their device.
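For transparency, the quoted percentages can be reproduced from the mean ratings in the table above as relative changes, (new − old) / old: 23.68% falls out of the 3.8 → 4.7 confidence row and 26.32% out of the 3.8 → 4.8 helpfulness row. A quick sketch of the arithmetic:

```javascript
// Relative change between two mean ratings, expressed as a percent
// rounded to two decimal places.
function relativeChangePercent(oldMean, newMean) {
  return Math.round(((newMean - oldMean) / oldMean) * 10000) / 100;
}

relativeChangePercent(3.8, 4.7); // 23.68 (confidence row)
relativeChangePercent(3.8, 4.8); // 26.32 (helpfulness row)
relativeChangePercent(4.3, 4.8); // 11.63 (easiness row)
```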
I also wanted to establish whether participants registered the fallback state as unexpected, that is, that an error had occurred. In the existing flow, only one participant pointed out that the fallback state might be due to an error, while the rest made no distinction between the error flow and the intended flow. So while people rate the new design as closer to their expectations than the existing one, it is uncertain whether that is because of the positively framed copy.
After finishing the onboarding experience, I asked participants to show us what they would do next. Their responses are displayed in the pie chart below. What is notable about these insights is that none of the participants expressed feeling lost or disoriented. Those who had a goal in mind knew exactly how to accomplish it and those who did not were satisfied with exploring the app and discovering things on their own.
These insights tell us that people will know what to do after onboarding, and no further action is required in this area.
Something that I hadn’t anticipated was how many participants would end their flow prematurely without realizing that they were not finished. Of the twelve people who exited the flow early, six of them dropped off at the “Get the Mobile App” screen.
This issue needs further testing, but the reason may be that the existing App Store page looks too much like an ad (which people tend to be banner-blind to), or that it needs a more obvious link or button that affords tapping. Alternatively, we could remove the step altogether and redirect to the App Store (informing people beforehand) after a certain number of seconds, decreasing the chance that people skip this important step.
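A timed redirect along these lines could look like the sketch below. The delay, the placeholder URLs, and the user-agent heuristic are all my illustrative assumptions, not a shipped implementation.

```javascript
// Hypothetical sketch of a timed app-store redirect after signup.
// URLs, delay, and detection logic are placeholder assumptions.
const REDIRECT_DELAY_MS = 5000; // e.g. "Taking you to the App Store in 5 seconds..."

const storeUrls = {
  ios: "https://example.com/ios-app",       // placeholder, not the real listing
  android: "https://example.com/android-app" // placeholder, not the real listing
};

// Pick a store based on the browser's user-agent string (simplified heuristic).
function pickStore(userAgent) {
  if (/iPhone|iPad|iPod/i.test(userAgent)) return "ios";
  if (/Android/i.test(userAgent)) return "android";
  return "unknown"; // desktop browsers keep the existing link instead
}

// In the browser, the redirect would be scheduled once signup completes:
// const store = pickStore(navigator.userAgent);
// if (store !== "unknown") {
//   setTimeout(() => window.location.assign(storeUrls[store]), REDIRECT_DELAY_MS);
// }
```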
To the right are action items plotted inside an Effort-Impact matrix, with the lowest effort items located in the left half of the matrix and the highest impact items in the top half of the grid. Hence, items that fall inside the blue square should be prioritized because they require the least effort and have the highest impact.
| Action Item | Supporting Evidence |
| --- | --- |
| Remove “Instruction Manual” from the title of the Safety Instructions sheet | All users in moderated tests were either confused or annoyed by the sheet due to its misleading title |
| Add text labels to the QIG instructions for our next-generation product; if possible, improve illustrations | Onboarding was 23.68% easier to complete and the QIG was 26.32% more helpful, according to participant ratings |
| After signup, redirect people to the App Store instead of simply providing a button | Half of all participants who did not complete onboarding dropped off at the “Get the Mobile App” screen |
| Frame fallback states positively by tweaking copy | Not enough participants distinguished the fallback flow from the intended flow, but they rated the new design as closer to their expectations than the existing one; insights are uncertain without more data and A/B tests |
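The quadrant logic described above can be made concrete with a small sketch; the 1–10 scale and midpoint of 5 are assumptions for illustration, not values from the actual matrix.

```javascript
// Hypothetical Effort-Impact quadrant classification.
// Scale (1-10) and midpoint (5) are illustrative assumptions.
function quadrant(effort, impact, midpoint = 5) {
  const lowEffort = effort <= midpoint;
  const highImpact = impact > midpoint;
  if (lowEffort && highImpact) return "prioritize first"; // the blue square
  if (!lowEffort && highImpact) return "major project";
  if (lowEffort) return "quick fill-in";
  return "reconsider";
}

// e.g. renaming the Safety Instructions sheet: a tiny copy change, big clarity payoff
quadrant(1, 8); // "prioritize first"
```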