How we redesigned Yelp’s UX with remote usability testing

May 20, 2015
Usability testing with real users is one of the key activities that too many web designers skip. Often there is little money in the budget for tests, but remote usability testing can be relatively cheap to perform, and the results can make a huge difference to a site’s conversions.

To demonstrate the power of usability testing in design, we teamed up with UserTesting and Optimal Workshop to run some tests on Yelp as part of a redesign exercise. We specifically chose remote usability tests because they’re quick and fairly affordable to run compared to focus groups and other lab-based design tests. The tests were all unmoderated, so users could interact with the Yelp website in the comfort of their own homes for the most natural results. We encouraged them to think out loud and recorded their reactions.

User research isn’t about writing complex reports; it’s about asking the right questions and using the evidence to support design decisions. In this piece, we’ll look at how we chose our users, how we set up the tasks, and what the results were…

Selecting users

One of the first steps is figuring out who is testing your designs. In our experience, demographics aren’t as important as behavior and familiarity with technology: how often do participants use similar platforms, and how comfortable are they?

Yelp has a huge user base (138 million unique monthly visitors, according to Yelp’s Q2 2014 numbers), so our redesigned site still needed to be familiar to the average current user; it wouldn’t make sense to alienate existing power users in favor of wooing first-time users. We didn’t screen on age, gender, income level, or experience using the Web, since Yelp users come from all backgrounds.

Because we were handling qualitative data, we didn’t need to worry about statistical significance. Following industry best practice, we ran our study with a total of 5 users, which should reveal around 85% of usability issues (good enough for this exercise).

One of the tasks required users to log in to an account, so we needed two segments in our test base: one with Yelp accounts (3 users), and one without (2 users). For the segment with Yelp accounts, we only selected participants who had been Yelp users for less than 6 months, to reduce the likelihood that they would be power users. Finally, for the sake of simplicity, we only tested Yelp’s website on desktop, not on mobile devices. (If this were more than an exercise, we would have tested the experience on as many devices as possible.)
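The “around 85%” figure comes from the problem-discovery model popularized by Nielsen and Landauer, which assumes each participant independently uncovers any given usability problem with a probability of roughly 31%. The snippet below is a minimal sketch of that arithmetic; the 31% detection rate is the commonly cited average, not something we measured ourselves.

```python
# Problem-discovery model behind the "test with 5 users" rule of thumb.
# Assumption: each participant uncovers any given usability problem with
# probability p ~= 0.31 (the commonly cited average), independently.
def share_of_problems_found(users: int, p: float = 0.31) -> float:
    """Expected fraction of usability problems uncovered by `users` testers."""
    return 1 - (1 - p) ** users

for n in range(1, 9):
    print(f"{n} users -> ~{share_of_problems_found(n):.0%} of problems found")

# 5 users -> ~84%, which is where the "around 85%" figure above comes from.
```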

Creating tasks for our users

Every usability test should start with the question, “What do we want to learn?” For us, we wanted to learn how semi-frequent Yelp users complete very common tasks (to identify which features were most important) and at least one less-common task (to test the intuitiveness of advanced features). We gave all users these types of common tasks:
  • Focused task: Find a business based on very specific parameters
  • Open-ended task: Find a business without being given very many guidelines
  • Highly specific task: Look up a specific location to learn a specific piece of information
We wanted to learn when both user groups chose to search versus browse, how they interacted with filters, and how they made a decision about which business to visit. As for the less common tasks, we provided a different task for each user group. Since we had heard several complaints from registered Yelp users about the Bookmark and Lists features, we asked registered users (Group 1) to save businesses for later reference. For unregistered users (Group 2), we asked them to find an event. Below are all of the tasks we assigned. After each task, we asked users whether they had been able to complete it successfully, and how easy or difficult they found it (known as the Single Ease Question; a small tallying sketch follows the task lists).

Shared tasks (assigned to both groups)
  1. Imagine you need to reserve a private dining space for a group of 15 people. You are looking for an Italian restaurant with a classy ambiance. Your budget is about $20 per person. Try to find a restaurant near you that matches all of these needs.
  2. Imagine you are driving through Boise, Idaho, and your car starts to make a strange noise right as you’re about to stop for the night. Your passenger recommends 27th St Automotive. Use Yelp to find out if they are open at 8:00 pm on Tuesday.
Group 1 tasks (Yelp account holders)
  1. Imagine your best friend is having a birthday soon, and you’ll be planning a party. Find 10 bars or lounges near where you live that you would be curious to look into later for the party. Save them so that you can easily find them again on Yelp.
  2. Go to the place where you saved the 10 bars for your best friend’s party. Keeping his or her tastes in mind, choose one that would be a good match.
Group 2 tasks (not account holders)
  1. Use Yelp to find a new restaurant near you that you haven’t been to yet. Spend no more than 5 minutes looking.
  2. Imagine you are looking for something fun and unique to do in your neighborhood this weekend. Try to find a concert, play, or other event using Yelp.
Once this was all done, it was time to start testing. Within about an hour we had the results back and could watch the users’ reactions and analyze the data.
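The Single Ease Question is a single post-task rating: “Overall, how difficult or easy was the task to complete?” on a 7-point scale (1 = very difficult, 7 = very easy). With only five participants, the responses fit in a few lines; the sketch below shows one way to tally completion rates and average SEQ scores per task. The task names and scores are hypothetical, for illustration only; they are not our actual results.

```python
from collections import defaultdict

# Hypothetical post-task responses (illustrative only, not the study's data).
# Each entry: (task id, completed the task?, SEQ rating on a 1-7 scale).
responses = [
    ("reserve-private-dining", True, 5),
    ("reserve-private-dining", True, 6),
    ("check-opening-hours", False, 3),
    ("check-opening-hours", True, 4),
    ("save-bookmarks", True, 2),
]

by_task = defaultdict(list)
for task, completed, seq in responses:
    by_task[task].append((completed, seq))

for task, results in by_task.items():
    completion = sum(done for done, _ in results) / len(results)
    avg_seq = sum(score for _, score in results) / len(results)
    print(f"{task}: {completion:.0%} completed, mean SEQ {avg_seq:.1f}")
```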

Breaking down the usability data

To compare against the qualitative data we now had, we also ran a quantitative study with 35 users, consisting of a closed card sort and a first-click test. You can learn more about the quantitative user tasks, but we’ll just summarize the top insights here (a short sketch of how first-click results can be tallied follows the list):
  • The Search bar was the starting point for almost all tasks. It was also the preferred backup option when users weren’t sure how to interact with the site UI (e.g. searching for “Bars” instead of clicking the category). Our redesign definitely needed to prioritize the Search bar.
  • The Events tab wasn’t noticeable. When asked to find an interesting activity, one user went to the Search bar while the other navigated through the Best of Yelp section. If we wanted users to actually interact with the Events feature on Yelp, we would need to make it easier to find.
  • The price categories weren’t clear. When given a budget to find a restaurant, some users weren’t sure what the dollar signs meant. In our new design, we added price ranges to the symbols.
  • The filters aren’t prioritized correctly. People didn’t use 7 of Yelp’s 47 filters, and the most popular filters that arose in testing (such as “Accepts Credit Cards” and “Open Now”) take several clicks to access. Our redesign reorganizes filters into clusters of 4 for easier access.
  • Photos are a key part of the experience. When asked to find restaurants with a certain ambiance, users relied on photos the most. Our redesign makes Yelp more visual.
  • Bookmarking needs to be simpler. Currently, you can’t just save a restaurant or business straight from the search results — you need to visit each individual page to bookmark them. Our redesign lets you save a business with one click on the search results page.
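A first-click test reduces to a simple question: did each participant’s first click land on the element the task intended? The sketch below shows how that success rate can be computed from raw click records; the element names and data are made up for illustration and are not Optimal Workshop output.

```python
from collections import Counter

# Hypothetical first-click records (illustrative only).
# Each entry: (task id, element the participant clicked first).
first_clicks = [
    ("find-an-event", "search-bar"),
    ("find-an-event", "events-tab"),
    ("find-an-event", "search-bar"),
    ("filter-by-price", "price-filter"),
    ("filter-by-price", "search-bar"),
]

# The element we hoped participants would click first for each task (assumed).
intended = {"find-an-event": "events-tab", "filter-by-price": "price-filter"}

totals, hits = Counter(), Counter()
for task, clicked in first_clicks:
    totals[task] += 1
    hits[task] += int(clicked == intended[task])

for task in totals:
    rate = hits[task] / totals[task]
    print(f"{task}: {rate:.0%} clicked the intended element first")
```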
To see how these insights were reflected in the new design, you can play with the low-fidelity Yelp prototype and check out the final high-fidelity prototype.

Featured image, testing image via Shutterstock.

Jerry Cao

Jerry Cao is a content strategist at UXPin, the wireframing and prototyping app, where he develops in-app and online content. To learn the methods, tools, and processes of UX prototyping, download the free e-book, The Guide to Prototyping.
