Just a week earlier, you started with a big challenge and big ideas. Now it's time to test your solution with real users from the product's target audience. Their reactions and feedback will show exactly what's working and what isn't with the prototype.

Users don't show up on testing day

What this looks like

😑  Users skip their scheduled tests without any warning.

After four days of sprint sessions, booking testers, and creating a great prototype, it's time for user testing. But what happens if some of your testers don't show up?

Why it happens

🏃‍♂️  People have busy lives, or the tests have weak incentives.

One reason that people skip user tests is that the incentive for doing them wasn't strong enough. This can happen when you recruit friends and ask them to give up an hour of their time out of kindness.

Another reason is that most people are busy and have a lot of things going on. User tests can feel like something that they can skip without consequences if other priorities pop up.

How to prevent it

💸  Offer a great incentive to make user tests worthwhile.
🤞  Book two extra testers, just in case.

First, make sure your user tests have an appealing incentive. For example, we give user testers a $50 Amazon voucher for participating. This incentive can be different, based on who you're interviewing, and it can be something other than money. Just make sure that your incentive is something that people actually want.

Second, book two extra user testers in case some people bail despite your awesome incentive. If you get any no-shows, you'll still be covered. If no one skips their user test, you can either cancel your extra testers or interview them for bonus insights.

You recruit users who aren't your target audience

What this looks like

😔  You spend time doing user tests with the wrong users.

When we first started running Design Sprints and user tests, this happened to us. We'd get on a call and start a user test, only to realise that the user wasn't in the product's target audience.

This is demoralising and frustrating. It makes you wonder, "Why am I even interviewing this person? It's a waste of my time, their time and our incentive."

Why it happens

🤯  There's a poor or nonexistent screening process for user testers.

There's a simple reason for this issue — poor screening. Perhaps anyone could book a user test by just selecting a Calendly link, or perhaps the form to sign up was little more than a name and email.

How to prevent it

📑  Create an application with relevant profiling questions.
📞  Do a quick screening call, if possible.

It's important to make sure you're screening for the right user testers. Recruiting the wrong people can result in irrelevant insights, waste your time, and even defeat the purpose of the sprint.

Selecting user testers should involve two key steps:

  1. Ask potential testers to fill out a form. This should include enough details to show whether the person is in your target audience. You can ask questions about demographics, profession, interests, behaviours, etc.
  2. If possible, do a 5-minute screening call with shortlisted testers. You don't need to schedule this. Just make sure the form asked for each person's phone number, and give them a quick call. Before you dive into a 45-minute user test, a 5-minute call can give you comfort that you've got the right person.

Users like everything on the prototype

What this looks like

😍  People only had positive feedback during user tests.

Lo and behold, the user tests went surprisingly well! When you're presenting the results of your user tests, you're excited to report that everyone only said nice things. But is that actually a good thing?

Why it happens

😢  You made people think that criticising the prototype meant criticising you.
😡  You got defensive during the user tests.

There are a couple of things you may have accidentally done to encourage positive feedback.

First, you may not have separated yourself and your emotions from the prototype. You may have started the user tests by talking about how amazing the product is and how much work you put into it. This could even be followed by demoing everything in the product and talking about how you designed it. If you build up the idea that you care about the prototype and worked really hard on it, people will feel like they shouldn't criticise it.

Second, you may have gotten defensive and argued back during user tests. For example, if someone critiqued a feature, you may have responded with "This is the reason we did that" or "Here's why that will work". These types of answers signal to user testers that criticism isn't welcome, and they'll stop sharing their feedback.

How to prevent it

📏  Create distance between yourself and the prototype.
✨  Don't hype up the prototype.
⁉️  Stick with open-ended questions.

Negative feedback is important for finding problems and improving the prototype, so make sure that you invite criticism during user tests.

First, create some distance between yourself and the prototype. Tell the user that you didn't create it (even if you did), so they won't feel like they're insulting you if they don't like it. Explain that the more honest and critical their feedback is, the more helpful the session will be.

Second, don't demo or hype up the prototype. There's no need to walk through the features, because you may be inclined to explain them in a positive light. Just set a bit of context about the product, then let the user play with it. Observe them, but don't critique what they do or say.

Third, if you need to ask them something, stick with open-ended questions. Never ask leading questions like "Do you like this?" You're already suggesting the answer within your question. Instead, ask questions like "What do you think about this?"

🤩  Pro Tip:

User testers aren't always right! Sometimes they'll misunderstand your product or think about an irrelevant use case. You can clarify a bit, but don't get defensive or discourage their train of thought.

Ultimately, it doesn't matter if a user tester's critiques are wrong. You don't have to act on their feedback. They're just suggestions. It's better to keep ideas flowing, rather than correcting observations that you don't agree with.

A user hates your prototype

What this looks like

🤬  Someone doesn't have a single nice thing to say about the prototype.

A user opens the prototype and they launch straight into criticism. It seems like they have nothing nice to say about the product, and you feel like you're under attack.

Why it happens

😳  The user is very honest or poorly chosen.

If a user tester wasn't chosen well, they may not be in the product's target audience. That means they'll completely misunderstand and dislike the product since it doesn't make sense for them.

Other user testers may understand the product well and have valuable insights. But if they are naturally blunt and critical, their feedback may sound harsh and overly negative.

How to prevent it

🙈  Gently clarify misunderstandings, or just ignore them.
😅  Sit back and let the criticism roll in. It's good for you!

In the first case, when someone is off track or misaligned with the product, try giving a bit of explanation. If someone doesn't understand the problem the product is solving, explain it. If they're not clear on the intent of a feature, clarify it. As long as you're not defensive, this could help set them on the right track. And if it doesn't, just nod along and toss their results once you finish.

In the second case, when someone justifiably dislikes your product, consider yourself lucky! Critical users can result in the best user tests, since they give feedback that no one else has dared or bothered to give. Their insights can be the deciding factor in how to move forward. If you discourage their input or become defensive, you'll lose their valuable perspective.

Regardless of its cause, sit back and let bad feedback happen. It's either super helpful or meaningless, and trying to stop it will only cause more trouble.

