
Why it’s never a good idea to test your idea

source link: https://www.mindtheproduct.com/why-its-never-a-good-idea-to-test-your-idea/

Product managers are bombarded with feature requests and ideas every day. Even in companies with deep product knowledge, they’re routinely asked to test those ideas (sometimes carelessly referred to as designs) as part of their daily routine. Here is why you shouldn’t do so as a product manager.

Preface

The following story is based on true events; only the names have been omitted.

You get to work early in the morning for an unplanned meeting with a key stakeholder. He texted you last night to talk about a revolutionary idea he came across while using another product. After half an hour or so, he finishes his pitch and waits for you to nod, but you are puzzled and unsure what to say because you can’t connect the idea to any of your customers’ problems. For the next five minutes you try to explain the risks involved, but he interrupts you: “Undoubtedly, it would move the needle. At least it is worth testing. Okay? Build an MVP or something.”

Over the following days, you discuss the idea with your teammates, turn the wireframes into a prototype with the product designer’s help, and prepare some documents (or a few user stories, if you follow Scrum practices) to communicate the exact requirements to the developers. As usual, you choose an A/B test so your results will have statistical significance, which means a handful of new backend endpoints and a fair amount of client-side code must be developed. The team’s estimate is 5-7 workdays. Exactly two weeks after your initial meeting with the stakeholder, the Google Optimize results appear on your screen, showing a 3% change in conversion rate. Everything seems fine, except it’s a drop instead of a lift! Following Google Optimize’s guidance, you wait another two weeks (28 days in total), but the situation only worsens.

And here begins the nightmare.

The stakeholder shows up saying “It would have worked if you just ….” and proposes an incremental development. The product designer starts iterating on her design to make another prototype. And the engineers ask you what to do now.

What happened?

The product manager in the story above made several (unfortunately common) mistakes that resulted in a major hassle. Let’s discuss them one by one so that, by the end of this article, you won’t find yourself in similar trouble.

We can group the mistakes into the following categories:

Falling in love with the solution instead of the problem

Joshua Seiden, in “Outcomes Over Outputs”, states that “features can be finished and delivered and worked perfectly but still not deliver any value”. When we fall in love with our solutions (which happens more often than we’d like to admit), we selfishly forget the customer and focus on our own desires instead. It’s no surprise, then, that customers return the favour: they don’t care about our solution either.

Misunderstanding the MVP concept

Eric Ries introduced the MVP as “a version of a new product which allows a team to collect the maximum amount of validated learning about customers with the least effort”. Marty Cagan, the product guru, has also described the MVP as a prototype used in the discovery phase to learn from your customers. In the story above, the product manager wasted a lot of time and resources by mistaking the MVP for a messy product (in need of serious refactoring) and learned nothing helpful after a month. Remember the Dropbox fake-video MVP? It was not a product at all, yet it perfectly fulfilled its sole mission of validating the desirability risk.

Testing ideas instead of assumptions

Teresa Torres has devoted a complete chapter of her wonderful book, Continuous Discovery Habits, to this pitfall, so I’ll keep this short. When you test an idea, you are testing a set of underlying assumptions simultaneously. Take video-recorded product reviews on Amazon, for example, and say the desired outcome is to increase the amount of data gathered by each review. If you test the whole idea right away, you won’t know why it did or didn’t work. Was it because users couldn’t upload videos from their devices? Did they find it weird? Was their product digital or physical? The list goes on. When you test assumptions instead, you learn from every move.

Making a yes/no choice

Instead of debating whether the proposed solution is right or wrong, the product manager could ask the stakeholder what problem he thinks it solves and what other solutions might address it. By turning a yes/no choice into a compare-and-contrast one, it becomes possible to iterate on solutions as assumptions are invalidated.

Conclusion

Next time a stakeholder tries to sell you an idea, remember the story of a product team stuck in a negative loop of incremental development and A/B testing, hoping for a one-off success that can’t be learned from or repeated elsewhere. I’ll cover stakeholder management and related topics in more depth in future articles.

Good luck building great products!

Bonus

As we reach the end of this article, I want to show you one of the most underrated types of MVP for testing your next assumptions, and how to use it. A ‘piecemeal MVP’ is one where you embed or repurpose another product or service inside your own in a new way. Why do I think it’s underrated? Because if you can put an iframe on your website, you can tackle usability (and desirability) risks upfront with virtually no code.

Let’s say you’ve made a prototype in Figma but you’re unsure about its usability. There are tools for testing it with random users, but your product may serve a specific customer segment, and users often behave differently in an experiment environment than in their unselfconscious day-to-day usage. With a piecemeal MVP, you can embed the Figma prototype as an iframe in your customer journey and watch users’ interactions with tools like MS Clarity or Hotjar.
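As a rough illustration (a minimal sketch under my own assumptions, not the author’s actual setup), here is how such an embed might be gated to a small test group in TypeScript. The embed URL, the storage key, and the 5% share are placeholders; take the real embed snippet from Figma’s share dialog.

```typescript
// Minimal sketch: show the embedded prototype to a small slice of real visitors.
// PROTOTYPE_EMBED_URL and TEST_GROUP_SHARE are placeholders, not real values.
const PROTOTYPE_EMBED_URL =
  "https://www.figma.com/embed?embed_host=share&url=<your-prototype-link>";
const TEST_GROUP_SHARE = 0.05; // e.g. roughly 5% of visitors

function maybeEmbedPrototype(container: HTMLElement): void {
  // Assign the visitor once and persist the assignment for the session,
  // so they get a consistent experience on every page view.
  const assigned =
    sessionStorage.getItem("proto-test") ??
    (Math.random() < TEST_GROUP_SHARE ? "test" : "control");
  sessionStorage.setItem("proto-test", assigned);
  if (assigned !== "test") return;

  // Swap the prototype in; session-recording tools running on the host page
  // (e.g. MS Clarity or Hotjar) can then capture how visitors reach and use it.
  const iframe = document.createElement("iframe");
  iframe.src = PROTOTYPE_EMBED_URL;
  iframe.width = "100%";
  iframe.height = "600";
  iframe.allowFullscreen = true;
  container.appendChild(iframe);
}
```

Keeping the assignment in sessionStorage is just one simple way to keep the experience stable within a visit; a real experimentation tool would handle the bucketing for you.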

Be careful when you do so: you only need a small percentage of visitors in the test group (I use Cochran’s formula for a rough estimate).
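For context, Cochran’s formula estimates the required sample size as n0 = z²p(1 - p)/e², optionally corrected for a finite population. A tiny sketch of that arithmetic (the function name and the example numbers are mine, for illustration only):

```typescript
// Cochran's sample-size formula, with an optional finite-population correction.
// z: z-score for the confidence level (1.96 ≈ 95%), p: expected proportion
// (0.5 is the most conservative choice), e: margin of error, N: population size.
function cochranSampleSize(z: number, p: number, e: number, N?: number): number {
  const n0 = (z * z * p * (1 - p)) / (e * e);
  if (N === undefined) return Math.ceil(n0);
  return Math.ceil(n0 / (1 + (n0 - 1) / N));
}

// Example: 95% confidence, p = 0.5, 5% margin of error, 20,000 weekly visitors
// → roughly 377 visitors needed in the test group.
console.log(cochranSampleSize(1.96, 0.5, 0.05, 20000));
```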

Another use case: suppose you want to test whether people would provide helpful audio responses alongside their written product reviews (a desirability risk). You could settle for a fake button, but that won’t let you check helpfulness. Alternatively, you can embed a customized audio-survey iframe into your website, as the sketch below illustrates. There are countless uses of this kind of MVP, limited only by your creativity, and I’d be happy to discuss it further in future articles if you’re interested.
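Here is a hedged sketch of that fake-door-plus-survey idea; the survey URL, button copy, and element ID are placeholders I made up, and a real implementation would report the click to your analytics tool rather than the console.

```typescript
// Minimal sketch: a fake-door button that, once clicked, swaps in an embedded
// audio-survey iframe so you can judge whether the recordings are actually helpful.
const AUDIO_SURVEY_URL = "https://surveys.example.com/audio-review"; // placeholder

function mountAudioReviewFakeDoor(container: HTMLElement): void {
  const button = document.createElement("button");
  button.textContent = "Add an audio review";

  button.addEventListener("click", () => {
    // The click alone already signals desirability, even if nobody records anything.
    console.log("audio-review fake door clicked");

    // Replace the button with the survey so willing users can leave a real recording.
    const iframe = document.createElement("iframe");
    iframe.src = AUDIO_SURVEY_URL;
    iframe.width = "100%";
    iframe.height = "320";
    button.replaceWith(iframe);
  });

  container.appendChild(button);
}

// Usage (hypothetical element ID):
// mountAudioReviewFakeDoor(document.getElementById("review-form")!);
```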
