Project: Product Reviews | Role: Product Manager
The Challenge.
For this project, I found myself in what is likely a familiar situation for fellow Product Managers. A stakeholder, or several, got together and decided that something on the site needed to be improved without fully understanding why they wanted it. And thus, “Improving Product Reviews” was born as a high-priority initiative for my team to explore.
What does “Improving Product Reviews” even mean?
We asked ourselves the same question, often. No one had a clear answer. So I put on my thinking cap and started digging.
Turns out, improving reviews wasn’t the real goal. Increasing reviews was…
Defining Goals.
Although the main idea on everyone’s lips at the start of this project was a complete review-flow redesign, I started by taking several steps back. I don’t like to jump straight to solutions without understanding what I’m solving for, so I knew we needed to dig deep into the actual goals behind the request. To do that, I gathered all the data I possibly could about reviews. What do they mean to customers? Why don’t more customers leave them? Do they offer any real benefit to downstream metrics? Let’s find out…
Research.
Through customer research, we confirmed that access to reviews is an important factor that can make or break a sale. Reviews offer critical information that aids in decision making. However, time and time again, the customers we interviewed said they were unlikely to leave a review without an incentive.
Key Takeaways:
We need to find a way to make it quick and easy for customers to leave reviews
We need to encourage customers to include plenty of detail in their reviews to assist in purchasing decisions
Analytics.
Looking at the data, we found that star rating and review count are both drivers of PDP-to-cart conversion: roughly 1% for each star and 0.02% for each review (a back-of-envelope sketch of that math follows the takeaways below). However, nearly 50% of our products didn’t have any reviews at all! We also uncovered a telling trend: review completion rates dropped significantly for customers who were prompted to sign in while leaving a review, compared to customers who were already authenticated.
Key Takeaways:
Having more reviews on our product pages can have a positive impact on downstream metrics
Customers who hit a sign-in gate while leaving a review are less likely to follow through to completion
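To make those coefficients concrete, here’s a back-of-envelope sketch in TypeScript. The linear model and the example product are illustrative assumptions based on the numbers above, not a forecast:

```typescript
// Rough, directional model from our analytics: each star of average rating
// correlated with ~1 point of PDP-to-cart conversion, and each review with
// ~0.02 points. Back-of-envelope sizing only.
function estimatedConversionLift(avgStars: number, reviewCount: number): number {
  return avgStars * 1.0 + reviewCount * 0.02; // percentage points
}

// Hypothetical example: a product with no reviews gains fifty 4-star reviews.
console.log(estimatedConversionLift(4, 50)); // ≈ 5 percentage points
```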
A/B Testing.
Using what we learned from the research and analytics, I created a set of hypotheses that we could validate through A/B testing. The primary KPI for both tests was the number of reviews submitted, and of course we also monitored downstream metrics like CVR, Rev/Session, and UPT:
If we remove the sign-in gate, we will increase the number of reviews per product page
If we add an additional write-review CTA higher up on the product page, we will increase the number of reviews per product page
I partnered with the Analytics team to build two A/B tests around these hypotheses. Turns out, my hunches were right!
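To show the kind of significance check a test like this typically gets, here’s a minimal two-proportion z-test sketch in TypeScript. The visitor and submission counts are entirely hypothetical, and I’m not claiming this is the exact method the Analytics team used; it just illustrates how real lift gets separated from noise:

```typescript
// Two-proportion z-test: is the treatment group's submission rate
// significantly different from the control group's?
interface Variant {
  visitors: number;    // customers who started the review flow
  submissions: number; // customers who submitted a review
}

function twoProportionZ(control: Variant, treatment: Variant): number {
  const p1 = control.submissions / control.visitors;
  const p2 = treatment.submissions / treatment.visitors;
  // Pooled rate under the null hypothesis of "no difference"
  const pooled =
    (control.submissions + treatment.submissions) /
    (control.visitors + treatment.visitors);
  const stderr = Math.sqrt(
    pooled * (1 - pooled) * (1 / control.visitors + 1 / treatment.visitors)
  );
  return (p2 - p1) / stderr; // |z| > 1.96 ≈ significant at the 95% level
}

// Entirely hypothetical counts: sign-in gate on (control) vs. off (treatment)
console.log(
  twoProportionZ(
    { visitors: 4000, submissions: 520 },
    { visitors: 4000, submissions: 660 }
  ).toFixed(2)
);
```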
Test 1: Removing Sign In Gate
[Charts: Number of Reviews Submitted · Number of Customers Submitting Reviews]
Measuring Success.
Although the sample size was relatively small during the test period, the improvement was noticeable. By removing the sign-in gate, we increased the number of reviews left in the test group by up to 27% for some of our brands. Similarly, the number of customers leaving reviews saw a lift of up to 30% for some brands. We expected results to keep improving from there, as more customers who had been frustrated by the gate realized the requirement was gone. Knowing the development effort to make this change was low, this was a no-brainer to prioritize for our roadmap. After all, more reviews = better customer experience and higher conversion rates!
Test 2: Additional Write Review CTA
[Chart: Engagement with Write a Review CTA]
Measuring Success.
What a quick win! By adding an additional write-review call-to-action on the PDP, we significantly increased engagement with the write-review form. Of course, opening the form isn’t enough; we still need people to fill out the fields and click submit. But combined with the completion lift we saw from removing the sign-in gate, this second CTA pointed to real gains for us. Weighed against the minor engineering effort to build it, this was another easy call to prioritize.
Ongoing Improvements.
In addition to the improvements we saw from adding a second write-review call-to-action and removing the sign-in gate, I returned to our original feature ideas to see if there were additional quick wins that would push us further. Weighing level of effort against potential return on investment, I identified a few more low-effort features I thought we could implement without hurting the delivery roadmap for the year. All of the features below were, or will soon be, completed with minimal development resources, letting us capitalize on the positive customer impact with little to no impact on other planned work.
Auto-Populating Reviews Data
After auditing the types of questions we ask across our various review templates, I found a common theme: many of the answers likely don’t change from product to product (e.g., eye color). I realized that if we auto-populated those fields using what we already knew from a customer’s previous reviews, leaving a review would become as easy as 1-2-3!
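As a rough illustration, here’s a minimal TypeScript sketch of the pre-fill idea. The field names and the fetchLastReviewAnswers helper are hypothetical stand-ins, not our review platform’s actual API:

```typescript
// Answers that rarely change between products, pulled from the
// customer's most recent review (hypothetical field names).
interface ReviewProfile {
  eyeColor?: string;
  skinTone?: string;
  ageRange?: string;
}

// Assumed helper that returns the customer's answers from their last review.
declare function fetchLastReviewAnswers(customerId: string): Promise<ReviewProfile>;

async function prefillReviewForm(customerId: string): Promise<void> {
  const profile = await fetchLastReviewAnswers(customerId);
  for (const [field, value] of Object.entries(profile)) {
    const input = document.querySelector<HTMLInputElement>(`[name="${field}"]`);
    // Only pre-fill empty fields so the customer can always override.
    if (input && value && !input.value) input.value = value;
  }
}
```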
Reviews Syndication
How do we get more reviews? Easy! I looked to products that are either shared across our internal brands or sold on external sites. By syndicating reviews across these catalogs, we could increase our review counts without asking customers to lift a finger. Best-case scenario!
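Conceptually, syndication is just grouping the same physical product across catalogs and pooling its reviews. A minimal sketch follows; matching on UPC is my assumption for illustration, since any shared identifier would work:

```typescript
interface Review { productId: string; rating: number; text: string; }
interface Product { id: string; brand: string; upc: string; }

function syndicateReviews(
  products: Product[],
  reviewsByProduct: Map<string, Review[]>
): Map<string, Review[]> {
  // Group products by shared UPC.
  const byUpc = new Map<string, Product[]>();
  for (const p of products) {
    const group = byUpc.get(p.upc) ?? [];
    group.push(p);
    byUpc.set(p.upc, group);
  }
  // Every product in a group gets the union of the group's reviews.
  const syndicated = new Map<string, Review[]>();
  for (const group of byUpc.values()) {
    const pooled = group.flatMap((p) => reviewsByProduct.get(p.id) ?? []);
    for (const p of group) syndicated.set(p.id, pooled);
  }
  return syndicated;
}
```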
Open Write Review Tool Automatically
I also did a deep dive into every way customers reach our review forms. One of the highest-traffic entry points is reminder emails, yet our emails used to drop customers on the PDP and make them do all the work of finding the review form. By simply adding a hash to the email links, we were able to open the write-review form automatically when a customer clicks through from an email. Easy as pie, no?
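The mechanism is a simple fragment check on page load. A minimal sketch, assuming a #write-review hash and an existing openWriteReviewForm helper (both hypothetical names):

```typescript
// Reminder emails link to the PDP with "#write-review" appended, e.g.:
//   https://example.com/products/eyeliner-123#write-review
// On load, the page checks the hash and launches the review form.

declare function openWriteReviewForm(): void; // assumed existing helper

window.addEventListener('DOMContentLoaded', () => {
  if (window.location.hash === '#write-review') {
    openWriteReviewForm();
  }
});
```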
Reflections.
Sometimes the best answer isn’t the obvious one. It would have been easy to do what my stakeholders asked and just redesign the review flow; instead, by taking a step back and digging into the true goals of the request, I was able to pivot us toward smaller features that had a major, measurable impact on our sites. And the best part? They didn’t require multiple sprints’ worth of development capacity. We delivered most of them with limited (or no) development work, freeing my team to keep shipping other major features alongside the product reviews work.
This project was also a fantastic opportunity to partner with the Analytics team and develop clear, strong testing strategies for solving problems. By distilling the very large ask of “improving reviews” down to the more manageable “increasing reviews,” we had a clear problem to solve together.
Next time you leave a review for a product, remember that it might just make another customer’s day!