
In its latest Technology Quarterly, The Economist looks at how user reviews stimulate ecommerce.

It finds that once a product has about 20 reviews, sales conversion rates start to rise:

The sheer volume of reviews makes far more difference, according to Google’s analysis of clicks and sales referrals. “Single digits didn’t seem to move the needle at all,” says Mr McAteer. “It wasn’t enough to get people comfortable with making that purchase decision.” But after about 20 reviews of a product are posted, “We start to see more reviews—it starts to accelerate,” says Sam Decker, the chief marketing officer of Bazaarvoice, a firm that powers review systems for online retailers.

His company’s research shows that visitors are more reluctant to buy until a product attracts a reasonable number of reviews and picks up momentum. In a test with Kingston, a maker of computer memory, Bazaarvoice collected reviews of Kingston products from the firm’s website and syndicated them to the website of Office Depot, a retailer. As a result there were more than ten reviews per product, compared with one or two for competitors’ offerings. The result was a “drastically” higher conversion rate, which extended even to other Kingston products that lacked the additional reviews.

Even if some reviews are negative, sales still increase:

Online retailers have generally been reluctant to allow users to leave comments, says John McAteer, Google’s retail industry director, who runs shopping.google.com, the internet giant’s comparison-shopping site. But a handful of bad reviews, it seems, are worth having. “No one trusts all positive reviews,” he says. So a small proportion of negative comments—“just enough to acknowledge that the product couldn’t be perfect”—can actually make an item more attractive to prospective buyers.

However, some books on Amazon now have thousands of reviews, more than enough for a potential buyer to draw an overall conclusion. So why do people continue to write new reviews for these products, even years afterwards?

Clay Shirky suggests that in many cases, writing a review is more like writing fan mail (or hate mail) for a product, and the people who post reviews do not really expect them to be read.

Whereas new reviewers keep appearing long after a book is published, blog comments show quite a different pattern of behavior.

“You can probably have a decent discussion until you get to about 350 comments,” says Markos Moulitsas, the founder of Daily Kos, a popular left-leaning political site. But after that, he says, “most outside people will stay away from the thread, and further growth will come from people already inside that thread carrying forth a discussion, debate, or argument.” Such discussion threads are more of a conversation, and the page they inhabit usually has a limited lifespan during which people continue to post—unlike the Amazon pages for the “Harry Potter” books, which continue to attract reviews even today, years after the books’ publication.

Part of this is because the “pivot” of user engagement for a review is the product, whereas the “pivot” of user engagement for a blog is the conversation thread. Since the product is evergreen to new users, it will continue to attract reviews. But a stale conversation in the comments on an old post is unlikely to draw new comments. It is usually clear that the other debaters have moved on, and there is little incentive to speak to an empty room. Knowing the right primary pivot for your social media product drives a lot of design decisions.

This is reinforced by design: many blogs alert you to new comments on a post you have commented on, while almost no ecommerce stores alert you to new reviews of products that you have reviewed. As a result, blog comments turn into conversations between engaged participants, whereas product reviews remain standalone verdicts addressed to future buyers. As always, behavior and culture are a function of UI.
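To make the “pivot” distinction concrete, here is a minimal, hypothetical sketch of the two notification models. All names in it (CommentThread, ProductReviews, notify) are invented for illustration and do not come from any real blog or ecommerce platform; it simply encodes the design split described above: a blog thread subscribes each commenter to future replies, while a product page accumulates reviews without notifying anyone.

    # Hypothetical sketch: two "pivots" for user engagement.
    # None of these names come from a real platform; they illustrate
    # the design split described in the post.

    class CommentThread:
        """Blog pivot: the conversation. Commenting subscribes you to replies."""

        def __init__(self, post_title):
            self.post_title = post_title
            self.comments = []        # (author, text) in conversation order
            self.subscribers = set()  # everyone who has commented so far

        def add_comment(self, author, text):
            # Alert prior participants, pulling them back into the thread.
            for participant in self.subscribers - {author}:
                notify(participant, f"New reply on '{self.post_title}'")
            self.comments.append((author, text))
            self.subscribers.add(author)  # the author now follows the thread

    class ProductReviews:
        """Store pivot: the product. Reviews pile up; nobody is alerted."""

        def __init__(self, product_name):
            self.product_name = product_name
            self.reviews = []  # standalone (author, stars, text) entries

        def add_review(self, author, stars, text):
            self.reviews.append((author, stars, text))
            # No subscribers, no notification: each review addresses the
            # product and future buyers, not the earlier reviewers.

    def notify(user, message):
        print(f"to {user}: {message}")  # stand-in for an email or on-site alert

    thread = CommentThread("How many reviews is enough?")
    thread.add_comment("alice", "Great post!")
    thread.add_comment("bob", "I disagree.")   # alice is notified and may reply

    page = ProductReviews("FastFigures")
    page.add_review("carol", 5, "Does everything I need.")  # silence; carol moves on

The asymmetry is a single notification loop, but it is enough to turn one page into a conversation and the other into a guest book.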


  • Elia Freedman (http://www.eliainsider.com)

    Great article. I read it when you wrote it a few weeks ago and came back to find it and post. I have some evidence that this might be accurate. We released our first iPhone app (we’ve done other mobile and desktop platforms in the past) for our financial calculator, FastFigures. We started with a solid base of users, so we got off to a very quick start but faded off the list very quickly (~8 days). Sales were very steady for the next few weeks as our reviews piled up. (FastFigures has excellent reviews; 85% of US reviewers have given us 5 stars.) During this time we played with AppStore search terms and have maintained a Top 50 Paid Finance App ranking. (If you are not familiar, that puts FastFigures in the first two pages of Finance apps in the on-device iPhone AppStore.)

    Last week, though, FastFigures hit 18 reviews (again, in the US AppStore). Our sales immediately shot up 50% and are still rising a few days later. Obviously this is a limited sample and there are many variables (competitor decisions, search rankings, etc.), but I know the sales weren’t influenced by external marketing, because almost all of our efforts have been focused on understanding the AppStore and where our “organic” sales ranking would be.

    I will be writing a blog entry on our first month in the AppStore and plan on referencing this article. Very influential. Thanks!
