HBS Working Paper Series
Optimal Aggregation of Consumer Ratings: An Application to Yelp.com
Consumer review websites such as Yelp.com leverage the wisdom of the crowd, with each product being reviewed many times (some with more than 1,000 reviews). Because of this, the way in which information is aggregated is a central decision faced by consumer review websites. Given a set of reviews, what is the optimal way to construct an average rating? We offer a structural approach to answering this question, allowing for (1) reviewers to vary in stringency (some reviewers tend to leave worse reviews on average) and accuracy (some reviewers are more erratic than others), (2) reviewers to be influenced by existing reviews, and (3) product quality to change over time. We apply this approach to reviews from Yelp.com to derive optimal ratings for each restaurant (in contrast with the arithmetic average displayed by Yelp). Because we have the history of reviews for each restaurant and many reviews left by each reviewer, we are able to identify these factors using variation in ratings within and across reviewers and restaurants. Using our estimated parameters, we construct optimal ratings for all restaurants on Yelp and compare them to the arithmetic averages displayed by Yelp. As of the end of our sample, a conservative finding is that roughly 25%–27% of restaurants are more than 0.15 stars away from the optimal rating, and 8%–10% of restaurants are more than 0.25 stars away from the optimal rating. This suggests that large gains could be made by implementing optimal ratings. Much of the gain comes from our method responding more quickly to changes in a restaurant's quality. Our algorithm can be flexibly applied to many different review settings.
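The three adjustments the abstract describes can be illustrated with a minimal sketch. This is not the paper's estimator; it is a hypothetical weighted average in which each review is de-biased by its reviewer's estimated stringency, weighted by the reviewer's precision (less erratic reviewers count more), and down-weighted with age so that the rating responds quickly to quality changes. The input tuple format and the `half_life` parameter are assumptions for illustration only.

```python
def optimal_rating(reviews, half_life=30.0):
    """Bias-corrected, precision- and recency-weighted average rating.

    reviews: list of (rating, stringency, noise_sd, age_days) tuples, where
    stringency is the reviewer's average deviation from consensus and
    noise_sd measures how erratic the reviewer is. All fields are
    hypothetical stand-ins for the paper's estimated parameters.
    """
    num = den = 0.0
    for rating, stringency, noise_sd, age_days in reviews:
        adjusted = rating - stringency           # undo reviewer-specific bias
        precision = 1.0 / (noise_sd ** 2)        # accurate reviewers count more
        recency = 0.5 ** (age_days / half_life)  # older reviews count less
        weight = precision * recency
        num += weight * adjusted
        den += weight
    return num / den

# A lenient but erratic recent review vs. a strict, precise older one:
reviews = [(5.0, +0.5, 1.0, 5.0), (3.0, -0.5, 0.5, 60.0)]
blended = optimal_rating(reviews)
```

With no stringency differences, equal noise, and fresh reviews, the sketch collapses to the arithmetic mean; the gap between the two grows exactly when reviewers differ or quality drifts, which is where the abstract locates the gains.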
Keywords: Demand and Consumers; Social and Collaborative Networks; Food and Beverage Industry