The Yelp Review Filter Is Broken

[Image: “people hate us on yelp”]

It is no secret that user reviews greatly influence consumers’ purchasing decisions. According to a study from Harvard Business School, a one-star increase on Yelp, a popular rating website, led to a 5 to 9 percent increase in revenue for businesses. So how do you know when a review is fake? Many of the largest user review websites, like Yelp and Amazon.com, are trying to develop automated ways to eliminate fake reviews. But how effective are their algorithms? And how many authentic reviews are thrown out in the process?

In our previous communications analytics research, How to Spot a Fake Online Review, we analyzed a corpus of reviews to uncover the linguistic trends that identify fake ones. Those results allow us to objectively evaluate the effectiveness of the tools currently used to find fake reviews and to test them against our own findings.

How Yelp Approves Reviews: Yelp uses a computer algorithm to filter out reviews it considers potentially fraudulent. These filtered reviews are not figured into a business’s overall score, but they remain available to view. To make the system harder to game, Yelp does not publicly disclose how its filtering process works. The company does state, however, that it errs on the side of the consumer, taking a conservative approach that sometimes places authentic reviews in the filtered section.

How We Analyzed Yelp’s Data: Yelp provides public access both to its accepted reviews and to the reviews it has filtered out as “suspicious.” We ran our quantified communications analytics on 50 “approved” restaurant reviews from Yelp and compared them with 50 “removed” reviews of the same restaurants that Yelp’s filter had discarded.
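
Our quantified communications analytics are proprietary, but the comparison itself is straightforward. The Python sketch below is a minimal, hypothetical illustration of that workflow, using tiny lexicon-based stand-ins (small positive and family word lists, whitespace tokenization) for the far richer measures in the actual study; the two-item review lists are placeholders for the two groups of 50.

    from statistics import mean

    # Tiny stand-in lexicons; the real analysis presumably uses far richer ones.
    POSITIVE = {"great", "amazing", "delicious", "love", "perfect", "wonderful"}
    FAMILY = {"family", "wife", "husband", "kids", "mom", "dad", "son", "daughter"}

    def features(review):
        """Simple per-review features: length, positivity rate, family-reference rate."""
        words = [w.strip(".,!?").lower() for w in review.split()]
        n = len(words) or 1
        return {
            "length": len(words),
            "positivity": sum(w in POSITIVE for w in words) / n,
            "family_refs": sum(w in FAMILY for w in words) / n,
        }

    def group_means(reviews):
        """Average each feature across one group of reviews."""
        feats = [features(r) for r in reviews]
        return {key: mean(f[key] for f in feats) for key in feats[0]}

    # Placeholders: in the study, each group held 50 reviews of the same restaurants.
    approved = ["The pasta was fine, but the wait was long and the room was loud.",
                "Solid tacos and slow service. I would probably go back once more."]
    filtered = ["Amazing! My wife and kids love this place. Perfect every time!",
                "Delicious food and wonderful staff. Perfect spot for the whole family."]

    print(group_means(approved))
    print(group_means(filtered))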

What We Found: In analyzing the effectiveness of Yelp’s filter, we were initially pleased to see that some of our findings on deceptive language line up with the filter’s behavior. Our research shows that deceptive writers tend to distance themselves from their writing and to offer general insight instead of direct information. Consistent with that, Yelp’s filtered reviews contained 271.4% more references to family and 28.5% more insight language than the approved reviews. However, the filtered reviews also diverge from known markers of deceptive language in ways that suggest many authentic reviews are being thrown out. Compared to the reviews Yelp accepted as authentic, the reviews its filter discarded had the following characteristics (each figure is the relative difference between the two groups’ averages; see the sketch after this list):

  • 33.8% more positive
  • 66.5% shorter
  • 2.8% fewer stories
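
To be concrete about how figures like these are read: each is the relative difference between the filtered group’s average and the approved group’s average for a given feature. A minimal sketch, reusing the hypothetical group_means output from the earlier example:

    def pct_diff(filtered_mean, approved_mean):
        """Relative difference of the filtered group vs. the approved group, in percent."""
        return (filtered_mean - approved_mean) / approved_mean * 100

    # For example, a positivity mean of 0.040 in the filtered group against
    # 0.030 in the approved group reads as roughly +33.3% "more positive";
    # a negative result on length would read as "shorter".
    print(round(pct_diff(0.040, 0.030), 1))  # 33.3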

Why the Yelp Review Filter Is Broken: Yelp approves reviews bearing attributes we have linked to deception and discards reviews with characteristics associated with authenticity.

  • 33.8% more positive – Studies have shown that deceptive language often carries negative sentiment, reflecting feelings of guilt or discomfort while lying. Yet we found the language in Yelp’s filtered reviews to be 33.8% more positive than in the accepted reviews.
  • 66.5% shorter – Research suggests that deceptive language tends to run longer, owing to a perceived need for more elaborate explanations. Yet the reviews Yelp filtered out were 66.5% shorter than the reviews its algorithm accepted.
  • 2.8% fewer stories – Our previous research revealed that deceptive reviews often wander into topics unrelated to the product or service at hand, most likely because fake reviewers have little to no experience with what they are reviewing. Yet the reviews Yelp filtered out contained 2.8% fewer of these stories than the ones it approved, a trait more consistent with authentic, first-hand reviews.

Why Others Agree: Many have claimed that Yelp’s filtering process is unfair, arguing that Yelp hides legitimate positive reviews, most often because the reviewers themselves are not very active on Yelp. A working paper from Harvard Business School found that more than 70% of Yelp accounts that had written only one review had that review filtered out. Some businesses have even tried suing Yelp over the practice.

Yelp Is Not the Only One with a Broken Review Filter: Many reviewers also publicly express confusion when their authentic reviews disappear from Amazon.com. Like Yelp, Amazon.com looks at a reviewer’s profile to help determine the authenticity of a particular review, and it will remove book reviews from anyone with an affiliation to the author, even genuine fans of the book.

In conclusion, there is still a great deal to learn about differentiating between genuine and fake reviews. We know that reviews matter: they drive sales, pricing, popularity, and reputation. We know that companies and online platforms are paying attention, and we know that not every review is honest; many are paid for. Nevertheless, while communication analytics can help reveal common characteristics and linguistic trends among fake reviews, the discrepancy between our results and the filters of the largest user review websites demonstrates that much remains to be discovered.