Spotting Fake Reviews Online

August 30th, 2011 by Andrea Bennett

Helpful new software finds the paid plants.

Brought to you by Liberty Mutual's The Responsibility Project

Think twice before you book that bed and breakfast getaway – there’s a chance the glowing reviews on sites like TripAdvisor and Yelp aren't authentic.

According to a recent report from The New York Times, business owners are actively advertising for writers to post five-star reviews of their products and establishments. The article warns that, “An industry of fibbers and promoters has sprung up to buy and sell raves for a pittance,” as online retailers increasingly depend on sites like Yelp as marketing tools.

The Times found a Craigslist post that asked, “If you have an active Yelp account and would like to make very easy money please respond.” One post on the help-for-hire site Fiverr read, “For $5, I will submit two great reviews for your business.” Another poster, on the Digital Point forum, offered positive TripAdvisor feedback for pay.

One former paid reviewer, Sandra Parker, spoke with the Times. She was hired by PR firms to write favorable book reviews, at $10 apiece, for posting on sites like Amazon and Barnes & Noble. Parker told the Times, “We were not asked to provide a five-star review, but would be asked to turn down an assignment if we could not give one.” The Times notes that her posts are “stuffed...with superlatives like ‘a must-read’ and ‘a lifetime’s worth of wisdom.’”

The good news is that, according to Travel Weekly, a group of researchers from Cornell University has developed new software that may save you the trouble of sifting through fake reviews. In their paper, “Finding Deceptive Opinion Spam by Any Stretch of the Imagination,” the researchers claim their software can detect what they call “opinion spam,” or what Travel Weekly described as “de-facto destination advertisements offered in the guise of a user review.”

To test the reliability of the software, the researchers commissioned freelance writers to produce 400 positive fake reviews of Chicago hotels. They mixed these fakes with 400 genuine positive reviews from TripAdvisor and asked three human judges to tell them apart. According to the researchers, the software spotted the fake reviews with 90 percent accuracy, compared with 50 percent accuracy for the human judges.

The researchers pointed out that the fake reviews used less detail and more superlative language (since the writers had never actually visited the hotels); words like “hotel,” “Chicago,” “my,” “experience” and “vacation” were giveaways. Real posts, on the other hand, contained specifics such as “floor,” “bathroom,” “small” and even dollar signs.
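The Cornell system is more sophisticated than this, but the core idea, learning which words tend to show up in fake versus genuine reviews, can be sketched with a simple bag-of-words text classifier. The snippet below is a minimal illustration only: it assumes the scikit-learn library, and the handful of labeled example reviews and the Naive Bayes model are invented for demonstration, not the researchers' actual data or method.

```python
# Minimal sketch of word-usage-based fake-review detection.
# Assumes scikit-learn is installed; the tiny labeled dataset below is
# invented for illustration and is NOT the Cornell researchers' data or model.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples: 1 = deceptive (fake), 0 = truthful (genuine).
reviews = [
    "My experience at this Chicago hotel was amazing, best vacation ever!",  # fake-style: superlatives, no specifics
    "The most luxurious hotel in Chicago, my family loved every moment.",    # fake-style
    "Room on the 4th floor was small and the bathroom faucet dripped.",      # real-style: concrete details
    "Paid $189 a night; the bathroom was clean but the floor creaked.",      # real-style
]
labels = [1, 1, 0, 0]

# Bag-of-words features feeding a simple Naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(reviews, labels)

# Score an unseen review; in practice you would train on hundreds of examples
# (the study used 400 fake and 400 genuine reviews) and test on held-out data.
test = ["Best hotel experience of my vacation, absolutely perfect in every way!"]
print(model.predict(test))        # e.g. [1] -> flagged as likely deceptive
print(model.predict_proba(test))  # class probabilities
```

With only four training examples the prediction is meaningless, of course; the point is the shape of the approach. A classifier trained this way learns that superlative-heavy, detail-poor language leans fake, while concrete spatial and price details lean genuine, which is the same signal the researchers describe above.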

According to the Times, when news of the software broke, companies that rely on user-generated reviews, like TripAdvisor and Amazon, immediately contacted Cornell about using the new technology to quash bogus reviews.

Why do we need software to separate truth from fiction? Jeffrey T. Hancock, a Cornell professor of communication and information science who worked on the project, told the Times, “We evolved over 60,000 years by talking to each other face to face. Now we’re communicating in these virtual ways. It feels like it is much harder to pick up clues about deception.”

Do you have your own set of tips for spotting bogus information on the web, or are you looking forward to a broad release of the anti-fraud software?