There’s no doubt about it – I get swayed by the user reviews on Amazon or TripAdvisor. If I’m on the edge of booking a hotel, it consoles me to know that the last three ‘normal’ people who stayed there found the rooms clean and the staff helpful. If I’m not sure about buying a book or an album, the fact that 89 out of 100 readers gave it five stars definitely influences me.
But am I just being gullible? How many of these reviews are fake? Are my desires and choices just the result of some marketing scam?
As online retailers increasingly depend on reviews as a sales tool, an industry of fibbers and promoters has sprung up to buy and sell raves for a pittance.
“For $5, I will submit two great reviews for your business,” offered one entrepreneur on the help-for-hire site Fiverr, one of a multitude of similar pitches. On another forum, Digital Point, a poster wrote, “I will pay for positive feedback on TripAdvisor.” A Craigslist post proposed this: “If you have an active Yelp account and would like to make very easy money please respond.”
The boundless demand for positive reviews has made the review system an arms race of sorts. As more five-star reviews are handed out, even more five-star reviews are needed. Few want to risk being left behind.
Sandra Parker, a freelance writer who was hired by a review factory this spring to pump out Amazon reviews for $10 each, said her instructions were simple. “We were not asked to provide a five-star review, but would be asked to turn down an assignment if we could not give one,” said Ms. Parker, whose brief notices for a dozen memoirs are stuffed with superlatives like “a must-read” and “a lifetime’s worth of wisdom.”
So what are they doing about it?
Determining the number of fake reviews on the Web is difficult. But it is enough of a problem to attract a team of Cornell researchers, who recently published a paper about creating a computer algorithm for detecting fake reviewers. They were instantly approached by a dozen companies, including Amazon, Hilton, TripAdvisor and several specialist travel sites, all of which have a strong interest in limiting the spread of bogus reviews.
“Any one review could be someone’s best friend, and it’s impossible to tell that in every case,” said Russell Dicker, Amazon’s director of community. “We are continuing to invest in our ability to detect these problems.”
The Cornell researchers tackled what they call deceptive opinion spam by commissioning freelance writers on Mechanical Turk, an Amazon-owned marketplace for workers, to produce 400 positive but fake reviews of Chicago hotels. Then they mixed in 400 positive TripAdvisor reviews that they believed were genuine, and asked three human judges to tell them apart. They could not.
So the team developed an algorithm to distinguish fake from real, which worked about 90 percent of the time. The fakes tended to read as narratives about the writer’s experience at the hotel, laden with superlatives but thin on concrete description. Naturally: the writers had never been there. Instead, they talked about why they were in Chicago, and they used words like “I” and “me” more frequently, as if to underline their own credibility.
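To give a flavour of how such cues could be turned into a score, here is a toy sketch. To be clear, this is my own illustration, not the researchers’ actual algorithm (which used machine-learned features); the word lists and the scoring rule are invented for the example.

```python
import re

# Two of the reported tells: heavy use of first-person pronouns
# and of superlatives. These word lists are illustrative only.
FIRST_PERSON = {"i", "me", "my", "myself"}
SUPERLATIVES = {"best", "amazing", "wonderful", "perfect", "fantastic"}

def suspicion_score(review: str) -> float:
    """Fraction of words that are first-person pronouns or superlatives."""
    words = re.findall(r"[a-z']+", review.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in FIRST_PERSON | SUPERLATIVES)
    return hits / len(words)

fake = "I had the best stay of my life, my room was amazing and I loved it."
real = "The lobby renovation left dust near the lifts, but room 412 was quiet."
print(suspicion_score(fake) > suspicion_score(real))  # prints True
```

A real detector would of course learn its features from labelled data rather than rely on a hand-picked word list, but the underlying intuition is the same: fakers sell themselves, genuine guests describe the room.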
So we can’t tell the difference between real and fake reviews, but a computer can. I’m not sure how consoling that is. We are left depending on the reviews and trusting that some algorithm in the background is doing all the necessary screening. Maybe we won’t get any further than that for now. What reassures me is that I do believe it’s in the best interests of Amazon, TripAdvisor and the rest to get this right, and to find some way of preserving only the genuine reviews; because when the trust breaks down, they’ll lose the readers. But am I being naive again?