From the Marketplace Tech Report blog:
Online consumer reviews are everywhere. But how do you know if the review you're reading is real? Was it written by an unbiased, unpaid consumer? How can you be sure it wasn't written by the business owner's cousin or, maybe, her fiercest competitor? Now researchers at Cornell say they have found an answer. Myle Ott, Jeff Hancock, Claire Cardie, and Yejin Choi teamed up to build software that can automatically spot a fake review 90 percent of the time. Real people fare much, much worse: actual human beings can distinguish between fake and real reviews just 50 percent of the time.
The Consumerist provides a link to more details, along with a "PDF demonstration of the software's capabilities."
CNET notes "the Cornell system is similar to software that sniffs out plagiarism. While the plagiarism software learns to spot the type of language a specific author uses, the Cornell software learns to spot the type of language people use when they're being deceptive in writing a review, said Myle Ott, a Cornell computer science graduate student on the research team." So how can you distinguish between real and fake reviews?
In part, deceptive writers used more verbs than genuine reviewers did, while genuine reviewers used more punctuation. The deceptive writers also focused more on family and activities, while the genuine reviewers focused more on the hotels themselves.
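For the technically curious, here's a rough sketch of how this style of text classifier works in general. To be clear, this is not the Cornell team's code: the toy reviews, the labels, and the choice of a naive Bayes model over simple word-count features are all placeholder assumptions on my part. The basic shape, though (extract language features from labeled examples, then train a model to separate the two classes), matches the approach CNET describes.

```python
# Minimal sketch of a deception classifier for review text.
# NOT the Cornell system: the training examples below are invented
# placeholders, and naive Bayes over word counts is a stand-in for
# whatever features and model the researchers actually used.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Hypothetical labeled data: 1 = deceptive, 0 = genuine.
reviews = [
    "My husband and I had a wonderful vacation with our family here!",
    "The room was small; the bathroom sink leaked, and check-in took 40 minutes.",
    "We loved every minute of our amazing, luxurious experience!",
    "Location is convenient: two blocks from the subway, parking was $30/night.",
]
labels = [1, 0, 1, 0]

# Turn each review into unigram/bigram counts, then fit a classifier
# that learns which word patterns co-occur with each label.
model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
model.fit(reviews, labels)

# Score a new, unseen review.
print(model.predict(["An absolutely magical stay with my whole family!"]))
```

A real system would be trained on hundreds of labeled reviews and would likely use richer features (the coverage mentions verb counts, punctuation, and topical focus), but the overall pipeline is the same.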
I've read many reviews on Amazon and similar sites that seem too detailed and precise to be "real." Hopefully this kind of software will help consumers find honest opinions. How much do you rely on online reviews to make buying decisions?