
charlesarthur : reviews   8

Have online reviews lost all value? • WSJ
Rebecca Dolan:
<p>Sephora.com reviews came under scrutiny in 2018 when emails posted to Reddit revealed that some staffers at skin care brand Sunday Riley were sent instructions for posting positive product reviews, including tips to create multiple fake accounts. Sunday Riley acknowledged the emails at the time via its verified Instagram account, stating, “Yes, the email was sent by a former employee” and defending its actions by adding that “competitors often post negative reviews of products to swing opinion.” Sunday Riley didn’t respond to emails requesting comment. Sephora responded by sending a link to its terms for posting reviews, which require registering with an email.

The quid pro quo nature of digital relationships on apps like Uber has created ratings inflation; riders and drivers rarely score each other below four stars for fear of retaliatory ratings—especially since a low score can get you locked out of hitching future rides.

Online influencers generate a different kind of biased review; many who post about brands on social media are compensated with money or free products. Often, influencers are vague at best about these connections, unlawfully misleading at worst. In 2017, the FTC sent a letter to 91 influencers outlining the need to “clearly and conspicuously” disclose material connections in captions. A simple “thanks” to a brand, the FTC said, doesn’t make a connection sufficiently transparent for shoppers.

The only reviews you can absolutely trust are those from people you know, so many sites battling review scams offer ways to share recommendations with actual friends. And if you’re still looking for toothpaste, you’re better off asking a dentist anyway.</p>


The article is actually written in a "yes" and "no" form, and this is the "yes" (ie, online reviews have lost value). The "no" doesn't come close.
reviews  fraud  online 
12 days ago by charlesarthur
Facebook fake review factories uncovered by Which? investigation • The Guardian
Patrick Collinson:
<p>Undercover researchers for Which? set up dedicated Amazon and Facebook accounts and requested to join several of the “rewards for reviews” groups.

“They were instructed to order a specified item through Amazon, write a review and share a link to the review once it was published. Following the successful publication of the review, a refund for the cost of the item would then be paid via PayPal,” said Which?

But the Which? investigators turned the tables on the fake review factories by posting their honest opinion on the product.

In one example, the investigator gave the product – a smartwatch – a two-star review. “They were told by the seller to rewrite it because the product was free, so it ‘is the default to give five-star evaluation’,” said Which?

In another, the investigator was told that a “refund will be done after a good five-star review with some photo” after receiving a pair of wireless headphones. But after posting a three-star review with photos they were told they would not be refunded unless they wrote a five-star review. The investigator refused, so did not get refunded for the purchase.

When the Guardian searched the Amazon UK Reviewers Facebook group – which has more than 25,000 members – it found postings appearing almost every couple of minutes from companies around the world offering to pay for positive reviews. For example, on Friday, one company was seeking “UK reviewers only” for a “4k action camera waits for review Refund via Paypal just send me your amazon profile”.</p>
amazon  reviews  fake 
october 2018 by charlesarthur
UK healthcare startup said to have posted fake reviews online • Bloomberg
Giles Turner:
<p>The two-year-old company [Cera Care] matches patients with in-home health-care professionals capable of providing everything from support for elderly clients to live-in assistance for people with dementia, and has garnered positive press from UK publications. Its website also boasts sky-high ratings on customer-satisfaction sites and partnerships with 10 UK National Health Service organizations.

But according to three people with knowledge of the matter, Cera Care doesn’t in fact have partnerships with at least seven of those groups, and up to a dozen of the reviews on third-party sites were crafted either by Cera Care employees or people close to them, rather than unbiased customers. These people asked not to be identified because of the sensitivity of the information.

In a widely publicized announcement in March 2017, Cera said it has partnered with a range of public health groups in London, including St. Barts, one of the U.K.’s largest. It later listed 10 of these groups on its website. However, the startup regularly works with only three of these groups – Lewisham CCG [Clinical Commissioning Group], Haringey CCG, and Tower Hamlets CCG – according to people with knowledge of the matter.

“We note that our website was not fully up to date with these materials and are rectifying it,” Cera said in an emailed statement. The company had previous partnerships with health care groups including Brent, Harrow and Hillingdon and East London Foundation Trust, the company said…

…Patients and their families considering private in-home care often look closely at reviews on sites such as Trustpilot A/S, HomeCare UK and Google Reviews before choosing a provider. Of the 104 reviews on Trustpilot, a number were fakes created by Cera Care employees, people with knowledge of the postings said. Of them, 12 have been taken down in recent days. “Good customer service. Great care from care workers as well. Happily continuing to do business with Cera,” one review said.

“We take any allegations of false reviews extremely seriously and these will be investigated thoroughly and dealt with strictly,” the spokeswoman for Cera said. “We have no tolerance for this.”

Trustpilot said it has been investigating Cera Care and has removed several reviews.</p>
cera  reviews  fake 
april 2018 by charlesarthur
Researchers taught AI to write totally believable fake reviews, and the implications are terrifying • Business Insider
Rob Price:
<p>there will soon be a major new threat to the world of online reviews: Fake reviews written automatically by artificial intelligence (AI).

Allowed to rise unchecked, they could irreparably tarnish the credibility of review sites — and the tech could have far broader (and more worrying) implications for society, trust, and fake news.

"In general, the threat is bigger. I think the threat towards society at large and really disillusioned users and to shake our belief in what is real and what is not, I think that's going to be even more fundamental," Ben Y. Zhao, a professor of computer science at the University of Chicago, told Business Insider.

Fake reviews are undetectable — and considered reliable
Researchers from the University of Chicago (including Ben Zhao) have written a paper ("<a href="http://people.cs.uchicago.edu/~ravenben/publications/pdf/crowdturf-ccs17.pdf">Automated Crowdturfing Attacks and Defenses in Online Review Systems</a>") that shows how AI can be used to develop sophisticated reviews that are not only undetectable using contemporary methods, but are also considered highly reliable by unwitting readers.

The paper will be presented at the ACM Conference on Computer and Communications Security later this year.

Here's one example of a synthesised review: "I love this place. I went with my brother and we had the vegetarian pasta and it was delicious. The beer was good and the service was amazing. I would definitely recommend this place to anyone looking for a great place to go for a great breakfast and a small spot with a great deal."

There's nothing immediately strange about this review. It gives some specific recommendations and believable backstory, and while the last phrase is a little odd ("a small spot with a great deal"), it's still an entirely plausible human turn-of-phrase.</p>
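The paper's attack trains a character-level neural language model on a corpus of real reviews and samples new ones character by character. As a rough illustration of the character-level idea only — this is a hypothetical Markov-chain sketch, far simpler than the authors' actual RNN — consider:

```python
import random
from collections import defaultdict

def train_char_model(corpus, order=3):
    """Map each `order`-character context to the characters observed after it."""
    model = defaultdict(list)
    for i in range(len(corpus) - order):
        model[corpus[i:i + order]].append(corpus[i + order])
    return model

def generate(model, seed, length=120):
    """Sample one character at a time, conditioned on the trailing context."""
    order = len(next(iter(model)))  # recover context length from the model
    out = seed
    for _ in range(length):
        choices = model.get(out[-order:])
        if not choices:
            break  # unseen context: stop generating
        out += random.choice(choices)
    return out

# Toy training text standing in for a scraped review corpus.
corpus = ("I love this place. The service was amazing and the food was great. "
          "The beer was good and the staff was friendly. I would definitely "
          "recommend this place to anyone looking for a great meal. ")

model = train_char_model(corpus, order=3)
print(generate(model, seed="The "))
```

Even this toy model produces locally fluent text with the occasional odd phrase — exactly the "small spot with a great deal" flavour quoted above; a trained neural model smooths those seams out.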

Based on this, we’re either going to need better ways to identify humans, or online reviews are going the way of the dinosaur.
reviews  ai  neuralnet 
august 2017 by charlesarthur
Scientology seeks captive converts via Google Maps, drug rehab centres • Krebs on Security
Brian Krebs:
<p>Experts say fake online reviews are most prevalent in labour-intensive services that do not require the customer to come into the company’s offices but instead come to the consumer. These services include but are not limited to locksmiths, windshield replacement services, garage door repair and replacement technicians, carpet cleaning and other services that consumers very often call for immediate service.

As it happens, the problem is widespread in the drug rehabilitation industry as well. That became apparent after I spent just a few hours with Bryan Seely, the guy who <a href="https://www.amazon.com/gp/product/1533156778/ref=as_li_tl?ie=UTF8&tag=seelysecurity-20&camp=1789&creative=9325&linkCode=as2&creativeASIN=1533156778&linkId=bb0aa50317eb9ed82a7919a56a7155b1">literally wrote the definitive book on fake Internet reviews</a>…

…Seely has been tracking a network of hundreds of phony listings and reviews that lead inquiring customers to fewer than a half dozen drug rehab centers, including <a href="https://en.wikipedia.org/wiki/Narconon">Narconon International</a> — an organization that promotes the theories of Scientology founder L. Ron Hubbard regarding substance abuse treatment and addiction.</p>


The word "<a href="http://www.urbandictionary.com/define.php?term=skeevy">skeevy</a>" seems appropriate for this practice.
drugs  maps  reviews 
june 2016 by charlesarthur
Amazon reviews hijacked by causes, conspiracies, rage » The Seattle Times
Jay Greene:
<p>Reviewers have long used Amazon as a platform to vent about products that failed to live up to their expectations. Some have even used it to attack authors whose views differ from their own.

Increasingly, though, people are launching coordinated campaigns to push political and social agendas through negative reviews often only tangentially related to the product for sale. They are able to do so because Amazon welcomes reviews regardless of whether the writer has actually purchased the product.

[The author of a book about Sandy Hook, Scarlett] Lewis isn’t the only target of the Sandy Hook tragedy deniers. “We want to hit this woman as hard as we can,” says a narrator in a YouTube video as he walks viewers through posting 1-star ratings and negative reviews for “Choosing Hope: Moving Forward from Life’s Darkest Hours,” by Sandy Hook Elementary first-grade teacher Kaitlin Roig-DeBellis. The video, posted by “Peekay22,” even guides viewers to click a “Yes” button indicating they found other negative reviews helpful.

Since Peekay22’s video posted on Oct. 16, “Choosing Hope” has received more than 170 1-star reviews out of just over 250 total reviews. That’s tanked the book’s rating to 2.1 stars out of 5.

“Amazon is giving these people a forum … ,” Lewis said. “Obviously, Amazon should remove (the reviews).”

But Amazon appears to have no intent of doing so. To the company, as long as the reviews are “authentic,” they have a place on its website.

“All authentic reviews, whether the reviewer bought the product on Amazon or not, are valuable to customers, helping them make informed buying decisions every day,” Amazon spokesman Tom Cook wrote in reply to questions about its review policy.</p>


What about "whether they bought the product or not"?
amazon  reviews 
november 2015 by charlesarthur
After undercover sting, Amazon files suit against 1,000 Fiverr users over fake product reviews » GeekWire
Jacob Demmitt:
<p>Fiverr is an online marketplace that lets people sell simple services to strangers, like transcribing audio, converting photos or editing video. Amazon simply had to contact Fiverr users who advertised their review-writing services and set up the transaction.

The company said most people offered the undercover Amazon investigators 5-star reviews for $5 each.

One Fiverr.com user who went by bess98 offered to write the reviews from multiple computers, so as to deceive Amazon. Another user, Verifiedboss, unwittingly told the investigators, “You know the your [sic] product better than me. So please provide your product review, it will be better.”

As in the previous lawsuit, Amazon alleges that these reviewers often arranged to have empty boxes shipped to them in order to make it look like they had purchased the products.

Amazon is not suing Fiverr. The company noted in the court filing that these kinds of services are banned by Fiverr’s terms and conditions and Fiverr has tried to cut down on the practice.</p>


Would love to know which products these people reviewed.
amazon  reviews  fake  fiverr 
october 2015 by charlesarthur
Product reviews are broken » Above Avalon
Neil Cybart:
I still think the world needs independent product reviews. There is enough prior misbehavior on the part of companies to suggest such third-party reviews can serve a purpose by giving consumers value. The problem is that many reviewers don't know what kind of value that is. The move into personalized wearables has largely turned the traditional tech gadget review into an artifact from a bygone era. The nature of the tech review should have changed, but many tech reviewers haven't adapted their review process to this new wave of technology. While adding video may represent a new dimension to the review, the underlying premise of the review needs to be rethought.


I agree with Cybart. Reviews have turned into a mess; the desire on social networks to attract attention by being outrageous dilutes the thoughtful ones. And commenters' desire to attach a single value to a device's "worth" – is it one star or five? why is that four stars but this five? – wipes subtlety away in pursuit of a blunt distinction.
product  reviews 
april 2015 by charlesarthur
