
charlesarthur : trolls   14

'Captain Marvel' shows how the culture war is making user reviews useless • Motherboard
Samantha Cole:
<p>If I happened to check out the audience score on Rotten Tomatoes before deciding what movie I wanted to see at the theatre that night, I would have gotten the impression that Captain Marvel is a waste of time and money. If I spent a minute Googling it I would have discovered that these negative reviews were coming from people whose opinion on this subject could not matter less to me, but how would I know to do that?

The people who were leaving negative reviews were "review bombing," a tactic that's getting more and more common, and platforms still don’t know how to handle it. That's a problem. User reviews are now just another battlefield in the greater culture war that is devouring the world. This makes them mostly useless when it comes to movies like Captain Marvel, or any product or service that gets caught up in the culture war.

In a statement published on its blog on February 25, Rotten Tomatoes said that it was making changes to its pre-release functionality, including no longer allowing users to comment or review movies prior to their release in theaters.

“However, we still invite users to vote if they 'want to see' a movie prior to its release, and that vote total is displayed on the site," the statement read. “Unfortunately, we have seen an uptick in non-constructive input, sometimes bordering on trolling, which we believe is a disservice to our general readership.”</p>

Oh, and also: <a href="">Captain Marvel took $455m worldwide on its first weekend</a>, the sixth-biggest opening weekend ever and the biggest ever for a female-led film.
captainmarvel  trolls  films 
7 days ago by charlesarthur
Where trolls reigned free: a new history of reddit • The New York Times
David Streitfeld reviews a new book about reddit:
<p>The title “We Are the Nerds” doesn’t really fit the tale. “We Are the Trolls” would have made much more sense. “I was always kind of an [expletive],” [co-founder Steve] Huffman explains early on. [The author, Christine] Lagorio-Chafkin bluntly calls him “a total troll.” He was also a genius programmer. The great achievement of the social internet was to unleash jerkdom for many while monetizing it for a few.

The Reddit tale is an indictment of Silicon Valley, something Lagorio-Chafkin seems to sense but never confronts head-on, perhaps because she is so grateful for access to Huffman and [co-founder Alexis] Ohanian. “Two nice guys who made it, by crafting something incredible and yet ridiculously unwieldy, with no lack of turbulence along the way,” Lagorio-Chafkin writes in an author’s note. A more accurate summation might be: “Two inexperienced young guys created something they didn’t understand and couldn’t control.”

It’s all here anyway: the lack of adult oversight; the suck-up press; the growth-at-any-cost mentality; the loyal employees, by turns abused and abusive (memo from management: “You do realize you were talking about penises for 90 minutes, right?”); the defense of horrendous behavior as “free speech”; the jettisoning of “free speech” when it served corporate purposes; the way no one seeks permission but all expect forgiveness…

…Reddit became so offensive it was difficult to work there. A community manager who had a brief tenure in 2015 told Lagorio-Chafkin some of the reasons: “Child molesters, child porn, vicious stalking, rape threats, serious harassment, people taking the harassment offline and people filing police reports on each other.” One chief executive, stressed beyond endurance, simply stopped showing up for work. His replacement, Ellen Pao, tried to impose order in the office and on the site. The backlash led to her abrupt departure. Huffman returned and purged most of the staff.</p>

Right, because purging the staff would accomplish..? At least we're getting a history of this period of the internet.
trolls  reddit 
november 2018 by charlesarthur
Nearly 600 Russia-linked accounts tweeted about the US's health law • WSJ
Paul Overberg:
<p>On the March 23 anniversary of the Affordable Care Act becoming law, Democrats attacked Republicans for trying to sabotage the health law and praised the embattled legislation.

So did Russian trolls.

“8 years ago today, President Obama signed the Affordable Care Act into law. Millions of Americans have gained access to health care. Thank you, Mr. President!” said a tweet linked to the Internet Research Agency, a Russian company engaged in an online influence campaign that typically seeks to pit one side against the other on controversial issues.

A newly identified group of nearly 10,000 tweets shows that while Russian trolls often focus on such hot-button issues as Hillary Clinton’s email or athletes kneeling during the national anthem, they also target substantive and divisive policy areas like health care.

Nearly 600 IRA-linked accounts posted to Twitter about the ACA and health policy from 2014 through this past May, with the most prolific ones tweeting hundreds of times, the new data show. One account, called TEN_GOP, rocketed from fewer than 1,000 followers to more than 138,000 in two years, sending 60 tweets that potentially reached followers more than four million times.

Researchers at Clemson University provided The Wall Street Journal with the set of about 9,800 tweets involving health policy and the ACA that the IRA posted over that period. An analysis by the Journal found that 80% of the tweets had conservative-leaning political messages, often disparaging the health law.

The accounts have been shut down by Twitter as congressional investigators unearthed their origin, but intelligence experts say the assault is continuing through similar accounts and channels.</p>
Trolls  twitter 
september 2018 by charlesarthur
The conventional wisdom about not feeding trolls makes online abuse worse • The Verge
"Film Crit Hulk":
<p>Whether we’re talking about AOL, AIM, early 4chan, or the early days of Twitter, there has always been a myth about the time and place where things were more innocent, when trolling was all in good fun. But what everyone really remembers about these proverbial times isn’t their purity. It’s how they didn’t see the big deal back then. They remember how they felt a sense of permission, a belief that it was all okay. But that was only true for those who were like them, who thought exactly like they did. All the while, someone else was getting stepped on and bullied while others laughed. The story of the internet has always been the same story: disaffected young men thinking their boorish and cruel behavior was justified or permissible.

And it was always wrong.

The second great lie is that trolling is harmless…

…The third great lie is about what fixes it…

The premise of “don’t feed the trolls” implies that if you ignore a troll, they will inevitably get bored or say, “Oh, you didn’t nibble at my bait? Good play, sir!” and tip their cap and go on their way. Ask anyone who has dealt with persistent harassment online, especially women: this is not usually what happens. Instead, the harasser keeps pushing and pushing to get the reaction they want with even more tenacity and intensity. It’s the same pattern on display in the litany of abusers and stalkers, both online and off, who escalate to more dangerous and threatening behavior when they feel like they are being ignored. In many cases, ignoring a troll can carry just as dear a price as provocation.</p>

Terrific article. I feel as though in technology, the hardware business is in stasis generally. Now we're trying to work out the social and software side.
socialwarming  internet  abuse  trolls 
july 2018 by charlesarthur
“Just an ass-backward tech company”: how Twitter lost the internet war • Vanity Fair
Maya Kosoff:
<p>At the same time, her defenders say, [head of Twitter Trust & Safety, Del] Harvey has been forced to clean up a mess that Twitter should have fixed years ago. Twitter’s backend was initially built on Ruby on Rails, a rudimentary web-application framework that made it nearly impossible to find a technical solution to the harassment problem. If Twitter’s co-founders had known what it would become, a third former executive told me, “you never would have built it on a Fisher-Price infrastructure.” Instead of building a product that could scale alongside the platform, former employees say, Twitter papered over its problems by hiring more moderators. “Because this is just an ass-backward tech company, let’s throw non-scalable, low-tech solutions on top of this low-tech, non-scalable problem.”

Calls to rethink that approach were ignored by senior executives, according to people familiar with the situation. “There was no real sense of urgency,” the former executive explained, pointing the finger at Harvey’s superiors, including current CEO Jack Dorsey. “It’s a technology company with crappy technologists, a revolving door of product heads and CEOs, and no real core of technological innovation. You had Del saying, ‘Trolls are going to be a problem. We will need a technological solution for this.’” But Twitter never developed a product sophisticated enough to automatically deal with bots, spam, or abuse.</p>

I've known Del Harvey for years, as a journalist, so I'm probably a bit biased. But she's not failing; Twitter's problem is its drive for users instead of quality. It lives up to Mark Zuckerberg's dismissive comment that "it's a clown car that drove into a gold mine."
Twitter  trolls 
february 2018 by charlesarthur
Trolls on twitter: how mainstream and local news outlets were used to drive a polarized news agenda • Medium
Jonathan Albright has done extensive (as in, <em>really</em> extensive) work on how (Russian-controlled?) troll accounts went to work in the US election:
<p>The chart below is the top-line breakdown of where these 11-plus thousand external links in my set of 36.5k troll tweets from 2016 pointed to. This includes the expanded short URLs and redirects. This shows the news outlets the troll accounts (through tweeting, retweeting, and tweet-quoting) tended to re-broadcast from the middle of 2016 through election day:

<img src="*qxzDom0huWdY1pf0s3_vnQ.png" width="100%" />
<em>Top 25 most-linked news sources across 11.5k troll tweets (using thousands of expanded short links)</em>

Looking at this breakdown, a result from this sample of tens of thousands of tweets is that the most-shared news outlets from 11.5k links across 388 troll accounts in the six months leading up to the election aren’t your typical hyper-partisan “fake news.”

Sure, Breitbart ranks first, but it’s followed by a long list of what many would argue are credible — if not mainstream — news organizations, as well a surprising number of local and regional news outlets.

Another result from this analysis is the effect of “regional” troll accounts, aka the fake accounts with a city or region name in the handle (e.g., HoustonTopNews, DailySanFran, OnlineCleveland), which showed a pattern of systematically re-broadcasting local news outlets’ stories.

The linking pattern is also consistent: a large number of story links are Bitly-wrapped, and links to local outlets often originate through RSS or Google Feedproxy — to some degree co-opting local outlets’ content streams in an attempt to establish themselves and connect with local audiences.</p>

The collapse in local news outlets in the US (largely mirrored in the UK) magnifies this effect.
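The mechanics behind Albright's chart — expand each Bitly or Feedproxy short link, then tally which outlet's domain it resolves to — can be sketched roughly. Below is a minimal, hypothetical Python illustration of the tallying step only; the expansion step would follow each short link's HTTP redirect chain (e.g. with `requests.head(url, allow_redirects=True)`), and the sample links here are invented, not from Albright's data set:

```python
from collections import Counter
from urllib.parse import urlparse

def tally_outlets(expanded_urls):
    """Count how often each news domain appears among
    already-expanded tweet links (with any www. prefix stripped)."""
    domains = []
    for url in expanded_urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]
        domains.append(host)
    return Counter(domains)

# Invented sample of expanded links (the real analysis used ~11.5k)
links = [
    "http://www.breitbart.com/big-government/2016/example-story",
    "https://www.breitbart.com/2016/another-example",
    "http://feedproxy.google.com/~r/SomeLocalPaper/example",
]
print(tally_outlets(links).most_common(2))
```

Grouping by registered domain rather than full hostname is what lets the same outlet's links count together whether or not they were shared with a `www.` prefix.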
Russia  trolls  twitter  us  election  interference 
february 2018 by charlesarthur
Russian trolls ran wild on Tumblr and the company refuses to say anything about it • Buzzfeed
Craig Silverman:
<p>Russian trolls posed as black activists on Tumblr and generated hundreds of thousands of interactions for content that ranged from calling Hillary Clinton a “monster” to supporting Bernie Sanders and decrying racial injustice and police violence in the US, according to new findings from researcher Jonathan Albright and BuzzFeed News.

While Facebook and Twitter continue to face intense public and congressional pressure over the activity from trolls working for the Russian Internet Research Agency, Tumblr has somehow managed to escape scrutiny. But the blogging platform was in fact home to a powerful, largely unrevealed network of Russian trolls focused on black issues and activism.

“The evidence we've collected shows a highly engaged and far-reaching Tumblr propaganda-op targeting mostly teenage and twenty-something African Americans. This appears to have been part of an ongoing campaign since early 2015,” said Albright, research director of the Tow Center for Digital Journalism at Columbia University.

Tumblr and its parent company, Oath, did not reply to multiple emails with questions from BuzzFeed News. Despite not responding, tracking software shows the emails were opened more than 290 times, and the included links were clicked more than 70 times.

BuzzFeed News also did not receive a response from the office of Sen. Mark Warner, the Democratic chair of the Senate Intelligence Committee, which is investigating Russian interference in the 2016 election.</p>

That’s a lot of clicking and sharing of those questions.
Buzzfeed  russia  trolls  tumblr 
february 2018 by charlesarthur
The fake Americans Russia created to influence the election • The New York Times
Scott Shane:
<p>Sometimes an international offensive begins with a few shots that draw little notice. So it was last year when Melvin Redick of Harrisburg, Pa., a friendly-looking American with a backward baseball cap and a young daughter, posted on Facebook a link to a brand-new website.

“These guys show hidden truth about Hillary Clinton, George Soros and other leaders of the US,” he wrote on June 8, 2016. “Visit #DCLeaks website. It’s really interesting!”

Mr. Redick turned out to be a remarkably elusive character. No Melvin Redick appears in Pennsylvania records, and his photos seem to be borrowed from an unsuspecting Brazilian. But this fictional concoction has earned a small spot in history: The Redick posts that morning were among the first public signs of an unprecedented foreign intervention in American democracy.

<img src="" width="100%" />
<em>A Facebook post, by someone claiming to be Melvin Redick, promoting a website linked to the Russian military intelligence agency G.R.U. Credit The New York Times</em>

The DCLeaks site had gone live a few days earlier, posting the first samples of material, stolen from prominent Americans by Russian hackers, that would reverberate through the presidential election campaign and into the Trump presidency. The site’s phony promoters were in the vanguard of a cyberarmy of counterfeit Facebook and Twitter accounts, a legion of Russian-controlled impostors whose operations are still being unraveled.</p>

This is quite an investigation, done by the NYT with FireEye.
facebook  russia  trolls  election 
september 2017 by charlesarthur
The day an army of bots turned on bot researchers • Daily Beast
Joseph Cox:
<p>On Aug. 18, DFR Lab <a href="">published an analysis</a> on how U.S. alt-right platforms mimicked the sentiment of pro-Russian outlets concerning Charlottesville. The following week, ProPublica picked up the story, but something strange happened: Apparent bots quickly retweeted the article thousands of times.

A day later, an account with just 74 followers described investigative journalism news operation ProPublica as an “alt-left #HateGroup and #FakeNews site funded by Soros.” That tweet racked up some 23,000 retweets, seemingly from a group of bots. A similar tweet managed to grab more than 12,500 retweets. Ben Nimmo, a senior fellow at DFR Lab, then wrote his own analysis of <a href="">the tweets against ProPublica</a>, and <a href="">a guide on how to spot a bot</a>.

Those retweet bots don’t really help propagate a tweet: Most probably don’t have any followers who are real users. Instead, their goal is likely to saturate a target’s notifications.

“They are not amplifying the accounts, but what they are doing is intimidating the users,” Nimmo told The Daily Beast. “They’re standing in an empty room, shouting really, really, loudly.”

But things got weirder.

“The Atlantic Council’s tweets, which are normally retweeted a couple dozen times, got retweeted almost 108,000 times and some of us got loads of fake new followers,” Donara Barojan, also from the DFR Lab, told The Daily Beast. She gained more than 1,000 new Twitter followers, most of which <a href="">appeared to be automated accounts</a>.

Barojan said most of the bots that followed her don’t tweet. But the automated accounts have been on Twitter for years.</p>

It's that latter point - that the accounts have been there for years - which always intrigues me. Were they planted there years ago? Bought from spammers who seeded them a long time ago? Hacked more recently (my guess)? Remember that <a href="">Adrian Chen's canonical article about paid Russian trolls</a> dates from June 2015, and describes events from mid-2014 onwards. And re-read that article, which contains this:
<p>The boom in pro-Kremlin trolling can be traced to the antigovernment protests of 2011, when tens of thousands of people took to the streets after evidence of fraud in the recent Parliamentary election emerged. The protests were organized largely over Facebook and Twitter and spearheaded by leaders, like the anticorruption crusader Alexei Navalny, who used LiveJournal blogs to mobilize support. The following year, when Vyacheslav Volodin, the new deputy head of Putin’s administration and architect of his domestic policy, came into office, one of his main tasks was to rein in the Internet. </p>

Perhaps Russia really has been playing a long, long game.
russia  trolls  twitter 
august 2017 by charlesarthur
We’re all internet trolls (sometimes) • WSJ
Christopher Mims:
<p>Admit it: At one point or another, you have probably said something unpleasant online that you later regretted—and that you wouldn’t have said in person. It might have seemed justified, but to someone else, it probably felt inappropriate, egregious or like a personal attack.

In other words, you were a troll.

<a href="">New research</a> by computer scientists from Stanford and Cornell universities suggests this sort of thing—a generally reasonable person writing a post or leaving a comment that includes an attack or even outright harassment—happens all the time. The most likely time for people to turn into trolls? Sunday and Monday nights, from 10 p.m. to 3 a.m.

Trolling is so ingrained in the internet that, without even noticing, we’ve let it shape our most important communication systems. One reason Facebook provides elaborate privacy controls is so we don’t have to wade through drive-by comments on our own lives.</p>
march 2017 by charlesarthur
Trolls decided I was taking pictures of Rex Tillerson’s notes. I wasn’t even there. • The Washington Post
Doris Truong was convicted in her absence on Twitter of having been the (also Asian, also female) person who seemed to be taking pictures of Tillerson's notes during a break in his confirmation hearing:
<p>Even more bizarrely, one Twitter user insisted that “facial software on the video” led to the “<a href="">almost positive</a>” conclusion that the woman was me.

But even if people believed that the person at the hearing wasn’t me, they wanted to know who she was. And that’s what’s particularly alarming about this time in our society: Why are people so quick to look for someone to condemn? And during the confusion about the woman’s identity, why is it presumed that she is a journalist? Or that taking pictures of notes in an open hearing is illegal? Or, for that matter, that she was even taking pictures of Tillerson’s notes?</p>

The urge to accuse is extraordinarily strong online. If you renamed Facebook to "Pitchforks" and Twitter to "Flaming Torches" you'd pretty much have it.
twitter  trolls  accusation 
january 2017 by charlesarthur
One delightfully clever & slightly Shakespearean way to fight trolls • Flare
Alanna Evans:
<p>Since she started speaking out more vocally against Trump, [Summer] Brennan has received intense attacks by trolls and troll botnets (a hijacked network of malware-infected, remote-controlled computers that send spam messages).

“There has been a significant increase in my Twitter traffic in general, but with that came what was for me an unprecedented troll onslaught,” Brennan told FLARE. “Much of it was just juvenile taunting, like calling me a ‘special snowflake’ or a ‘libtard,’ but there were also violent drawings of women being beaten or killed, and threats to my life, safety and privacy.” She was also told that she “belonged in an oven” and “should be euthanized,” and she was sent photos of Hitler alongside words like that.

Brennan began to notice a common thread to the insults she received: her online abuse was incredibly gendered. To see if she could avoid the attacks, Brennan decided to try a Twitter experiment: if the trolls could assume anonymous avatars, why couldn’t she play with her online persona? 

“I wondered if changing my picture to that of a man would lessen the flow of abuse,” says Brennan. “So I decided to change my photo to that of my brother, and picked a photo in which he was wearing a tie—a white collar white guy. I didn’t use a random photo because I figured, well, this is what I’d probably look like if I were born a man, so, I’ll use that.”

On December 1, Brennan uploaded her brother’s photo to her Twitter account (a bit of Twelfth Night, or What You Will) and reduced her first name to “S.C.”—but kept the same handle.

For 48 hours, the effect on her mentions was astounding.

“The stream of abuse stopped almost immediately,” says Brennan. “There was probably a 99-percent reduction in trolling.”

Brennan was open with her followers about what she was doing, and didn’t change her Twitter banner (which proudly displays her latest book and identifies her as a female author). Apparently assuming a new profile photo was enough to silence the trolls.</p>
trolls  gender 
december 2016 by charlesarthur
Former Reddit CEO Ellen Pao: the trolls are winning the battle for the Internet » The Washington Post
Ellen Pao:
<p>To understand the challenges facing today’s Internet content platforms, layer onto that original balancing act a desire to grow audience and generate revenue. A large portion of the Internet audience enjoys edgy content and the behavior of the more extreme users; it wants to see the bad with the good, so it becomes harder to get rid of the ugly. But to attract more mainstream audiences and bring in the big-budget advertisers, you must hide or remove the ugly.

Expecting Internet platforms to eliminate hate and harassment is likely to disappoint. As the number of users climbs, community management becomes ever more difficult. If mistakes are made 0.01% of the time, that could mean tens of thousands of mistakes. And for a community looking for clear, evenly applied rules, mistakes are frustrating. They lead to a lack of trust. Turning to automation to enforce standards leads to a lack of human contact and understanding. No one has figured out the best place to draw the line between bad and ugly — or whether that line can support a viable business model.</p>
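Pao's scale point is straightforward arithmetic: even a tiny error rate multiplies into a large absolute number of mistakes at platform scale. A hypothetical Python sketch (the 500-million decision volume is an invented figure for illustration, not from her article):

```python
def expected_mistakes(decisions: int, error_rate: float = 0.0001) -> float:
    """Expected number of wrong moderation calls: volume times error rate.
    The default rate is 0.01%, the figure Pao uses."""
    return decisions * error_rate

# At 0.01% error over a hypothetical 500 million moderation decisions,
# even a very accurate process still produces tens of thousands of mistakes.
print(round(expected_mistakes(500_000_000)))
```

Each of those mistakes is visible to the user affected, which is why a rate that looks excellent in aggregate can still erode trust one wrong call at a time.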

The basic problem is that we remember vicious words and acts more than kind ones; possibly we're evolutionarily wired that way.
reddit  trolls 
july 2015 by charlesarthur
The Agency » The New York Times Magazine
Adrian Chen on the "Internet Research Agency" in St Petersburg, which has industrialised the practice of trolling social media to promote Russian interests:
<p>[Ludmila] Savchuk’s revelations about the agency have fascinated Russia not because they are shocking but because they confirm what everyone has long suspected: The Russian Internet is awash in trolls. “This troll business becomes more popular year by year,” says Platon Mamatov, who says that he ran his own troll farm in the Ural Mountains from 2008 to 2013. During that time he employed from 20 to 40 people, mostly students and young mothers, to carry out online tasks for Kremlin contacts and local and regional authorities from Putin’s United Russia party. Mamatov says there are scores of operations like his around the country, working for government authorities at every level. Because the industry is secretive, with its funds funneled through a maze of innocuous-sounding contracts and shell businesses, it is difficult to estimate exactly how many people are at work trolling today. But Mamatov claims “there are thousands — I’m not sure about how many, but yes, really, thousands.”</p>

That, though, is only the amuse-bouche. Then Chen finds himself dragged down the rabbit hole. Today's must-read.
russia  trolls  socialmedia 
june 2015 by charlesarthur
