
robertogreco : gamergate (5)

Can the online community be saved? Is it even worth saving? - The Globe and Mail
"It seems quaint now to speak of online communities in romantic terms. I’ll do it anyway. For the past few decades, we’ve been in love with them.

What made them so appealing was the way they made the world suddenly seem to open up. Bulletin boards, and then forums, then blogs allowed everyone from knitting enthusiasts to politics nerds to find and talk to others who shared their interests or views. We liked that, and made hanging out there a mainstay of life. But as can happen with love, things can sour bit by bit, almost imperceptibly, until one day you awake and find yourself in a toxic relationship.

It wasn’t always this way. Years ago, in the mid-2000s, I sat in a Toronto basement apartment, adding my thoughts to posts on a site called Snarkmarket, which delved into the artsy and philosophical sides of technology and media. To my mind, these wide, wild, intimate discussions seemed to capture everything wonderful about the new modern age: I found like-minded individuals and, eventually, a community.

And then, I was on a plane, flying over the deeply blue waters of the Gulf of Mexico in November, 2013. Somehow, a blog comment section had led me from Toronto to Florida. A group of us flew in from all over the continent to St. Petersburg and brought our online discussions to life around tables replete with boozy pitchers shared on patios in the thick Florida air. Putting faces to usernames made fleeting connections feel more solid, and years later, a small number of us are still in touch: so much for the alienating nature of technology.

It does, however, already feel like a different era, and that such recent history can seem so far away brings with it a strange sense of vertigo. Logging on each morning now, I sometimes forget why I ever had so much faith in all this novelty, and wonder if it can be saved at all.

The first fault line appeared when the centre of gravity of our online socializing shifted to giant platforms such as Facebook, Twitter, Instagram, Tumblr and more. With that shift to mainstream sites composed of tens or hundreds of millions of users colliding together in a riot of opinion and expression, online communities started to seem like unwelcoming, even dangerous places."



"It is tempting to say, then, that the solution is simple: barriers. A functioning community should draw a line around the kind of people it wants, and keep others out. But that’s also demoralizing in its own way. It suggests those lofty ideals that we could find community with people of all sorts across the globe are well and truly dead, forever.

Anil Dash doesn’t believe they are – at least not fully. A mainstay in the American tech scene since his early days at the blogging platform Typepad in the early 2000s, he has been vocal in his disappointment that platforms such as Twitter have been slow in responding to abuse. “The damage that can be done now is so much more severe because everyone is on these networks and they have so much more reach,” he says on the phone from New York. “The stakes are now much higher.”"



"At a scale of tens of thousands or even millions of people, it’s not just notions of community that are lost, but norms, too, where what would be obvious offline – don’t yell at someone to make a point, don’t dominate a conversation just because you can, and so on – are ignored because of the free-for-all vibe of much social media.

Britney Summit-Gil, a writer, academic and researcher of online communities at Rensselaer Polytechnic Institute in New York, suggests that while sites such as Facebook and Reddit can be full of hate and harassment, there are increasingly effective tools to build smaller, more private spaces, both on those platforms, and on other sites such as messaging app Slack, or even group text chats.

Summit-Gil also argues that in adopting the idea of community, these huge platforms are responsible for endorsing the principle of guidelines more generally: rules for how and by what standards online communities should operate, that allow these spaces to work at all.

Our online relationships aren’t dead, but our sense of community has become more private: hidden in plain sight, in private Facebook or Slack groups, text chats with friends, we connect in closed spaces that retain the idea of a group of people, bound by shared values, using tech to connect where they otherwise might not be able to. Online communities were supplanted by social media, and for a time we pretended they were the same thing, when in fact they are not.

Social media is the street; the community is the house you step into to meet your friends, and like any house, there are rules: things you wouldn’t do, people you wouldn’t invite in, and a limit on just how many people can fit. We forgot those simple ideas, and now it’s time to remember.

My own online community that took me to Florida was, sadly, subject to the gravity of the social giants. It dissipated, pulled away by the weight of Twitter and Facebook, but also the necessities of work and money and family. Nonetheless, we still connect sometimes, now in new online places, quiet, enclosed groups that the public world can’t see. New communities have sprouted up, too – and I still dive in. I’m not sure I would do so as easily, though, had it not been for what now threatens to be lost: that chance to get on a plane, look down from above and see, from up high, what we share with those scattered around the globe.

That sense of radical possibility is, I think, worth fighting to save."
navneetalang  socialmedia  online  internet  web  anildash  britneysummit-gil  2017  consolidation  tumblr  instagram  twitter  facebook  social  lindywest  snarkmarket  community  gamergate  reddit  scale  typepad  abuse 
may 2017 by robertogreco
danah boyd | apophenia » Hacking the Attention Economy
"The democratization of manipulation

In the early days of blogging, many of my fellow bloggers imagined that our practice could disrupt mainstream media. For many progressive activists, social media could be a tool that could circumvent institutionalized censorship and enable a plethora of diverse voices to speak out and have their say. Civic minded scholars were excited by “smart mobs” who leveraged new communications platforms to coordinate in a decentralized way to speak truth to power. Arab Spring. Occupy Wall Street. Black Lives Matter. These energized progressives as “proof” that social technologies could make a new form of civil life possible.

I spent 15 years watching teenagers play games with powerful media outlets and attempt to achieve control over their own ecosystem. They messed with algorithms, coordinated information campaigns, and resisted attempts to curtail their speech. Like Chinese activists, they learned to hide their traces when it was to their advantage to do so. They encoded their ideas such that access to content didn’t mean access to meaning.

Of course, it wasn’t just progressive activists and teenagers who were learning how to mess with the media ecosystem that has emerged since social media unfolded. We’ve also seen the political establishment, law enforcement, marketers, and hate groups build capacity at manipulating the media landscape. Very little of what’s happening is truly illegal, but there’s no widespread agreement about which of these practices are socially and morally acceptable or not.

The techniques that are unfolding are hard to manage and combat. Some of them look like harassment, prompting people to self-censor out of fear. Others look like “fake news”, highlighting the messiness surrounding bias, misinformation, disinformation, and propaganda. There is hate speech that is explicit, but there’s also suggestive content that prompts people to frame the world in particular ways. Dog whistle politics have emerged in a new form of encoded content, where you have to be in the know to understand what’s happening. Companies who built tools to help people communicate are finding it hard to combat the ways their tools are being used by networks looking to skirt the edges of the law and content policies. Institutions and legal instruments designed to stop abuse are finding themselves ill-equipped to function in light of networked dynamics.

The Internet has long been used for gaslighting, and trolls have long targeted adversaries. What has shifted recently is the scale of the operation, the coordination of the attacks, and the strategic agenda of some of the players.

For many who are learning these techniques, it’s no longer simply about fun, nor is it even about the lulz. It has now become about acquiring power.

A new form of information manipulation is unfolding in front of our eyes. It is political. It is global. And it is populist in nature. The news media is being played like a fiddle, while decentralized networks of people are leveraging the ever-evolving networked tools around them to hack the attention economy.

I only wish I knew what happens next."
danahboyd  communication  attention  propaganda  gaslighting  2017  fakenews  manipulation  media  medialiteracy  politics  information  gamergate  memes  lolcats  gabriellacoleman 
january 2017 by robertogreco
Men (Still) Explain Technology to Me: Gender and Education Technology | boundary 2
"There’s that very famous New Yorker cartoon: “On the internet, nobody knows you’re a dog.” The cartoon was first published in 1993, and it demonstrates this sense that we have long had that the Internet offers privacy and anonymity, that we can experiment with identities online in ways that are severed from our bodies, from our material selves and that, potentially at least, the internet can allow online participation for those denied it offline.

Perhaps, yes.

But sometimes when folks on the internet discover “you’re a dog,” they do everything in their power to put you back in your place, to remind you of your body. To punish you for being there. To hurt you. To threaten you. To destroy you. Online and offline.

Neither the internet nor computer technology writ large are places where we can escape the materiality of our physical worlds—bodies, institutions, systems—as much as that New Yorker cartoon joked that we might. In fact, I want to argue quite the opposite: that computer and Internet technologies actually re-inscribe our material bodies, the power and the ideology of gender and race and sexual identity and national identity. They purport to be ideology-free and identity-less, but they are not. If identity is unmarked it’s because there’s a presumption of maleness, whiteness, and perhaps even a certain California-ness. As my friend Tressie McMillan Cottom writes, in ed-tech we’re all supposed to be “roaming autodidacts”: happy with school, happy with learning, happy and capable and motivated and well-networked, with functioning computers and WiFi that works.

By and large, all of this reflects who is driving the conversation about, if not the development of, these technologies. Who is seen as building technologies. Who some think should build them; who some think have always built them.

And that right there is already a process of erasure, a different sort of mansplaining one might say."



"Ironically—bitterly ironically, I’d say, many pieces of software today increasingly promise “personalization,” but in reality, they present us with a very restricted, restrictive set of choices of who we “can be” and how we can interact, both with our own data and content and with other people. Gender, for example, is often a drop down menu where one can choose either “male” or “female.” Software might ask for a first and last name, something that is complicated if you have multiple family names (as some Spanish-speaking people do) or your family name is your first name (as names in China are ordered). Your name is presented how the software engineers and designers deemed fit: sometimes first name, sometimes title and last name, typically with a profile picture. Changing your username—after marriage or divorce, for example—is often incredibly challenging, if not impossible.

You get to interact with others, similarly, based on the processes that the engineers have determined and designed. On Twitter, for example, you cannot direct message people who do not follow you. All interactions must be 140 characters or less.

This restriction of the presentation and performance of one’s identity online is what “cyborg anthropologist” Amber Case calls the “templated self.” She defines this as “a self or identity that is produced through various participation architectures, the act of producing a virtual or digital representation of self by filling out a user interface with personal information.”

Case provides some examples of templated selves:
Facebook and Twitter are examples of the templated self. The shape of a space affects how one can move, what one does and how one interacts with someone else. It also defines how influential and what constraints there are to that identity. A more flexible, but still templated space is WordPress. A hand-built site is much less templated, as one is free to fully create their digital self in any way possible. Those in Second Life play with and modify templated selves into increasingly unique online identities. MySpace pages are templates, but the lack of constraints can lead to spaces that are considered irritating to others.


As we—all of us, but particularly teachers and students—move to spend more and more time and effort performing our identities online, being forced to use preordained templates constrains us, rather than—as we have often been told about the Internet—lets us be anyone or say anything online. On the Internet no one knows you’re a dog unless the signup process demanded you give proof of your breed. This seems particularly important to keep in mind when we think about students’ identity development. How are their identities being templated?

While Case’s examples point to mostly “social” technologies, education technologies are also “participation architectures.” Similarly they produce and restrict a digital representation of the learner’s self.

Who is building the template? Who is engineering the template? Who is there to demand the template be cracked open? What will the template look like if we’ve chased women and people of color out of programming?"



"One interesting example of this dual approach that combines both social and technical—outside the realm of ed-tech, I recognize—are the tools that Twitter users have built in order to address harassment on the platform. Having grown weary of Twitter’s refusal to address the ways in which it is utilized to harass people (remember, its engineering team is 90% male), a group of feminist developers wrote The Block Bot, an application that lets you block, en masse, a large list of Twitter accounts who are known for being serial harassers. That list of blocked accounts is updated and maintained collaboratively. Similarly, Block Together lets users subscribe to others’ block lists. Good Game Autoblocker, a tool that blocks the “ringleaders” of GamerGate.

That gets, just a bit, at what I think we can do in order to make education technology habitable, sustainable, and healthy. We have to rethink the technology. And not simply as some nostalgia for a “Web we lost,” for example, but as a move forward to a Web we’ve yet to ever see. It isn’t simply, as Isaacson would posit it, rediscovering innovators that have been erased, it’s about rethinking how these erasures happen all throughout technology’s history and continue today—not just in storytelling, but in code.

Educators should want ed-tech that is inclusive and equitable. Perhaps education needs reminding of this: we don’t have to adopt tools that serve business goals or administrative purposes, particularly when they are to the detriment of scholarship and/or student agency—technologies that surveil and control and restrict, for example, under the guise of “safety”—that gets trotted out from time to time—but that have never ever been about students’ needs at all. We don’t have to accept that technology needs to extract value from us. We don’t have to accept that technology puts us at risk. We don’t have to accept that the architecture, the infrastructure of these tools make it easy for harassment to occur without any consequences. We can build different and better technologies. And we can build them with and for communities, communities of scholars and communities of learners. We don’t have to be paternalistic as we do so. We don’t have to “protect students from the Internet,” and rehash all the arguments about stranger danger and predators and pedophiles. But we should recognize that if we want education to be online, if we want education to be immersed in technologies, information, and networks, that we can’t really throw students out there alone. We need to be braver and more compassionate and we need to build that into ed-tech. Like Blockbot or Block Together, this should be a collaborative effort, one that blends our cultural values with technology we build.

Because here’s the thing. The answer to all of this—to harassment online, to the male domination of the technology industry, the Silicon Valley domination of ed-tech—is not silence. And the answer is not to let our concerns be explained away. That is after all, as Rebecca Solnit reminds us, one of the goals of mansplaining: to get us to cower, to hesitate, to doubt ourselves and our stories and our needs, to step back, to shut up. Now more than ever, I think we need to be louder and clearer about what we want education technology to do—for us and with us, not simply to us."
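The tools Watters points to above all share one simple mechanism: a collaboratively maintained list of account IDs that individual users can subscribe to and merge with their own blocks. The sketch below is only an illustration of that idea; the class names, account IDs, and list name are hypothetical, and no real Twitter API calls are made.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a shared, subscribable blocklist -- the idea
# behind tools like The Block Bot or Block Together, not their code.

@dataclass
class SharedBlocklist:
    name: str
    blocked_ids: set[str] = field(default_factory=set)

    def add(self, account_id: str) -> None:
        """Maintainers add an account ID to the shared list."""
        self.blocked_ids.add(account_id)

@dataclass
class Subscriber:
    my_blocks: set[str] = field(default_factory=set)
    subscriptions: list[SharedBlocklist] = field(default_factory=list)

    def subscribe(self, blocklist: SharedBlocklist) -> None:
        """Follow a collaboratively maintained list and inherit its updates."""
        self.subscriptions.append(blocklist)

    def effective_blocks(self) -> set[str]:
        """My own blocks plus every ID from every list I subscribe to."""
        merged = set(self.my_blocks)
        for blocklist in self.subscriptions:
            merged |= blocklist.blocked_ids
        return merged

# Usage: one curated list, many subscribers who inherit its updates.
shared = SharedBlocklist("hypothetical-autoblocker")
shared.add("12345")   # hypothetical account IDs
shared.add("67890")

reader = Subscriber(my_blocks={"55555"})
reader.subscribe(shared)
print(reader.effective_blocks())  # prints the union of all three IDs
```

The point of the design is that the labour of identifying serial harassers is done once, collectively, and then inherited by every subscriber.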
education  gender  technology  edtech  2015  audreywatters  history  agency  ambercase  gamergate  society  power  hierarchy  harassment  siliconvalley  privilege  safety  collaboration  identity  tressiemcmillancottom  erasure  inclusion  inclusivity  templates 
april 2015 by robertogreco
Doxxing to Defend Student Privacy
"When you’re doxxed, there’s a whistle: you’re now the target. Everything you do; everything you did. It’s fair game now.

Braun and Ravitch and Schneider whistled. They called out a woman for the masses on the Internet to target, to have all the data of her life pulled out, examined aggressively and maliciously. All in the service of protecting students from Pearson. Charles C. Johnson’s whistles call a different crowd, sure, but it’s still a whistle.

Now, thanks to Schneider’s justification that “doxxing is okay,” I wonder if we’ll see a new sort of crowdsourced harassment from these quarters. We’ve already seen folks from that circle go after women of color who worked for the teachers’ unions but who were, because of their demands for racial justice, deemed unruly.

If doxxing is the tactic – and “a primer” sure might indicate that it’s a-okay – then we have much more to do than prepare students to think through the implications of portfolios or surveillance and discipline. It’s not just “don’t tweet about PARCC,” it’s – gah! – “don’t tweet.”

Seriously, we have to think about what it means when political groups decide to use those social media mechanisms not just to observe and monitor but to destroy their opposition and to stifle dissent. When I wrote my most recent story about privacy and identity development, I admit, I thought I was trying to carve out a space in which I hoped that students were free to be themselves without government or corporate influence. Now, I get to add to that list of organizations students need to protect themselves from: the surveillance of well-meaning education bloggers, who are willing to shame and doxx in order to sway systems to meet their own personal political machinations.

Congrats. You’re why education can’t have nice things."
audreywatters  2015  doxxing  dianravitch  education  pearson  bobbraun  civility  politics  debate  sexism  mercedesschneider  gamergate  harassment  internet  behavior 
march 2015 by robertogreco
Convivial Tools in an Age of Surveillance
"What would convivial ed-tech look like?

The answer can’t simply be “like the Web” as the Web is not some sort of safe and open and reliable and accessible and durable place. The answer can’t simply be “like the Web” as though the move from institutions to networks magically scrubs away the accumulation of history and power. The answer can’t simply be “like the Web” as though posting resources, reference services, peer-matching, and skill exchanges — what Illich identified as the core of his “learning webs” — are sufficient tools in the service of equity, freedom, justice, or hell, learning.

“Like the Web” is perhaps a good place to start, don’t get me wrong, particularly if this means students are in control of their own online spaces — its content, its data, its availability, its publicness. “Like the Web” is convivial, or close to it, if students are in control of their privacy, their agency, their networks, their learning. We all need to own our learning — and the analog and the digital representations or exhaust from that. Convivial tools do not reduce that to a transaction — reduce our learning to a transaction, reduce our social interactions to a transaction.

I'm not sure the phrase "safe space" is quite the right one to build alternate, progressive education technologies around, although I do think convivial tools do have to be “safe” insofar as we recognize the importance of each other’s health and well-being. Safe spaces where vulnerability isn’t a weakness for others to exploit. Safe spaces where we are free to explore, but not to the detriment of those around us. As Illich writes, "A convivial society would be the result of social arrangements that guarantee for each member the most ample and free access to the tools of the community and limit this freedom only in favor of another member’s equal freedom.”

We can’t really privilege “safe” as the crux of “convivial” if we want to push our own boundaries when it comes to curiosity, exploration, and learning. There is risk associated with learning. There’s fear and failure (although I do hate how those are being fetishized in a lot of education discussions these days, I should note.)

Perhaps what we need to build are more compassionate spaces, so that education technology isn’t in the service of surveillance, standardization, assessment, control.

Perhaps we need more brave spaces. Or at least many educators need to be braver in open, public spaces -- not brave to promote their own "brands" but brave in standing with their students. Not "protecting them” from education technology or from the open Web but not leaving them alone, and not opening them to exploitation.

Perhaps what we need to build are more consensus-building not consensus-demanding tools. Mike Caulfield gets at this in a recent keynote about “federated education.” He argues that “Wiki, as it currently stands, is a consensus *engine*. And while that’s great in the later stages of an idea, it can be deadly in those first stages.” Caulfield relates the story of the Wikipedia entry on Kate Middleton’s wedding dress, which, 16 minutes after it was created, “someone – and in this case it probably matters that it was a dude – came and marked the page for deletion as trivial, or as they put it ‘A non-notable article incapable of being expanded beyond a stub.’” Debate ensues on the entry’s “talk” page, until finally Jimmy Wales steps in with his vote: a “strong keep,” adding “I hope someone will create lots of articles about lots of famous dresses. I believe that our systemic bias caused by being a predominantly male geek community is worth some reflection in this context.”

Mike Caulfield has recently been exploring a different sort of wiki, also by Ward Cunningham. This one — called the Smallest Federated Wiki — doesn’t demand consensus like Wikipedia does. Not off the bat. Instead, entries — and this can be any sort of text or image or video, it doesn’t have to “look like” an encyclopedia — live on federated servers. Instead of everyone collaborating in one space on one server like a “traditional” wiki, the work is distributed. It can be copied and forked. Ideas can be shared and linked; it can be co-developed and co-edited. But there isn’t one “vote” or one official entry that is necessarily canonical.

Rather than centralized control, conviviality. This distinction between Wikipedia and Smallest Federated Wiki echoes too what Illich argued: that we need to be able to identify when our technologies become manipulative. We need "to provide guidelines for detecting the incipient stages of murderous logic in a tool; and to devise tools and tool systems that optimize the balance of life, thereby maximizing liberty for all."

Of course, we need to recognize, those of us that work in ed-tech and adopt ed-tech and talk about ed-tech and tech writ large, that convivial tools and a convivial society must go hand-in-hand. There isn’t any sort of technological fix to make education better. It’s a political problem, that is, not a technological one. We cannot come up with technologies that address systematic inequalities — those created by and reinscribed by education — unless we are willing to confront those inequalities head on. Those radical education writers of the Sixties and Seventies offered powerful diagnoses about what was wrong with schooling. The progressive education technologists of the Sixties and Seventies imagined ways in which ed-tech could work in the service of dismantling some of the drudgery and exploitation.

But where are we now? Instead we find ourselves with technologies working to make that exploitation and centralization of power even more entrenched. There must be alternatives — both within and without technology, both within and without institutions. Those of us who talk and write and teach ed-tech need to be pursuing those things, and not promoting consumption and furthering institutional and industrial control. In Illich’s words: "The crisis I have described confronts people with a choice between convivial tools and being crushed by machines.""
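The contrast Watters draws between Wikipedia’s single canonical entry and Ward Cunningham’s Smallest Federated Wiki can be sketched in code: each server holds its own copy of a page, forking copies it across servers along with its history, and no copy is forced to win. This is only an illustrative sketch with hypothetical hosts and classes, not the actual Smallest Federated Wiki implementation, which works differently in detail.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical sketch of the federated-wiki idea described above:
# pages live on independent servers, anyone can fork a page to their
# own server and edit it, and no single copy is canonical.

@dataclass
class Page:
    title: str
    body: str
    origin: Optional[str] = None          # server this copy was forked from, if any
    history: list[str] = field(default_factory=list)

@dataclass
class WikiServer:
    host: str
    pages: dict[str, Page] = field(default_factory=dict)

    def create(self, title: str, body: str) -> Page:
        """Start a page on this server; it is just one copy, not the official one."""
        page = Page(title, body, history=[f"created on {self.host}"])
        self.pages[title] = page
        return page

    def fork(self, other: "WikiServer", title: str) -> Page:
        """Copy another server's page locally; edits diverge from there."""
        src = other.pages[title]
        copy = Page(src.title, src.body, origin=other.host,
                    history=src.history + [f"forked to {self.host}"])
        self.pages[title] = copy
        return copy

    def edit(self, title: str, new_body: str) -> None:
        """Edit the local copy only; other servers' copies are untouched."""
        page = self.pages[title]
        page.body = new_body
        page.history.append(f"edited on {self.host}")

# Usage: two servers end up with divergent, equally valid copies.
alice = WikiServer("alice.example")
bob = WikiServer("bob.example")
alice.create("famous-dresses", "A stub about a famous wedding dress.")
bob.fork(alice, "famous-dresses")
bob.edit("famous-dresses", "A broader survey of famous dresses.")
```

Divergent copies coexist; consensus, if it ever arrives, comes from people choosing which fork to link to and build on, not from one entry being declared canonical.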
toolforconviviality  ivanillich  audreywatters  edtech  technology  education  2014  seymourpapert  logo  alankay  dynabook  mikecaufield  wardcunningham  web  internet  online  schools  teaching  progressive  wikipedia  smallestfederatedwiki  wikis  society  politics  policy  decentralization  surveillance  doxxing  gamergate  drm  startups  venturecapital  bigdata  neilpostman  paulofreire  paulgoodman  datapalooza  knewton  computers  computing  mindstorms  control  readwrite  everettreimer  1960s  1970s  jonathankozol  disruption  revolution  consensus  safety  bravery  courage  equity  freedom  justice  learning 
november 2014 by robertogreco
