
charlesarthur : society   18

The problems with risk assessment tools • The New York Times
Chelsea Barabas, Karthik Dinakar and Colin Doyle:
<p>Algorithmic risk assessments are touted as being more objective and accurate than judges in predicting future violence. Across the political spectrum, these tools have become the darling of bail reform. But their success rests on the hope that risk assessments can be a valuable course corrector for judges’ faulty human intuition.

When it comes to predicting violence, risk assessments offer more magical thinking than helpful forecasting. We and other researchers <a href="">have written a statement about the fundamental technical flaws</a> with these tools.

Risk assessments are virtually useless for identifying who will commit violence if released pretrial. Consider the pre-eminent risk assessment tool on the market today, the Public Safety Assessment, or P.S.A., adopted in New Jersey, Kentucky and various counties across the country. In these jurisdictions, the P.S.A. assesses every person accused of a crime and flags them as either at risk for “new violent criminal activity” or not. A judge sees whether the person has been flagged for violence and, depending on the jurisdiction, may receive an automatic recommendation to release or detain.

Risk assessments’ simple labels obscure the deep uncertainty of their actual predictions. Largely because pretrial violence is so rare, it is virtually impossible for any statistical model to identify people who are more likely than not to commit a violent crime.

The P.S.A. predicts that 92% of the people that the algorithm flags for pretrial violence will not get arrested for a violent crime. The fact is, a vast majority of even the highest-risk individuals will not commit a violent crime while awaiting trial.</p>

The trio of authors are experts in the topic, based at MIT and Harvard, and note that "There are more legally innocent people behind bars in America today than there were convicted people in jails and prisons in 1980."
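The base-rate problem the authors describe can be made concrete with a toy calculation. The numbers below are illustrative assumptions, not the P.S.A.'s actual parameters: when the outcome being predicted is rare, even a decent classifier's flags are overwhelmingly false positives.

```python
# Toy base-rate calculation: why flags for a rare outcome are mostly
# false positives. All numbers are illustrative assumptions, NOT the
# P.S.A.'s actual parameters.

def positive_predictive_value(base_rate, sensitivity, specificity):
    """Fraction of flagged people who actually go on to the outcome."""
    true_pos = base_rate * sensitivity
    false_pos = (1 - base_rate) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# Assume pretrial violent arrest is rare (2% of defendants) and the
# tool is a reasonable classifier (60% sensitivity, 86% specificity).
ppv = positive_predictive_value(base_rate=0.02, sensitivity=0.60, specificity=0.86)
print(f"Share of flagged people later arrested for violence: {ppv:.0%}")
# With these assumptions the flag is wrong roughly 9 times in 10,
# in the same ballpark as the ~92% figure quoted above.
```

The point is structural, not about any one tool: with a 2% base rate, no realistic sensitivity/specificity combination gets the flag's accuracy anywhere near a coin flip.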
machinelearning  society  prison 
7 weeks ago by charlesarthur
No flights, a four-day week and living off-grid: what climate scientists do at home to save the planet • The Guardian
Alison Green is one of many academics interviewed for this piece:
<p> In July 2018, I came across Prof Jem Bendell’s <a href="">Deep Adaptation paper</a>, which was going viral online. Here was someone with credibility and a good track record who, having studied the science, was saying that we’re no longer looking at mitigation, we’re looking at adaptation; that societal collapse is inevitable.

People are starting to talk about the kind of spiritual awakening you get in these situations: an “ecophany”. I concluded that banging on about climate change on social media was not enough, and became involved with grassroots activism. Being a vice-chancellor no longer meant anything to me. I gave up my career, and I’m so much happier as a result. Now I talk at conferences and events about the need for urgent action and I have taken part in direct actions with Extinction Rebellion, including the closing of five London bridges last November and speaking in Parliament Square during the April rebellion.

The science shows that societal collapse could be triggered by any one of a number of things, and once triggered, it could happen quite quickly. I suppose I’m being protective towards my four children, aged between 16 and 24, but in the event, I feel I need to be somewhere where I’m growing my own food, living in an eco-house, trying to live off-grid. It would give me some security; I don’t feel secure where I live in Cambridge at the moment – I’m concerned by thoughts like, “What would happen if I turned the tap on and there was no water?”. On our current trajectory, cities will not necessarily be safe places in the future – possibly within my own lifetime, certainly within my children’s.</p>

Societal collapse. Just a phrase to roll around your head.
society  globalheating  globalwarming  climatecrisis  climatechange 
10 weeks ago by charlesarthur
The '3.5% rule': how a small minority can change the world • BBC Future
David Robson:
<p>In 2003, the people of Georgia ousted Eduard Shevardnadze through the bloodless Rose Revolution, in which protestors stormed the parliament building holding the flowers in their hands.

Earlier this year, the presidents of Sudan and Algeria both announced they would step aside after decades in office, thanks to peaceful campaigns of resistance.  

In each case, civil resistance by ordinary members of the public trumped the political elite to achieve radical change.

There are, of course, many ethical reasons to use nonviolent strategies. But compelling research by Erica Chenoweth, a political scientist at Harvard University, confirms that civil disobedience is not only the moral choice; it is also the most powerful way of shaping world politics – by a long way.
Looking at hundreds of campaigns over the last century, Chenoweth found that nonviolent campaigns are twice as likely to achieve their goals as violent campaigns. And although the exact dynamics will depend on many factors, she has shown it takes around 3.5% of the population actively participating in the protests to ensure serious political change.

Chenoweth’s influence can be seen in the recent Extinction Rebellion protests, whose founders say they have been directly inspired by her findings. So just how did she come to these conclusions?</p>

3.5% of the UK's 63m population would be 2.2m people; of the US's 330m would be 11.55m, though if you're only talking adults, then it's smaller: 1.8m and 8m. Which leads us on to the next link…
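The arithmetic above is easy to check. A quick sketch (population figures in millions; the adult counts are rough assumptions, not official statistics):

```python
# Quick check of the 3.5% arithmetic (populations in millions).
RULE = 0.035

uk_total, us_total = 63, 330    # figures used in the text
uk_adults, us_adults = 52, 230  # rough adult populations (assumption)

for label, pop in [("UK total", uk_total), ("US total", us_total),
                   ("UK adults", uk_adults), ("US adults", us_adults)]:
    print(f"3.5% of {label} ({pop}m): {RULE * pop:.2f}m")
# UK total -> ~2.2m, US total -> ~11.55m, adults -> ~1.8m and ~8m
```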
society  revolution 
may 2019 by charlesarthur
How do we go on? • ANU Science
Tabitha Carvan on how to deal with climate despair - the feeling that nothing you can do will make a difference:
<p>“The neoliberal economic system we’ve bought into is completely at odds with how the Earth works,” Professor Will Steffen continues. “We have to change this value system that we operate under. We need a social tipping point that flips our thinking, before we reach a tipping point in the climate system.

“I think Greta Thunberg could turn out to be that tipping element.”

But Greta, the sixteen-year-old Swedish activist, hasn’t made a dent on the problem, I say.

“Not yet,” Steffen says. “The thing about a complex system, like our societies, is they are hard to predict because they’re highly non-linear. It’s not simple cause and effect. The state of the system – that is, the neoliberal economic system and our use of fossil fuels – seems so set, so stable, so tough, that nothing’s going to affect it. But it’s getting eroded from underneath - by the students, by legal battles, by increasing extreme weather events.

“Where you have a lot of people waking up and saying, ‘Something isn’t right’, that could be the kind of fundamental thing we need to reach the tipping point. It’s not just the students. I think more people are beginning to sense that too. For the first time, I’m seeing old white men in the bush saying something is changing there too.

“I’m not saying we’re now going to solve climate change but I’m saying we are getting to a point where reaching that kind of social tipping point is our only hope. The solutions are already there. It’s the system that’s preventing it.”</p>
climatechange  society  economics 
may 2019 by charlesarthur
Privacy rights and data collection in a digital economy • Idle Words
Maciej Ceglowski, who runs the Pinboard service but is also one of the clearest thinkers on the state of the internet, gave evidence last week to the US Congress. As you'd expect, it's a must-read:
<p>Until recently, even people living in a police state could count on the fact that the authorities didn’t have enough equipment or manpower to observe everyone, everywhere, and so enjoyed more freedom from monitoring than we do living in a free society today. [Note: The record for intensive surveillance in the pre-internet age likely belongs to East Germany, where by some estimates one in seven people was an informant.]

A characteristic of this new world of ambient surveillance is that we cannot opt out of it, any more than we might opt out of automobile culture by refusing to drive. However sincere our commitment to walking, the world around us would still be a world built for cars. We would still have to contend with roads, traffic jams, air pollution, and run the risk of being hit by a bus.

Similarly, while it is possible in principle to throw one’s laptop into the sea and renounce all technology, it is no longer possible to opt out of a surveillance society.

When we talk about privacy in this second, more basic sense, the giant tech companies are not the guardians of privacy, but its gravediggers.

The tension between these interpretations of what privacy entails, and who is trying to defend it, complicates attempts to discuss regulation.

Tech companies will correctly point out that their customers have willingly traded their private data for an almost miraculous collection of useful services, services that have unquestionably made their lives better, and that the business model that allows them to offer these services for free creates far more value than harm for their customers.

Consumers will just as rightly point out that they never consented to be the subjects in an uncontrolled social experiment, that the companies engaged in reshaping our world have consistently refused to honestly discuss their business models or data collection practices, and that in a democratic society, profound social change requires consensus and accountability.</p>
Surveillance  society  google  facebook 
may 2019 by charlesarthur
Global warning: Nick Harkaway on Gnomon • Waterstones
Nick Harkaway's novel Gnomon began in 2014 as a rumination on a surveillance society, but reality is starting to outpace its author:
<p>there’s an outer ring of surveillance which has also emerged over the last few years, and which can do similar tricks: the ring of data. We leave a trail through the world, of Internet history and store cards and credit cards and Oyster cards. Your supermarket knows you’re worried about staying healthy because you buy vitamins; they know you’re trying to get pregnant because you’ve changed your purchase; that you’ve succeeded; that it’s a boy; that it’s a girl; that it’s twins. Or they know to a reasonable percentage of certainty, and their model of you changes with those assumptions so that you get offered different things. So that they can persuade you to buy things. So that they can, to some degree, control your choices.

So that you can be "ground honest" [the purpose of the panopticon] - or in this case, ground into buying a more expensive brand of formula milk. Or trainers. Or birth control. Whatever it is that you want, they know you want it - sometimes before you do.

The inner and outer layers of surveillance - the brain and the cloud - give away intimate secrets. They allow the state and the commercial sector to know things which, if someone were simply watching you with a long lens, you would consider grossly inappropriate and probably criminal.

And these things are in their infancy. They have barely begun to take hold. A decade ago we swam in a sea of chaotic data and our minds were opaque. The day after tomorrow we’ll be, effectively, in a transparent glass tank, and our minds will be legible. Employers - already keen to watch workers in the workplace both physically and digitally - will begin to ask you to sit for direct assessments. Are you loyal, enthused, considering a move? Are you thinking of joining a union? Starting one? Are you a troublemaker? Are your values in line with the company’s?

Before you say "that will never happen," stop and understand that to a great extent it already does in many industries, just without the new technology to make it more straightforward.</p>
Surveillance  society 
july 2018 by charlesarthur
We are all public figures now • Ella Dawson
<p>I don’t think there is any such thing as a “private person” anymore. The vast majority of us constantly groom our internet presence, choosing the right filter on Instagram for our brunch and taking polls of our friends about our next Facebook profile picture. We don’t think about this as a public act when we have only 400 connections on LinkedIn or 3,000 followers on Tumblr. No one imagines the Daily Mail write-up or the Jezebel headline. We actively create our public selves, every day, one social media post at a time. Little kids dream of becoming famous YouTubers the same way I wanted to be a published author when I was twelve.

But there are also those of us who don’t choose this. We keep our accounts locked, our Instagram profile set to “friends only.” Maybe we learned a lesson when a post took off and left the safe haven of our community, picked apart in a horrifying display of context collapse. Maybe we are hiding from something: a stalker, an abusive ex, our family members who don’t know our true queer identity. To some of us, privacy is as vital as oxygen. Without it we are exposed—butterflies with our wings pinned to the corkboard, our patterns scrutinized under a magnifying glass. For what? For entertainment? For someone else’s mid-workday escapism? For a starring role in someone else’s bastardized rom com?

A woman boarded a plane in New York and stepped off that plane in Dallas. She chatted with a stranger, showed him some family photos, brushed his elbow with her own. She wore a baseball cap over her face and followed him back on Instagram. At no point did she agree to participate in the story Rosey Blair was telling. After the fact, when the hunt began and the woman took no part in encouraging it the way Holden did, Blair tweeted a video in which she drawled, “We don’t have the gal’s permish yet, not yet y’all, but I’m sure you guys are sneaky, you guys might…”</p>
privacy  behaviour  society  socialwarming 
july 2018 by charlesarthur
Apple's Airpods are an omen • The Atlantic
Ian Bogost:
<p>The AirPods do look a little ridiculous. White sprouts hang down an inch below the ears where the cords would attach. Those with longer hair, like me, can obscure them partially, at least, for the time being. But eventually it won’t matter, as people will get used to everyone having wireless buds stuck in their heads. Not like they’re used to wired earbuds, in the train or on the sidewalk or at the dog park. No, more like they’re used to people staring at phones all the time, anywhere. The earbuds won’t disappear, just like the smartphones haven’t. But they will become invisible as they become ubiquitous. Human focus, already ambiguously cleft between world and screen, will become split again, even when maintaining eye contact.

There are some consequences to this scenario, if it plays out. For one, earbuds will cease to perform any social signaling whatsoever. Today, having one’s earbuds in while talking suggests that you are on a phone call, for example. Having them in while silent is a sign of inner focus—a request for privacy. That’s why bothering someone with earbuds in is such a social faux-pas: They act as a do-not-disturb sign for the body. But if AirPods or similar devices become widespread, those cues will vanish. Everyone will exist in an ambiguous state between public engagement with a room or space and private retreat into devices or media.

The smartphone’s own excesses might accelerate the matter. In Georgia, where I live, a new law intended to reduce distracted driving goes into effect on July 1. The law prohibits holding a phone while driving. There are exceptions, including operating a mapping app, but ambiguities of actual use (and fears that police might use it as an excuse for citing other infractions) might push more drivers to newer, better hands-free options. AirPods are expensive, but they’re a lot cheaper than traffic infractions or insurance hikes.</p>

I used the headline from the web page itself, rather than the header text - "Are Apple's AirPods any good?", which is an absurd bit of clickbaity nonsense. Bogost is posing a bigger question: what happens when you can't tell if someone is paying attention to you or not? It used to be that someone walking alone down the street talking aloud was unhinged. Now, it's more likely they're on the phone. Social judgement shifts. Technology shapes society.
apple  airpods  society 
june 2018 by charlesarthur
A simple theory of Moore's Law and social media • Marginal REVOLUTION
Tyler Cowen, here in part:
<p>6. Consider a second distinction, namely between people who are too sensitive to social information, and people who are relatively insensitive to social information.  A quick test of this one is to ask how often a person’s tweets (and thoughts) refer to the motivations, intentions, or status hierarchies held by others.  Get the picture?  (Here is an A+ example.)

7. People who are overly sensitive to social information will be driven to distraction by Twitter.  They will find the world to be intolerably bad.  The status distinctions they value will be violated so, so many times, and in a manner which becomes common knowledge.  And they will perceive what are at times the questionable motives held by others.  Twitter is like negative catnip for them.  In fact, they will find it more and more necessary to focus on negative social information, thereby exacerbating their own tendencies toward oversensitivity.

8. People who are not so sensitive to social information will pursue social media with greater equanimity, and they may find those media productivity-enhancing.  Nevertheless they will become rather visibly introduced to a relatively new category of people for them — those who are overly sensitive to social information.  This group will become so transparent, so in their face, and also somewhat annoying.  Even those extremely insensitive to social information will not be able to help perceiving this alternate approach, and also the sometimes bad motivations that lie behind it.  The overly sensitive ones in turn will notice that another group is under-sensitive to the social considerations they value.  These two groups will think less and less of each other.  The insensitive will have been made sensitive.  It’s like playing “overrated vs. underrated” almost 24/7 on issues you really care about, and which affect your own personal status.

9. The philosophy of Stoicism will return to Silicon Valley.  It will gain adherents but fail, because the rest of the system is stacked against it.

10. The socially sensitive, very smart people will become the most despairing, the most manipulated, and the most angry.  The socially insensitive will either jump ship into the camp of the socially sensitive, or they will cultivate new methods of detachment, with or without Stoicism.  Straussianism will compete with Stoicism.</p>

An excellent analysis. (There are 13 points in all.) Though I think it's Metcalfe's law, relating to networks, that's more relevant than Moore's.
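The Metcalfe point is worth making concrete: the value (and friction) of a network scales with the number of possible connections, roughly n², rather than with raw capacity the way Moore's law describes. A minimal sketch:

```python
# Metcalfe's law: a network of n users has ~n^2 possible pairwise
# connections, so social dynamics intensify far faster than the
# user count grows.

def metcalfe_value(n):
    """Number of possible pairwise connections among n users."""
    return n * (n - 1) // 2

for users in (10, 100, 1000):
    print(f"{users} users -> {metcalfe_value(users)} possible connections")
# A tenfold increase in users yields roughly a hundredfold increase
# in connections - which is Cowen's point about social information
# overload, restated as arithmetic.
```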
socialmedia  society 
december 2017 by charlesarthur
The despair of learning that experience no longer matters • The New Yorker
Benjamin Wallace-Wells:
<p>The return to experience is a way to describe what you get in return for aging. It describes the increase in wages that workers normally see throughout their careers. The return to experience tends to be higher for more skilled jobs: a doctor might expect the line between what she earns in her first year and what she earns in her fifties to rise in a satisfyingly steady upward trajectory; a coal miner might find it depressingly flat. But even workers with less education and skills grow more efficient the longer they hold a job, and so paying them more makes sense. Unions, in arguing for pay that rises with seniority, invoke a belief in the return to experience. It comes close to measuring what we might otherwise call wisdom.

“This decline in the return to experience closely matches the decline in attachment to the labor force,” Case and Deaton wrote. “Our data are consistent with a model in which the decline in real wages led to a reduction in labor force participation, with cascading effects on marriage, health, and mortality from deaths of despair.”

The return to experience is not the best-known economic concept, but it is alive in most of our contemporary economic spook stories, in which the callow private-equity analyst has the final power over an industry in which people have long labored, in which the mechanical robot replaces the assembly-line worker, in which the doctor finds his diagnosis corrected by artificial intelligence. It seemed to match at least one emotional vein that ran through the Trump phenomenon, and the more general alienation of the heartland: people are aging, and they are not getting what they think they have earned.</p>

Not just Trumpland, I think.
society  work  labour  experience 
april 2017 by charlesarthur
Parable of the Polygons - a playable post on the shape of society
Neat visualisation by Vi Hart and Nicky Case: a playable system which shows what happens to a society when people are only a tiny bit racist - sorry, shapist.
games  racism  society 
march 2017 by charlesarthur
Who will command the robot armies? • Idle Words
Maciej Ceglowski, from a talk he gave in Australia:
<p>What both these places [Dubai and Singapore] have in common is that they had some kind of plan. As Walter Sobchak put it, say what you will about social control, at least it's an ethos.

The founders of these cities pursued clear goals and made conscious trade-offs. They used modern technology to work towards those goals, not just out of a love of novelty.

We [in the US], on the other hand, didn't plan a thing.

We just built ourselves a powerful apparatus for social control with no sense of purpose or consensus about shared values.

Do we want to be safe? Do we want to be free? Do we want to hear valuable news and offers?

The tech industry slaps this stuff together in the expectation that the social implications will take care of themselves. We move fast and break things.

Today, having built the greatest apparatus for surveillance in history, we're slow to acknowledge that it might present some kind of threat.</p>

He's never less than thought-provoking, and some of the passing jokes are excellent.
surveillance  society 
november 2016 by charlesarthur
Ageing out of the 25-34 bracket, one app at a time •
Lisa Pollack:
<p>Hardware and software previously used with enthusiasm has become an annoyance. New apps are passing me by. And this isn’t only about not downloading Pokémon Go, thus missing out on the delights of walking into lampposts while trying to catch a Pikachu (which is, I hasten to add, the only character name I know).

Consequently I’m beginning to suspect that, like a child counting the years in notches marking their height, I will increasingly count mine by the number of social media networks that I don’t understand. Already a couple of years ago, a girl I was mentoring tried to tell me about the website “”. The explanation was as arduous for me as I suspect my tutoring on simultaneous equations was for her. I still don’t entirely get it. More recently, my clumsy attempts at understanding and using Snapchat ended in befuddlement. I couldn’t even figure out how to add my contacts and yet almost 50m people have watched the Olympics on it. (In the US, by the way, the app reaches 41 per cent of that existential-crisis-inducing 18-34 age bracket.)

This newfound tech ineptitude is particularly disturbing for someone who is, by and large, an informal tech support colleague in the office. Have a problem with a spreadsheet? Need to connect your computer to a printer? Want to know the best way to get screenshots into presentations or how smartpens work? Then chances are, you’ve emailed me.

In the last four months though, that email will have gone to the solitary computer screen on my desk. Once upon a time, I thought that having six monitors, like a trader at a bank, was the coolest thing ever. Now a second screen stands unused to the side. I’ve even reverted to having a paper to-do list where once it was all online. “When I was your age, I used to use TweetDeck!” I want to shout to selfie-posing Snapchatterers. Because then they’d realise I was once like them. Right?</p>
technology  society 
august 2016 by charlesarthur
Tesla’s dubious claims about autopilot’s safety record • Technology Review
Tom Simonite:
<p>Tesla and Musk’s message is clear: the data proves Autopilot is much safer than human drivers. But experts say those comparisons are worthless, because the company is comparing apples and oranges.

“It has no meaning,” says Alain Kornhauser, a Princeton professor and director of the university’s transportation program, of Tesla’s comparison of U.S.-wide statistics with data collected from its own cars. Autopilot is designed to be used only for highway driving, and may well make that safer, but standard traffic safety statistics include a much broader range of driving conditions, he says.

Tesla’s comparisons are also undermined by the fact that its expensive, relatively large vehicles are much safer in a crash than most vehicles on the road, says Bryant Walker Smith, an assistant professor at the University of South Carolina. He describes comparisons of the rate of accidents by Autopilot with population-wide statistics as “ludicrous on their face.” Tesla did not respond to a request asking it to explain why Musk and the company compare figures from very different kinds of driving.</p>

As Ben Thompson also pointed out in his Stratechery newsletter, the fact that Tesla opened its blogpost about this death <em>significantly caused by its technology</em> with statistics, rather than an expression of empathy for the dead person and those affected, is an indictment of its tone-deafness.
selfdrivingcar  tesla  society 
july 2016 by charlesarthur
We’re using technology wrong - how easy wins over better • LinkedIn
Tom Goodwin:
<p>From Twitter for customer service, a way to very quickly be totally unhelpful to people and give them a phone number to call, to Amazon, the world’s easiest way to not necessarily pay the best price, we’re taking average-quality pictures on phones, carelessly but it’s better than having to think — and a filter will probably make it good and if we take enough one will be OK. We’re consistently choosing sites, apps and experiences that are just about good enough, but totally easy.

It’s not totally new. We’ve done it for years with fast food, the easy but poor choice for decades. IKEA made billions on the back of “it will do,” but what was once exceptional moments have become the general pillar for life. Rather than looking at the channel guide or recording a show, or asking a friend, we’re watching the cute rabbit clip, the man falling down a hole or how did that not break her legs fed to us without a click by Facebook, YouTube or Snapchat.

We’re rushing around museums to find the money shot to put on Instagram, running around Barcelona to broadcast the cool Gaudi thing and checking into Soho house, life has become orienteering for status.

We’ve become passive, our lives endless easy experiences that don’t touch the sides. Where does it end? Will self-driving cars mean we no longer care about car performance? Will we be so transfixed by our phones that we don’t need to choose the posh hotel, or maybe even bother to fly somewhere? Will our lives become products we subscribe to on Amazon so we don’t need to think, music fed to us by software and shows that autostart?</p>

The pursuit of "better" rather than "easier" is surprisingly hard, and usually - counterintuitively - more expensive. (You'd expect "easier" would have a price attached.)
june 2016 by charlesarthur
The worst thing I read this year, and what it taught me… or Can we design sociotechnical systems… • Medium
Ethan Zuckerman:
<p>I found Shane Snow’s essay on prison reform — “<a href="">How Soylent and Oculus Could Fix the Prison System</a>” — through hatelinking. Friends of mine hated the piece so much that normally articulate people were at a loss for words.

<blockquote class="twitter-tweet" data-lang="en"><p lang="en" dir="ltr">A real person thought it would be a good idea to write this and post it on the Internet. <a href=""></a></p>&mdash; Susie Cagle (@susie_c) <a href="">January 30, 2016</a></blockquote>

With a recommendation like that, how could I pass it up? And after reading it, I tweeted my astonishment to Susie, who told me, “I write comics, but I don’t know how to react to this in a way that’s funny.” I realized that I couldn’t offer an appropriate reaction in 140 characters either. The more I think about Snow’s essay, the more it looks like the outline for a class on the pitfalls of solving social problems with technology, a class I’m now planning on teaching this coming fall.</p>

Zuckerman wonders "is it possible to get beyond both a naïve belief that the latest technology will solve social problems and a reaction that rubbishes any attempt to offer novel technical solutions as inappropriate, insensitive and misguided?"
technology  society 
june 2016 by charlesarthur
What does your reaction to a robotic trash can say about you? » Atlas Obscura
Cara Giamo:
<p>Imagine you’re in a cafeteria, finishing up a bag of chips and chatting with some friends. You’re beginning to think about getting up to throw away your wrapper, when—suddenly—the nearest trash barrel approaches you instead. It rolls back and forth, and wiggles briefly. It is, it seems, at your service.

How do you respond?</p>

Like this: <iframe src="" width="100%" frameborder="0" webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe>
<p>The trash barrel has delivered some particularly unique insights. First of all, Sirkin and Ju say, it highlights how good people are at subtly refusing to acknowledge interactions they don’t want or need—a behavior the team has dubbed “unteracting.” If the trash barrel approaches a table of people, and they have no trash to give it, they generally won’t shoo it off. They’ll just steadfastly ignore it until it rolls away again. “They’re using their gaze as a tool for deciding when they’re engaging or not,” says Ju. (You can see this about halfway through the video, when a man on a cell phone refuses to look at the barrel until it backs off.)

On the other hand, people who did make use of the barrel felt miffed when it didn’t respond more. “People kind of expected it to thank them,” says Sirkin. “They’ll say ‘I fed the robot, and it didn’t thank me, and that was insulting.'” Some would also whistle for it, or dangle trash in front of it enticingly.</p>
robots  tech  society 
march 2016 by charlesarthur
Trump supporters aren’t stupid » Medium
Emma Lindsay with a terrific insight:
<p>Normally, when liberals talk about racism, they use “racist” as an end point. “Trump is racist” is, by itself, a reason not to vote for him, and “being racist” is an indicator of a person who is morally deficient.

But, if you don’t take this as an end point — if you instead ask “what do people get out of being racist?” — you’ll start to unravel the emotional motivations behind it. One of the best unpacking of this I have read is Matt Bruenig’s piece Last Place Avoidance and Poor White Racism. To summarize, no one wants to occupy the “last” place in society. No one wants to be the most despised. As long as racism remains intact, poor white people are guaranteed not to be “the worst.” If racism is ever truly dismantled, then poor white people will occupy the lowest rung of society, and the shame of occupying this position is very painful. This shame is so painful, that the people at risk of feeling it will vote on it above all other issues.</p>

And as she also points out, "America is <em>terrible</em> at giving its citizens dignity and meaning." This should be required reading in many places.
us  trump  society 
march 2016 by charlesarthur