
robertogreco : sarawachter-boettcher   5

No one’s coming. It’s up to us. – Dan Hon – Medium
"Getting from here to there

This is all very well and good. But what can we do? And more precisely, what “we”? There’s increasing acceptance of the reality that the world we live in is intersectional and we all play different and simultaneous roles in our lives. The society of “we” includes technologists who have a chance of affecting the products and services, it includes customers and users, it includes residents and citizens.

I’ve made this case above, but I feel it’s important enough to make again: at a high level, I believe that we need to:

1. Clearly decide what kind of society we want; and then

2. Design and deliver the technologies that move us ever closer to achieving that desired society.

This work is hard and, arguably, will never be completed. It necessarily involves compromise. Attitudes, beliefs, and what’s considered just change over time.

That said, the above are two high level goals, but what can people do right now? What can we do tactically?

What we can do now

I have two questions that I think can be helpful in guiding our present actions, in whatever capacity we might find ourselves.

For all of us: What would it look like, and how might our societies be different, if technology were better aligned to society’s interests?

At the most general level, we are all members of a society, embedded in existing governing structures. It certainly feels like in the recent past, those governing structures are coming under increasing strain, and part of the blame is being laid at the feet of technology.

One of the most important things we can do collectively is to produce clarity and prioritization where we can. Only by being clearer and more intentional about the kind of society we want and accepting what that means, can our societies and their institutions provide guidance and leadership to technology.

These are questions that cannot and should not be left to technologists alone. Advances in technology mean that encryption is a societal issue. Content moderation and censorship are a societal issue. Ultimately, it should be for governments (of the people, by the people) to set expectations and standards at the societal level, not organizations accountable only to a board of directors and shareholders.

But to do this, our governing institutions will need to evolve and improve. It is now easier, and faster, for platforms to react to changing social mores. For example, platforms responded to society’s outcry over “AI-generated fake porn” faster than governing and enforcing institutions could.

Prioritizations may necessarily involve compromise, too: the world is not so simple, and we are not so lucky, that it can be easily and always divided into A or B, or good or not-good.

Some of my perspective in this area is reflective of the schism American politics is currently experiencing. In a very real way, America, my adoptive country of residence, is having to grapple with revisiting the idea of what America is for. The same is happening in my country of birth with the decision to leave the European Union.

These are fundamental issues. Technologists, as members of society, have a point of view on them. But in the way that post-enlightenment governing institutions were set up to protect against asymmetric distribution of power, technology leaders must recognize that their platforms are now an undeniable, powerful influence on society.

As a society, we must do the work to have a point of view. What does responsible technology look like?

For technologists: How can we be humane and advance the goals of our society?

As technologists, we can be excited about re-inventing approaches from first principles. We must resist that impulse here, because there are things that we can do now, that we can learn now, from other professions, industries and areas to apply to our own. For example:

* We are better and stronger when we are together than when we are apart. If you’re a technologist, consider this question: what are the pros and cons of unionizing? As the product of a linked network, consider the question: what is gained and who gains from preventing humans from linking up in this way?

* Just as we create design patterns that are best practices, there are also patterns that are undesirable from our society’s point of view, known as dark patterns. We should familiarise ourselves with them, and each work to understand why and when they’re used and why their usage is contrary to the ideals of our society.

* We can do a better job of advocating for and doing research to better understand the problems we seek to solve, the context in which those problems exist and the impact of those problems. Only through disciplines like research can we discover in the design phase — instead of in production, when our work can affect millions — negative externalities or unintended consequences that we genuinely and unintentionally may have missed.

* We must compassionately accept the reality that our work has real effects, good and bad. We can wish that bad outcomes don’t happen, but bad outcomes will always happen because life is unpredictable. The question is what we do when bad things happen, and whether and how we take responsibility for those results. For example, Twitter’s leadership must make clear what behaviour it considers acceptable, and do the work to be clear and consistent without dodging the issue.

* In America especially, technologists must face the issue of free speech head-on without avoiding its necessary implications. I suggest that one of the problems culturally American technology companies (i.e., companies that seek to emulate American culture) face can be explained in software terms. To use agile user story terminology, the problem may be due to focusing on a specific requirement (“free speech”) rather than the full user story (“As a user, I need freedom of speech, so that I can pursue life, liberty and happiness”). Free speech is a means to an end, not an end, and accepting that free speech is a means involves the hard work of considering and taking a clear, understandable position as to what ends.

* We have been warned. Academics — in particular, sociologists, philosophers, historians, psychologists and anthropologists — have been warning of issues such as large-scale societal effects for years. Those warnings have, bluntly, been ignored. In the worst cases, those same academics have been accused of not helping to solve the problem. Moving on from the past, is there not something that we technologists can learn? My intuition is that in the wake of the 2016 American election, middle-class technologists are now afraid. We’re all in this together. Academics are reaching out, have been reaching out. We have nothing to lose but our own shame.

* Repeat to ourselves: some problems don’t have fully technological solutions. Some problems can’t just be solved by changing infrastructure. Who else might help with a problem? What other approaches might be needed as well?

There’s no one coming. It’s up to us.

My final point is this: no one will tell us or give us permission to do these things. There is no higher organizing power working to put systemic changes in place. There is no top-down way of nudging the arc of technology toward one better aligned with humanity.

It starts with all of us.

Afterword

I’ve been working on the bigger themes behind this talk since …, and an invitation to 2017’s Foo Camp was a good opportunity to try to clarify and improve my thinking so that it could fit into a five-minute lightning talk. It also helped that Foo Camp has the kind of (small, hand-picked — again, for good and ill) influential audience who would be a good litmus test for the quality of my argument, and would be instrumental in taking on and spreading the ideas.

In the end, though, I nearly didn’t do this talk at all.

Around 6:15pm on Saturday night, just over an hour before the lightning talks were due to start, after the unconference’s sessions had finished and just before dinner, I burst into tears talking to a friend.

While I won’t break the societal convention of confidentiality that helps an event like Foo Camp be productive, I’ll share this: the world felt too broken.

Specifically, the world felt broken like this: I had the benefit of growing up as a middle-class educated individual (albeit, not white) who believed he could trust that institutions were a) capable and b) would do the right thing. I now live in a country where a) the capability of those institutions has consistently eroded over time, and b) those institutions are now being systematically dismantled, to add insult to injury.

In other words, I was left with the feeling that there’s nothing left but ourselves.

Do you want the poisonous lead removed from your water supply? Your best bet is to try to do it yourself.

Do you want a better school for your children? Your best bet is to start it.

Do you want a policing policy that genuinely rehabilitates rather than punishes? Your best bet is to…

And it’s just. Too. Much.

Over the course of the next few days, I managed to turn my outlook around.

The answer, of course, is that it is too much for one person.

But it isn’t too much for all of us."
danhon  technology  2018  2017  johnperrybarlow  ethics  society  calltoaction  politics  policy  purpose  economics  inequality  internet  web  online  computers  computing  future  design  debchachra  ingridburrington  fredscharmen  maciejceglowski  timcarmody  rachelcoldicutt  stacy-marieishmael  sarahjeong  alexismadrigal  ericmeyer  timmaughan  mimionuoha  jayowens  jayspringett  stacktivism  georginavoss  damienwilliams  rickwebb  sarawachter-boettcher  jamebridle  adamgreenfield  foocamp  timoreilly  kaitlyntiffany  fredturner  tomcarden  blainecook  warrenellis  danhill  cydharrell  jenpahljka  robinray  noraryan  mattwebb  mattjones  danachisnell  heathercamp  farrahbostic  negativeexternalities  collectivism  zeyneptufekci  maciejcegłowski 
february 2018 by robertogreco
Sara Wachter-Boettcher | Talk: Design for Real Life
[video: https://vimeo.com/177313497 ]

"Lots of people have weird backgrounds and diverse backgrounds. And the thing is, all of us could have made those design mistakes. Any one of us could have had a scenario where we didn’t think about it, and we made an assumption, and we built it in. Because we’re so used to thinking about our target audience as some sort of narrow, easy-to-imagine thing, somebody we can picture, right? And to be honest, if you’re white and straight and cis—speaking as somebody who is—it’s really easy to imagine that the world is full of people like you. It’s really easy to imagine that, because, like, you see people like you all the time in your social circle and on TV. And it’s really easy to forget how diverse the world really is.

So we all have these blind spots. And the only way to change that, the only way to get around that, is to do the work. And to admit it, to own up to it and say, yeah—yeah, I have bias. And it’s my job to figure that out and do the best I can to get rid of it.

Because if we don’t, and if we don’t also do the work of making our teams and our industry more diverse, more welcoming to people who are different than us, then what we’ll start to do is we’ll start to build exclusion in. An interface that doesn’t support gay people or doesn’t support people of color leads to data that doesn’t represent gay people or doesn’t represent people of color. And that has a domino effect across an entire system.

And so I think back to that example with Google images, right, with their image recognition, and I think about the machine learning that people are really excited about—and should be, because it’s amazing—and I want to remind us all: machines learn from us. They’re really good at it, actually. So we have to be really careful about what we’re teaching them. Because they’re so good at learning from us, that if we teach them bias, they’ll perform bias exceptionally well. And that’s a job that I think all of us actually play a role in, even if it seems distant at this moment."



"Design for real life

But we can do that. I think we really do our best work when we take a moment and we say, how could this be used to hurt someone? How can we plan for the worst? And that’s what I mean when I talk about designing for real life, because real life is imperfect. Real life is biased. Real life can be harmful to people.

Real life has a hell of a lot of problems.

So what I want to leave you with today is one last story that shows just how much design and content can affect people, can affect what happens in their lives.

It actually takes it back offline to standardized tests. I’m sure many of you have taken tests like this in the past with the little Scantron; you fill in the bubbles. In the United States, we take the SATs—many people take the SATs toward the end of high school as a major part of their college entrance. It plays a huge role in where you might get in.

[40:00]

They have three parts: there’s reading, there’s math and there’s writing. Reading and math are done via this multiple-choice format.

Now, for a very long time, there have been some very big disparities in those scores across race and across gender. White students outscore black students by an average of 100 points on each of those exams. And this is not new. This is about the same margin—it’s been this way for decades. And for boys and girls, you also have this as well. It’s a smaller margin, but you’ve got a little bit of a difference in reading for boys versus girls, and then about a 30-point difference in math.

And what researchers have really started to show is that one of the reasons that this gap is not narrowing—despite all of these other indicators that you would think it might, like the number of women who are going to college and all that, right—it’s not narrowing, because the test is actually biased. Because Educational Testing Service, which is the people who write all the questions for the test, what they do is they pretest everything, so potential questions get pretested before they make it to an exam. What that does is it assumes in their testing process that “a ‘good’ question is one that students who score well overall tend to answer correctly, and vice versa.”

So what that means is that if a student who scores well on the current SAT, in the current system with the current disparities, if they tend to do well on this other question, then it’s a good question, and if they don’t, then it’s bad. “So if on a particular math question, girls outscore boys or blacks outscore whites, that question has almost no chance of making the final cut,” because what is happening is that process is perpetuating the disparity that already exists. It’s re-inscribing that disparity over and over again, because it’s making a test perform the same for the people it’s always performed well for, right? The people it was first made for in the ‘20s. People who went to college in the ‘20s, and ‘30s, and ‘40s, and ‘50s. Not the diversity of people who are in college now.
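The filtering mechanism described here can be sketched as a toy simulation. This is a deliberately simplified, hypothetical model with made-up numbers, not ETS’s actual procedure: two groups with an existing score gap, and a pretest filter that keeps a question only if the students who answer it correctly have higher scores on the current test.

```python
import random

random.seed(0)

# Two groups with an existing ~100-point disparity on the current test.
def current_score(group):
    base = 1000 if group == "A" else 900
    return base + random.gauss(0, 100)

students = [(g, current_score(g)) for g in ["A"] * 500 + ["B"] * 500]

def passes_pretest(question_favors):
    # A question "favors" one group: that group answers it correctly
    # more often. The filter keeps a question only if correctness
    # correlates with the current overall score.
    correct, incorrect = [], []
    for group, score in students:
        p_correct = 0.8 if group == question_favors else 0.5
        (correct if random.random() < p_correct else incorrect).append(score)
    avg = lambda xs: sum(xs) / len(xs)
    return avg(correct) > avg(incorrect)

kept_favoring_A = sum(passes_pretest("A") for _ in range(200))
kept_favoring_B = sum(passes_pretest("B") for _ in range(200))
print(kept_favoring_A, kept_favoring_B)
```

In this model, nearly every candidate question that favors the already-higher-scoring group survives the pretest, and nearly every question that favors the lower-scoring group is discarded: the filter reproduces whatever gap the current test already has.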

And I tell this story, because this is design, and this is content. What is a test like that, besides content, the questions, and an interface with which a student actually answers it, the test itself? This is what happens when we assume that our work is neutral, when we assume that the way that things have been doesn’t have bias already embedded in it. We allow the problems of the past to re-inscribe themselves over and over again.

And that’s why I think that this is us. This is our work. This is not just the work of, you know, super technical folks, who are involved with AI. This is all of us.

Because ultimately, what we put into interfaces, the way that we design them, what the UI copy says, they affect how people answer questions. They affect what people tell us. They affect how people see themselves. So whether you’re writing a set of questions that a defendant has to fill out that’s going to get them rated as a risk for criminal recidivism, or you’re just explaining how to use a form or establishing default account settings, the interface will affect the input that you get. And the input is going to affect the outcome for users. For people.

The outcomes define norms: what’s perceived as normal and average in our society, the way that we see people. Who counts.

What this means is that design has a lot of power. More power, I think, than we sometimes realize. More power than we sometimes want to believe as we’re sort of like squabbling in our companies about whether we’re being invited to the right meetings. There’s a fundamental truth that design has a lot of power.

And so the question is not whether we have power, but how we’ll use it.

Do we want to design for real people, facing real injustice and real pain? Do we want to make the world a little fairer, a little calmer, and a little safer? Or are we comfortable looking the other way?

I’m not. And so I hope you’ll join me. Thank you."

[via: "Every interface decision encodes culture into the system. So what are we encoding? Video/transcript of my new talk:"
https://twitter.com/sara_ann_marie/status/771736431106678784 ]
bias  diversity  inclusion  inclusivity  sarawachter-boettcher  2016  ui  ux  interface  design  testing  standardization  standardizedtesting  sat 
september 2016 by robertogreco
We (Still) Have Work to Do · An A List Apart Blog Post
"So, what have we done? It’s a fair question, and one that’s worthy of a response. Because the answer is this: everything, and also not nearly enough.

Over the past year, we’ve started discussing inclusivity constantly, across every facet of our work—the authors we encourage, the messaging on our website, the people we invite to events, the way we edit articles, the topics we cover.

And yet, we screw up constantly. We cringe when we notice too late that we published an article with a biased example, or used words that defaulted to male. We struggle to include more people of color and non-native English speakers in our pages. We hear that our submissions copy feels alienating.

We’re trying. But what we haven’t been doing is talking about it publicly—because it takes time, yes, but also because it’s scary to lay bare all our decisions, discussions, half-baked ideas, and partially executed plans. It’s scary to say, “we don’t know all the answers, but here’s where we’ve started.”

That changes today."



"MORE INCLUSIVE EDITING

When we edit, we no longer just look for stuff that violates the style guide: website as one word, or 4g with a lowercase g. We also look for biases and non-inclusive language in the words our authors use, and we challenge them to come up with words that pack power without excluding readers.

It’s not black and white: reasonable people have conflicting opinions on the use of you guys, for example. And some things are so deeply embedded in our culture—like calling things crazy or insane—that it’s tough, at first, to even recognize that they’re problematic.

One change you may have noticed, if you’re as nerdy about words as we are, is our move to the singular they. Writing “he” or “she” is fine, if you’re talking about a person who goes by “he” or “she.” But when we talk about a person in general, or someone who doesn’t identify as male or female, they’re now a they.

The most important part of this process is that it’s just that: a process. We haven’t “fixed” our editing style. We’re just having an ongoing conversation that gets more nuanced with time—and that everyone on the team is encouraged to participate in.

Some people might find the prospect of hashing and rehashing language tedious (ugh, do we have to talk about this again?!). But I’ve found it incredibly rewarding, because every discussion forces me to challenge my beliefs and biases—and to be a little more willing to listen."



"We’re also actively reaching out to more prospective authors, and encouraging them to write—especially people of color and women who are just emerging in their fields. Oftentimes, these folks have viewpoints and ideas we haven’t heard before—but they’re more likely to think they’re not “experienced enough” to submit an article. There is no shortage of articles talking about why this happens. The problem is, many of those articles simply end up telling marginalized groups that they’re responsible for solving the problem: here’s the careful tightrope you need to walk in order to promote your ideas without coming off as “pushy,” they seem to say.

We’re not buying it. Women and people of color—and particularly women of color, who often feel sidelined by the largely white “women in tech” movement—already have enough to deal with in this field. The least we can do is put in some effort to reach out to them, rather than complaining that they don’t come to us."



"“So…” So? That tiny word sets a tone of disbelief—like we might as well have added “then prove it” at the end. And don’t get me started on those verbs: challenge, refute, revolutionize. Why are we being so aggressive? What about articles that help our community grow, learn, or improve?

We had good intentions here: we wanted to make readers feel like an ALA article was special—not just a post you whip out in an hour. But it wasn’t working. When I asked people whom I’d like to see submit what they thought, I got responses like, “sending something to ALA sounds scary,” or “that seems like a really big deal.”

Oof.

Writing publicly makes most people feel vulnerable, especially those who are just starting to put their ideas out there for the world—in other words, the very people we’re most interested in hearing from. You might get rejected. People might disagree with you. You might even get harassment or abuse for daring to speak up.

We can’t remove all the risks, but what we can do is offer a more nurturing message to new writers. We started by overhauling our contribute page—in fact, we renamed it Write for Us, with an aim of making the message a little more human."



"Inclusion is a practice

I wish I could say that all these changes have been easy for me. But wanting to be more inclusive and actually doing what it takes to be inclusive aren’t the same. Along the way, I’ve had to let go of some things I was comfortable with, and embrace things I was profoundly uncomfortable with.

For example: I hated the singular they for years. It just didn’t sound right. That’s not how subject-verb agreement works, dammit. Our columns editor, Rose, suggested we start using it forever ago. I vetoed the idea immediately. I edited it out of articles. I insisted authors rewrite examples to avoid it. I stuck to my she and he like they were divinely prescribed.

Only grammar isn’t gospel. It’s culture. Language changes constantly, adapting endlessly to meet the world’s new needs and norms. And that’s what we have right now: a cultural shift toward less gendered thinking, less binary thinking. I wanted the culture change without the language change.

I was wrong.

If someone has a problem with it, they can complain to me."
diversity  gender  language  inclusion  sarawachter-boettcher  alistapart  2015  grammar  workinginpublic  tone  communication  outreach  learning  growth  improvement  inlcusivity  inclusivity 
june 2015 by robertogreco
Sara Wachter-Boettcher | Personal Histories
"1. Ask only for what I need.
There are lots of reasons companies want data about their customers or users, but a good many of them come down to marketing: How can I gather information that helps me more effectively sell you things?

There’s a difference between nice-to-have and mission-critical information. And too often, we force users to provide things we really don’t need—things they might not even have, or don’t want to tell us.

We talk a lot about being user-centered in the way we design and write. But how often do we assess—truly assess—how much we need from a person for them to use our products or services? How often do we prioritize our dreams of better user data, more accurate profiles, more “personalization”?

2. Work on their clock, not mine.
It wasn’t a problem that the German government asked about my family members—I’m proving my nationality, after all. But it came as a surprise; it threw me somewhere I hadn’t intended to go right then, and it took me a couple minutes to regain my bearings and move on.

Paper doesn’t mind the wait, but websites often do: they make it impossible to start a form and then save it for later. They time out. They’re impatient as all hell.

I suspect it’s because our industry has long prioritized speed: the one-click purchase. The real-time update. The instant download. And speed is helpful quite often—who doesn’t want a page to load as fast as possible?

But speed doesn’t mean the same thing as ease.

Margot Bloomstein has spoken recently about slowing our content roll—about slowing down the pace of our content to help users have a more memorable and successful experience.

What if we looked at ways to optimize interactions not just for speed, but also for flexibility—for a user to be able to complete steps on their own terms? When might it help someone to be able to pause, to save their progress, to skip a question and come back to it at the end?

What would a more forgiving interface look like?
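One possible shape for such an interface, sketched as a minimal, hypothetical form model (not any real product’s API): answers are saved immediately in any order, questions can be skipped and returned to, and nothing ever times out.

```python
class ForgivingForm:
    """A form that works on the user's clock: partial answers persist,
    questions can be answered in any order, and skipping is allowed."""

    def __init__(self, questions):
        self.questions = list(questions)
        self.answers = {}  # question -> answer, filled in any order

    def answer(self, question, value):
        self.answers[question] = value  # saved immediately; no timeout

    def skip(self, question):
        self.answers.pop(question, None)  # come back to it later

    def remaining(self):
        return [q for q in self.questions if q not in self.answers]

    def complete(self):
        return not self.remaining()

form = ForgivingForm(["name", "family history", "gender"])
form.answer("name", "Sara")
# The user isn't ready to answer "family history" right now; that's fine.
print(form.remaining())  # → ['family history', 'gender']
```

The design choice is simply to make "pause and resume" the default state of the form rather than an error condition.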

3. Allow for complexity.
I didn’t need to explain my might-have-been older brother’s backstory to the German government. But in my doctor’s form, that complexity mattered to me—and a simple binary wasn’t nearly enough space for me to feel comfortable.

As interface-makers, what might seem simple to us could be anything but to our users. What can we do to allow for that complexity? Which what-ifs have we considered? What spaces do they create?

Take gender. I have qualms about many of Facebook’s practices, but they’ve done this well. Rather than a binary answer, you can now customize your gender however you’d like.

[image]

Facebook's gender selector showing Male, Female, and Custom options
You can also choose how you want to be addressed—as he, she, or they.

[image]

Facebook's pronoun selector showing they as the selected pronoun
We could call users who identify as something other than “male” or “female” an edge case—Why muck up our tidy little form fields and slow down the process to make space for them?

Or we could call them human.
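The shape that selector implies can be sketched as a small, hypothetical profile model (an illustration, not Facebook’s actual schema): gender is open text rather than a binary, and the pronoun used to address the person is a separate, explicit choice.

```python
from dataclasses import dataclass
from typing import Optional

# Pronouns the interface knows how to address someone with.
PRONOUNS = {"he", "she", "they"}

@dataclass
class Profile:
    # Free text: "Male", "Female", or anything the person chooses.
    gender: Optional[str] = None
    # How the interface addresses them; "they" is a neutral default.
    pronoun: str = "they"

    def set_pronoun(self, pronoun: str) -> None:
        if pronoun not in PRONOUNS:
            raise ValueError(f"unsupported pronoun: {pronoun}")
        self.pronoun = pronoun

p = Profile(gender="non-binary")
p.set_pronoun("they")
print(p.pronoun)  # → they
```

Decoupling the two fields means the form never has to force a person’s identity through a dropdown built for someone else.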

4. Communicate what happens next.
One of my favorite details in Facebook’s gender settings is that little alert message that pops up before you confirm a setting change:
Your preferred pronoun is Public and can be seen by anyone.

I don’t care who knows what my preferred pronoun is. But I’m not a trans teen trying to negotiate the complex public-private spaces of the internet. I’m not afraid of my parents’ or peers’ reactions. I’m lucky.

Whether it’s an immediate announcement to a user’s social circle that they’ve changed their status or a note in their file about sexual assault that every doctor will ask about forever, users deserve to know what happens when they enter information—where it goes, who will see it, and how it will be used.

5. Above all, be kind.
When you approach your site design with a crisis-driven persona, you WILL see things differently.

Eric Meyer

Most of us aren’t living the worst-case scenario most of the time. But everyone is living. And that’s often hard enough.

How would our words change if we were writing for someone in crisis? Would our language soften? Would we ask for less? Would we find simpler words to use, cut those fluffy paragraphs, get to the point sooner? Would we make it easier to contact a human?

Who else might that help?

Humility. Intention. Empathy. Clarity. These concepts are easy enough to understand, but they take work to get right. As writers and strategists and designers, that’s our job. It’s up to us to think through those what-ifs and recognize that, at every single moment—both by what we say and what we do not say—we are making communication choices that affect the way our users feel, the tenor of the conversation we’re having, the answers we’ll get back, and the ways we can use that information.

Most of the choices aren’t inherently wrong or right. The problem is when our intentions are fuzzy, our choices unacknowledged, their implications never examined."
design  interface  inclusion  accessibility  humility  difference  intention  empathy  clarity  communication  purpose  kindness  ux  contentstrategy  gender  content  2015  sarawachter-boettcher  privacy  complexity  binary  inlcusivity  inclusivity 
january 2015 by robertogreco
The Imperfectionist | Sara Wachter-Boettcher | Content Strategy Consulting
"Eventually I forced myself to write a blog post. Then another. And somehow people responded to the things I had been so petrified to write about: How to make content work for mobile, deal with the messy people problems behind most companies’ publishing workflows, and break down decades of document-centric thinking.

I still didn’t have the answers, though. I’d simply become an imperfectionist."
sarawachter-boettcher  impostorsyndrome  impostors  2013  writing  contentstrategy  expertise  workinginpublic  blogging  imperfectionists  imperfection  vulnerability  flaws  honesty  work  howwework 
june 2013 by robertogreco
