No one’s coming. It’s up to us. – Dan Hon – Medium
"Getting from here to there

This is all very well and good. But what can we do? And more precisely, what “we”? There’s increasing acceptance of the reality that the world we live in is intersectional and we all play different and simultaneous roles in our lives. The society of “we” includes technologists who have a chance of affecting the products and services, it includes customers and users, it includes residents and citizens.

I’ve made this case above, but I feel it’s important enough to make again: at a high level, I believe that we need to:

1. Clearly decide what kind of society we want; and then

2. Design and deliver the technologies that forever get us closer to achieving that desired society.

This work is hard and, arguably, will never be completed. It necessarily involves compromise. Attitudes, beliefs and what's considered just change over time.

That said, the above are two high level goals, but what can people do right now? What can we do tactically?

What we can do now

I have two questions that I think can be helpful in guiding our present actions, in whatever capacity we might find ourselves.

For all of us: What would it look like, and how might our societies be different, if technology were better aligned to society’s interests?

At the most general level, we are all members of a society, embedded in existing governing structures. It certainly feels that, in the recent past, those governing structures have come under increasing strain, and part of the blame is being laid at the feet of technology.

One of the most important things we can do collectively is to produce clarity and prioritization where we can. Only by being clearer and more intentional about the kind of society we want and accepting what that means, can our societies and their institutions provide guidance and leadership to technology.

These are questions that cannot and should not be left to technologists alone. Advances in technology mean that encryption is a societal issue. Content moderation and censorship are a societal issue. Ultimately, it should be for governments (of the people, by the people) to set expectations and standards at the societal level, not organizations accountable only to a board of directors and shareholders.

But to do this, our governing institutions will need to evolve and improve. Platforms can now react to changing social mores more easily and quickly than governments can. For example, platforms have responded to society's reaction to "AI-generated fake porn" faster than governing and enforcement institutions.

Prioritizations may necessarily involve compromise, too: the world is not so simple, and we are not so lucky, that it can be easily and always divided into A or B, or good or not-good.

Some of my perspective in this area is reflective of the schism American politics is currently experiencing. In a very real way, America, my adoptive country of residence, is having to grapple with revisiting the idea of what America is for. The same is happening in my country of birth with the decision to leave the European Union.

These are fundamental issues. Technologists, as members of society, have a point of view on them. But just as post-Enlightenment governing institutions were set up to protect against asymmetric distributions of power, technology leaders must recognize that their platforms are now an undeniable, powerful influence on society.

As a society, we must do the work to have a point of view. What does responsible technology look like?

For technologists: How can we be humane and advance the goals of our society?

As technologists, we get excited about re-inventing approaches from first principles. We must resist that impulse here, because there are things we can do now: lessons we can learn from other professions, industries and areas and apply to our own. For example:

* We are better and stronger when we are together than when we are apart. If you’re a technologist, consider this question: what are the pros and cons of unionizing? As the product of a linked network, consider the question: what is gained and who gains from preventing humans from linking up in this way?

* Just as we create design patterns that are best practices, there are also patterns that are undesirable from our society's point of view, known as dark patterns. We should familiarise ourselves with them, work to understand why and when they're used, and understand why their usage is contrary to the ideals of our society.

* We can do a better job of advocating for and doing research to better understand the problems we seek to solve, the context in which those problems exist and the impact of those problems. Only through disciplines like research can we discover in the design phase — instead of in production, when our work can affect millions — negative externalities or unintended consequences that we genuinely and unintentionally may have missed.

* We must compassionately accept the reality that our work has real effects, good and bad. We can wish that bad outcomes don’t happen, but bad outcomes will always happen because life is unpredictable. The question is what we do when bad things happen, and whether and how we take responsibility for those results. For example, Twitter’s leadership must make clear what behaviour it considers acceptable, and do the work to be clear and consistent without dodging the issue.

* In America especially, technologists must face the issue of free speech head-on, without avoiding its necessary implications. I suggest that one of the problems faced by culturally American technology companies (i.e., companies that seek to emulate American culture) can be explained in software terms. To use agile user story terminology, the problem may be due to focusing on a specific requirement ("free speech") rather than the full user story ("As a user, I need freedom of speech, so that I can pursue life, liberty and happiness"). Free speech is a means to an end, not an end in itself, and accepting that free speech is a means involves the hard work of considering, and taking a clear, understandable position on, what those ends are.

* We have been warned. Academics — in particular, sociologists, philosophers, historians, psychologists and anthropologists — have been warning of issues such as large-scale societal effects for years. Those warnings have, bluntly, been ignored. In the worst cases, those same academics have been accused of not helping to solve the problem. Moving on from the past, is there not something that we technologists can learn? My intuition is that after the 2016 American election, middle-class technologists are now afraid. We're all in this together. Academics are reaching out, and have been reaching out for some time. We have nothing to lose but our own shame.

* Repeat to ourselves: some problems don’t have fully technological solutions. Some problems can’t just be solved by changing infrastructure. Who else might help with a problem? What other approaches might be needed as well?

There’s no one coming. It’s up to us.

My final point is this: no one will tell us or give us permission to do these things. There is no higher organizing power working to put systemic changes in place. There is no top-down way of nudging the arc of technology toward one better aligned with humanity.

It starts with all of us.


I’ve been working on the bigger themes behind this talk since …, and an invitation to 2017’s Foo Camp was a good opportunity to try to clarify and improve my thinking so that it could fit into a five minute lightning talk. It also helped that Foo Camp has the kind of (small, hand-picked — again, for good and ill) influential audience who would be a good litmus test for the quality of my argument, and would be instrumental in taking on and spreading the ideas.

In the end, though, I nearly didn’t do this talk at all.

Around 6:15pm on Saturday night, just over an hour before the lightning talks were due to start, after the unconference’s sessions had finished and just before dinner, I burst into tears talking to a friend.

While I won’t break the societal convention of confidentiality that helps an event like Foo Camp be productive, I’ll share this: the world felt too broken.

Specifically, the world felt broken like this: I had the benefit of growing up as a middle-class educated individual (albeit, not white) who believed he could trust that institutions were a) capable and b) would do the right thing. I now live in a country where a) the capability of those institutions has consistently eroded over time, and b) those institutions are now being systematically dismantled, to add insult to injury.

In other words, I was left with the feeling that there’s nothing left but ourselves.

Do you want the poisonous lead removed from your water supply? Your best bet is to try to do it yourself.

Do you want a better school for your children? Your best bet is to start it.

Do you want a policing policy that genuinely rehabilitates rather than punishes? Your best bet is to…

And it’s just. Too. Much.

Over the course of the next few days, I managed to turn my outlook around.

The answer, of course, is that it is too much for one person.

But it isn’t too much for all of us."
danhon  technology  2018  2017  johnperrybarlow  ethics  society  calltoaction  politics  policy  purpose  economics  inequality  internet  web  online  computers  computing  future  design  debchachra  ingridburrington  fredscharmen  maciejceglowski  timcarmody  rachelcoldicutt  stacy-marieishmael  sarahjeong  alexismadrigal  ericmeyer  timmaughan  mimionuoha  jayowens  jayspringett  stacktivism  georginavoss  damienwilliams  rickwebb  sarawachter-boettcher  jamebridle  adamgreenfield  foocamp  timoreilly  kaitlyntiffany  fredturner  tomcarden  blainecook  warrenellis  danhill  cydharrell  jenpahljka  robinray  noraryan  mattwebb  mattjones  danachisnell  heathercamp  farrahbostic  negativeexternalities  collectivism  zeyneptufekci  maciejcegłowski 
12 days ago by robertogreco
s5e06: Filtered for wide spread, low accuracy
Which comes back to the idea that sufficiently non-transparent advanced algorithms are indistinguishable from malice.
danhon  státusz  facebook  internet 
5 weeks ago by kelt
s4e10: Stages of Transformation
Dan Hon: "It doesn't matter *how* we got here, all that matters is that we're *here*. For those people who were involved in getting to our current location - maybe they had written policy, maybe they had been involved with the design or management of certain processes - this isn't a judgment on them or their work. We just have to accept that we're here now, without blame or judgment."
*  transformation  government  danhon  yes 
april 2017 by gilest
Dan Hon s3e30: The Generative Surplus
Dan on neural networks and generative stuff. Lots of good thinks in here.
danhon  neural_networks  creativity  generative 
october 2016 by peteashton
s3e27: It's Difficult
On tech companies saying some things are difficult when they do difficult things all the time and pride themselves on it.

"And lack of trying looks like lack of caring"
facebook  danhon  silicon-valley-groupthink 
september 2016 by mr_stru
s3e22: Things That Have Caught My Attention
Friday, June 17 2016, 10.3km up in the sky with a ground speed of 798 km/h, listening to Cigarettes After Sex's cover of Keep on Loving You. I'm somewhere in the air between Atlanta, Georgia, on the way home to Portland.
readinpocket  danhon 
june 2016 by patrick
s3e21: That Is Interesting, Isn't It?
Still going with the Contact quotes, although potentially getting a bit more obscure.
readinpocket  danhon 
june 2016 by patrick
I'm actually (shh) writing this on Monday 2 May 2016, which is the same day that I sent previous episode, s3e19 (I know, the previouslies and continuity on this are getting a bit out of hand).
readinpocket  danhon 
may 2016 by patrick
s3e19: The usual: nouns, adverbs, adjective here and there
Friday, 29th April, 2016. It's a grey, overcast day in Portland today, but the weekend promises sun and unseasonable temperatures, so, in the words of some internet meme "we've got that going for us".
readinpocket  danhon 
may 2016 by patrick
s3e18: You Want To Classify Prime Numbers Now?
Wednesday, April 27th 2016 at the XOXO Outpost in Portland.
readinpocket  danhon 
may 2016 by patrick
s3e17: Funny, I've always believed that the world is what we make of it
Tuesday 26th April, 2016 at the XOXO Outpost after having taken an hour out of the morning to go to depressingly middle-class-white-stereotypical Music Together (Hello, everybody!) class with wife and son where today's dance party was Prince's 1999.
readinpocket  danhon 
april 2016 by patrick
s3e15: First Rule In Government Spending
10:56am on Thursday, 21st April 2016 sat outside a cafe in shorts and a t-shirt because it's a pleasant 64 degrees fahrenheit / 18 degrees celsius and the Brit in me sees the sun and is completely freaking out. I am not melting, not like the last few days when it was above 80f/26c.
readinpocket  danhon 
april 2016 by patrick
s3e14: OK TO GO
1:12pm on Tuesday, April 19th 2016 at the XOXO Outpost in Portland on yet another unseasonably warm April day.
readinpocket  danhon 
april 2016 by patrick
s3e13: They Should Have Sent A Poet
1:37pm on Monday, 18 April 2016 at the XOXO Outpost in Portland again. It's 82 degrees fahrenheit, 27.7 degrees celsius outside, an unreasonably warm mid-April, which means breaking out the shorts, t-shirts and baseball caps all over again.
readinpocket  danhon 
april 2016 by patrick
s3e16: If It's Just Us... Seems Like An Awful Waste Of Space, Right?
1:33pm on Monday, April 25th, 2016 at the XOXO Outpost in Portland. Disclosure: I haven't seen Lemonade yet, which I should probably be watching instead of writing this. Lots of bits and bobs today.
readinpocket  danhon 
april 2016 by patrick
