
robertogreco : georginavoss   6

Uses This / Georgina Voss
"What I do - gestures expansively - is research-intensive projects (writing [essays, journalism], performance, installation, sculpture) about the politics of large-scale complex technological and industrial systems; and teaching about the same.

I'm co-founder and lead/director of two studios: Supra Systems Studio, based at the London College of Communication's Design School, University of the Arts London, where I'm a senior lecturer; and Strange Telemetry, in residence at Somerset House Studios. My PhD is in the anthropology of deviance, and industrial economics."



"Clue is the single best software tool I can think of, tying together my messy sense of time with the realities of my physical form; and was also the thing that made me realise that what I'd worried was an ongoing glandular fever relapse was actually pre-menstrual exhaustion. Thanks, Clue!"



"What would be your dream setup?

Universal healthcare and education, open borders, an alternative internet, better battery life. A gigantic warehouse big enough to do enormous work in; a huge city; also, a forest."
georginavoss  usesthis  thesetup  2018  education  healthcare  tools  software  hardware  anthropology  technology  deviance  bodies  time  body 
june 2018 by robertogreco
No one’s coming. It’s up to us. – Dan Hon – Medium
"Getting from here to there

This is all very well and good. But what can we do? And more precisely, what “we”? There’s increasing acceptance of the reality that the world we live in is intersectional and we all play different and simultaneous roles in our lives. The society of “we” includes technologists who have a chance of affecting the products and services, it includes customers and users, it includes residents and citizens.

I’ve made this case above, but I feel it’s important enough to make again: at a high level, I believe that we need to:

1. Clearly decide what kind of society we want; and then

2. Design and deliver the technologies that forever get us closer to achieving that desired society.

This work is hard and, arguably, will never be completed. It necessarily involves compromise. Attitudes, beliefs and notions of what's considered just change over time.

That said, the above are two high level goals, but what can people do right now? What can we do tactically?

What we can do now

I have two questions that I think can be helpful in guiding our present actions, in whatever capacity we might find ourselves.

For all of us: What would it look like, and how might our societies be different, if technology were better aligned to society’s interests?

At the most general level, we are all members of a society, embedded in existing governing structures. It certainly feels as though, in the recent past, those governing structures have come under increasing strain, and part of the blame is being laid at the feet of technology.

One of the most important things we can do collectively is to produce clarity and prioritization where we can. Only by being clearer and more intentional about the kind of society we want and accepting what that means, can our societies and their institutions provide guidance and leadership to technology.

These are questions that cannot and should not be left to technologists alone. Advances in technology mean that encryption is a societal issue. Content moderation and censorship are a societal issue. Ultimately, it should be for governments (of the people, by the people) to set expectations and standards at the societal level, not organizations accountable only to a board of directors and shareholders.

But to do this, our governing institutions will need to evolve and improve. It is now easier, and faster, for platforms to react to changing social mores. For example, platforms are responding to society's reaction to "AI-generated fake porn" faster than governing and enforcing institutions are.

Prioritizations may necessarily involve compromise, too: the world is not so simple, and we are not so lucky, that it can be easily and always divided into A or B, or good or not-good.

Some of my perspective in this area is reflective of the schism American politics is currently experiencing. In a very real way, America, my adoptive country of residence, is having to grapple with revisiting the idea of what America is for. The same is happening in my country of birth with the decision to leave the European Union.

These are fundamental issues. Technologists, as members of society, have a point of view on them. But in the way that post-enlightenment governing institutions were set up to protect against asymmetric distribution of power, technology leaders must recognize that their platforms are now an undeniable, powerful influence on society.

As a society, we must do the work to have a point of view. What does responsible technology look like?

For technologists: How can we be humane and advance the goals of our society?

As technologists, we can be excited about re-inventing approaches from first principles. We must resist that impulse here, because there are things that we can do now, that we can learn now, from other professions, industries and areas to apply to our own. For example:

* We are better and stronger when we are together than when we are apart. If you’re a technologist, consider this question: what are the pros and cons of unionizing? As the product of a linked network, consider the question: what is gained and who gains from preventing humans from linking up in this way?

* Just as we create design patterns that are best practices, there are also patterns that are undesirable from our society's point of view, known as dark patterns. We should familiarise ourselves with them, work to understand why and when they're used, and understand why their usage is contrary to the ideals of our society.

* We can do a better job of advocating for and doing research to better understand the problems we seek to solve, the context in which those problems exist and the impact of those problems. Only through disciplines like research can we discover in the design phase — instead of in production, when our work can affect millions — negative externalities or unintended consequences that we genuinely and unintentionally may have missed.

* We must compassionately accept the reality that our work has real effects, good and bad. We can wish that bad outcomes don’t happen, but bad outcomes will always happen because life is unpredictable. The question is what we do when bad things happen, and whether and how we take responsibility for those results. For example, Twitter’s leadership must make clear what behaviour it considers acceptable, and do the work to be clear and consistent without dodging the issue.

* In America especially, technologists must face the issue of free speech head-on without avoiding its necessary implications. I suggest that one of the problems culturally American technology companies (i.e., companies that seek to emulate American culture) face can be explained in software terms. To use agile user story terminology, the problem may be due to focusing on a specific requirement (“free speech”) rather than the full user story (“As a user, I need freedom of speech, so that I can pursue life, liberty and happiness”). Free speech is a means to an end, not an end, and accepting that free speech is a means involves the hard work of considering and taking a clear, understandable position as to what ends.

* We have been warned. Academics — in particular, sociologists, philosophers, historians, psychologists and anthropologists — have been warning of issues such as large-scale societal effects for years. Those warnings have, bluntly, been ignored. In the worst cases, those same academics have been accused of not helping to solve the problem. Moving on from the past, is there not something that we technologists can learn? My intuition is that post the 2016 American election, middle-class technologists are now afraid. We’re all in this together. Academics are reaching out, have been reaching out. We have nothing to lose but our own shame.

* Repeat to ourselves: some problems don’t have fully technological solutions. Some problems can’t just be solved by changing infrastructure. Who else might help with a problem? What other approaches might be needed as well?

There’s no one coming. It’s up to us.

My final point is this: no one will tell us or give us permission to do these things. There is no higher organizing power working to put systemic changes in place. There is no top-down way of nudging the arc of technology toward one better aligned with humanity.

It starts with all of us.

Afterword

I’ve been working on the bigger themes behind this talk since …, and an invitation to 2017’s Foo Camp was a good opportunity to try to clarify and improve my thinking so that it could fit into a five minute lightning talk. It also helped that Foo Camp has the kind of (small, hand-picked — again, for good and ill) influential audience who would be a good litmus test for the quality of my argument, and would be instrumental in taking on and spreading the ideas.

In the end, though, I nearly didn’t do this talk at all.

Around 6:15pm on Saturday night, just over an hour before the lightning talks were due to start, after the unconference’s sessions had finished and just before dinner, I burst into tears talking to a friend.

While I won’t break the societal convention of confidentiality that helps an event like Foo Camp be productive, I’ll share this: the world felt too broken.

Specifically, the world felt broken like this: I had the benefit of growing up as a middle-class educated individual (albeit, not white) who believed he could trust that institutions were a) capable and b) would do the right thing. I now live in a country where a) the capability of those institutions has consistently eroded over time, and b) those institutions are now being systematically dismantled, to add insult to injury.

In other words, I was left with the feeling that there’s nothing left but ourselves.

Do you want the poisonous lead removed from your water supply? Your best bet is to try to do it yourself.

Do you want a better school for your children? Your best bet is to start it.

Do you want a policing policy that genuinely rehabilitates rather than punishes? Your best bet is to…

And it’s just. Too. Much.

Over the course of the next few days, I managed to turn my outlook around.

The answer, of course, is that it is too much for one person.

But it isn’t too much for all of us."
danhon  technology  2018  2017  johnperrybarlow  ethics  society  calltoaction  politics  policy  purpose  economics  inequality  internet  web  online  computers  computing  future  design  debchachra  ingridburrington  fredscharmen  maciejceglowski  timcarmody  rachelcoldicutt  stacy-marieishmael  sarahjeong  alexismadrigal  ericmeyer  timmaughan  mimionuoha  jayowens  jayspringett  stacktivism  georginavoss  damienwilliams  rickwebb  sarawachter-boettcher  jamebridle  adamgreenfield  foocamp  timoreilly  kaitlyntiffany  fredturner  tomcarden  blainecook  warrenellis  danhill  cydharrell  jenpahljka  robinray  noraryan  mattwebb  mattjones  danachisnell  heathercamp  farrahbostic  negativeexternalities  collectivism  zeyneptufekci  maciejcegłowski 
february 2018 by robertogreco
Our Work Here Is Done: Visions of a Robot Economy [.pdf]
"The essays in this volume address a number of possibilities for how the proceeds of a robot revolution might be redistributed. Notably, Noah Smith's piece argues for a universal basic income for everyone, paid for from the proceeds of robot-enhanced productivity.

What is clear is that if automation necessitates a big shift in how we tax, it offers an opportunity to start taxing more sensible things. Economists have long argued for taxing land, carbon emissions and other bads, rather than taxing work. If there is less work about in the future, this may be the chance to make a change.

There is also the question of how we share out the rewards of a robot economy. We may not yet be ready for a universal basic income, since at least for the time being so many people's conception of (their own and others') value to society is bound up in work. But it is surely worth making policies to ensure that ownership of robots is widely dispersed. The simplest way to make sure everyone has a stake in robots is to encourage widespread pension ownership – so that people own shares in the companies that own the robots.

But if the riches of automation are really as abundant as some people think they are, we could go further, and learn a lesson from the few countries that have dealt well with natural resource riches, like Norway and Alaska, by establishing a national endowment to hold wealth on behalf of citizens. The proceeds of this could be used to pay an annual dividend to citizens (as in Alaska) or to invest in future productivity (as has been proposed in Norway)."
universalbasicincome  labor  robot  income  taxation  taxes  economics  2014  nesta  change  justice  future  competition  cooperation  ryanavent  noahsmith  francescoppola  alanwinfield  nickhawes  ertruitt  jonturney  izabellakaminska  georginavoss  machines  slavery  edwardskidelsky  frederickguy  tessreidy  steverandywaldman  machineage  power  wages  ubi 
june 2014 by robertogreco
Lighthouse: IMPROVING REALITY 2013 - FILMS
"HOW ARE ARTISTS, TECHNOLOGISTS & WRITERS SUBVERTING OUR NOTION OF REALITY?

Lighthouse's digital culture conference, Improving Reality, returned for a third year this September. Talks included tours through worlds that artists are growing rather than making, critical revelations of the systems and infrastructures that shape our world, and narratives of radical alternative futures.

We've collected together the videos of the day's talks, and invite you to join us in the discussion on Twitter and Facebook, or in any way you'd like. Visit the relevant session to watch the videos, and find out more about the themes, issues and ideas up for discussion.

In between sessions were a set of Tiny Talks, interventions from artists and designers involved in Brighton Digital Festival.

Session 1. Revealing Reality
http://lighthouse.org.uk/programme/improving-reality-2013-films-session-one

Social, political and technological infrastructures are the invisible “dark matter” which underlies contemporary life, influencing our environment and behaviour. This session explores how the spaces where we live, such as our cities, are being transformed by increasingly interlinked technological and architectural infrastructures. We will see how artists and designers are making these infrastructures visible, so that we may better understand and critique them.

Speakers: Timo Arnall, Keller Easterling and Frank Swain. Chair: Honor Harger.


Session 2. Re-imagining Reality
http://lighthouse.org.uk/programme/improving-reality-2013-films-session-two

Our increasingly technologised world, with its attendant infrastructures, is in a constant state of flux. This session explores how artists, designers and writers are imagining how our infrastructures may evolve. We will learn what writers might reveal about our infrastructures, using tools such as design fiction. We will go on tours through worlds that artists are growing, rather than making, using new materials like synthetic biology and nanotechnology. And we’ll see how artists are imagining new realities using techniques from futurism and foresight.

Speakers: Paul Graham Raven, Maja Kuzmanovic, Tobias Revell and Alexandra Daisy Ginsberg. Chair: Simon Ings.


Session 3. Reality Check
http://lighthouse.org.uk/programme/improving-reality-2013-films-session-three

The growing reach of technological infrastructures and engineered systems into our lives creates uneasy social and ethical challenges. The recent scandals relating to the NSA, the revelation of the PRISM surveillance programme, and the treatment of whistleblowers such as Edward Snowden and Bradley Manning have revealed how fundamentally intertwined our civil liberties are with our technological infrastructures. These systems can both enable, and threaten, both our privacy and our security. Ubiquitous networked infrastructures create radical new creative opportunities for a coming generation of makers and users, whilst also presenting us with major social dilemmas. In this session we will look at the social and ethical questions which will shape our technological infrastructures in the future. We will examine algorithmic infrastructures and power dynamics, and ask, "whose reality are we trying to improve?"

Speakers: Farida Vis, Georgina Voss, Paula Le Dieu, and Justin Pickard. Chair: Scott Smith."
timoarnall  kellereasterling  frankswain  honorharger  paulgrahamraven  majakuzmanovic  tobiasrevell  alexandradaisy-ginsberg  simonings  faridavis  georginavoss  paulaledieu  justinpickard  scottsmith  reality  art  systems  infrastructure  politics  technology  darkmatter  behavior  environment  architecture  2013  flux  change  nanotechnology  syntheticbiology  materials  futurism  ethics  surveillance  nsa  edwardsnowden  bradleymanning  civilliberties  security  privacy  algorithms  networks  ubiquitouscomputing  powerdynamics  towatch 
october 2013 by robertogreco
Algorithmic Rape Jokes in the Library of Babel | Quiet Babylon
"Jorge Luis Borges’ Library of Babel twisted through the logic of SEO and commerce."

"Part of what tips the algorithmic rape joke t-shirts over from very offensive to shockingly offensive is that they are ostensibly physical products. Intuitions are not yet tuned for spambot clothes sellers."

"Amazon isn’t a store, not really. Not in any sense that we can regularly think about stores. It’s a strange pulsing network of potential goods, global supply chains, and alien associative algorithms with the skin of a store stretched over it, so we don’t lose our minds."
algorithms  amazon  culture  internet  borges  timmaly  2013  jamesbridle  apologies  non-apologies  brianeno  generative  crapjects  georginavoss  rape  peteashton  software  taste  poortaste  deniability  secondlife  solidgoldbomb  t-shirts  keepcalmand  spam  objects  objectspam  quinnnorton  masscustomization  rapidprototyping  shapersubcultures  scale  libraryofbabel  thelibraryofbabel  tshirts 
march 2013 by robertogreco
