
robertogreco : thomasedison (7)

crap futures — constraint no. 2: legacies of the past
"We are locked into paths determined by decisions or choices made in previous eras, when the world was a much different place. For various reasons these legacies stubbornly persist through time, constraining future possibilities and blinkering us from alternative ways of thinking.

Here, sketched as usual on a napkin over coffee and toast, are some thoughts on legacies of the past that exercise power over our future.

Infrastructure. Take energy, for example. Tesla’s alternating current - rather than Edison’s direct current - became the dominant system, essentially because it allowed electricity generated at power stations to travel large distances. Tesla’s system has, for the most part, been adopted across the world - an enormous network of stations, cables, pylons, and transformers, with electrical power arriving in our homes through sockets in the wall. This pervasive system dictates or influences almost everything energy-related, and in highly complex ways: from the development of new energy generation methods (and figuring out how to feed that energy into the grid) to the design of any electrical product.

Another example is transportation. As Crap Futures has discovered, it is hard to get around this volcanic and vertiginous island without a car. There are no trains, it is too hilly to ride a bike, buses are slow and infrequent, and meanwhile over the past few decades the regional government - one particular government with a 37-year reign - poured millions into building a complex network of roads and tunnels. People used to get to other parts of the island by boat; now (and for the foreseeable future) it is by private car. This is an example of recent infrastructure that a) perpetuated and was dictated by dominant ideas of how transportation infrastructure should be done, and b) will further constrain possibilities for the island into the future.

Laws and insurance. There is a problematic time-slip between the existence of laws and insurance and the real-life behaviour of humans. Laws and insurance are for the most part reactive: insurance policies, for example, are based on amassed data that informs the broker of risk levels, and this system therefore needs history to work. So when you try to insert a new product or concept - a self-driving car or delivery drone - into everyday life, the insurance system pushes back. Insurance companies don’t want to gamble on an unknown future; they want to look at the future through historical data, which is by nature a conservative lens.

Laws, insurance, and historical infrastructure often work together to curb radical change. This partly explains why many of the now technologically realisable dreams of the past, from jetpacks to flying cars, are unlikely to become an everyday reality in that imagined form - more likely they will adapt and conform to existing systems and rules.
"No great idea in its beginning can ever be within the law. How can it be within the law? The law is stationary. The law is fixed. The law is a chariot wheel which binds us all regardless of conditions or place or time." — Emma Goldman, Anarchism and Other Essays (1910)

It is true that laws sometimes outstay their welcome or impede progress. The slow pace at which laws change becomes more and more apparent as the pace of innovation increases. But there are positive as well as negative constraints, and laws often constrain us for good (which of course is their supposed function). At best, they check our impulses, give us a cooling off period, prevent us from tearing everything down at a whim.

So the law can be a force for good. But then of course - good, bad, or ineffectual - there are always those who find ways to circumvent the law. Jonathan Swift wrote: ‘Laws are like cobwebs, which may catch small flies, but let wasps and hornets break through.’ With their shock-and-awe tactics, companies like Uber manage to overcome traditional legal barriers by moving faster than local laws or simply being big enough to shrug off serious legal challenges.

Technology is evolutionary. (See Heilbroner’s quote in the future nudge post.) Comparisons between natural and technological evolution have been a regular phenomenon since as far back as Darwin’s On the Origin of Species (1859). Darwin’s revolutionary work inspired philosophers, writers, and anthropologists - Marx and Engels, Samuel Butler, Augustus Pitt-Rivers - to suggest that technological artefacts evolve in a manner similar to natural organisms. This essentially means that technological development is unidirectional, and that radical new possibilities do not happen.

Viewing technology in evolutionary terms would appear to constrain us to only the possibilities that we could reasonably ‘evolve’ into. But this does not have to be the case: natural evolution works by random mutation and natural selection with no ‘plan’ as such, whereas technological innovation and product design are firmly teleologic (literally ‘end-directed’). In other words, the evolutionary model of technological change ignores basic human agency. While natural organisms can’t dip into the historical gene pool to bring back previous mutations, however useful they might be, innovators and designers are not locked into an irreversible evolutionary march and can look backward whenever they choose. So why don’t they? It is a case - circling back to constraint no. 1 - of thinking under the influence of progress dogma."
2015  crapfutures  constraints  darwin  evolution  innovation  future  progress  progressdogma  transportation  infrastructure  law  legal  time  pace  engels  friedrichengels  technology  californianideology  emmagoldman  anarchism  insurance  policy  electricity  nikolatesla  thomasedison  systems  systemsthinking  jonathanswift  samuelbutler  karlmarx  longnow  bighere  augustuspitt-rivers 
january 2016 by robertogreco
Teaching Machines and Turing Machines: The History of the Future of Labor and Learning
"In all things, all tasks, all jobs, women are expected to perform affective labor – caring, listening, smiling, reassuring, comforting, supporting. This work is not valued; often it is unpaid. But affective labor has become a core part of the teaching profession – even though it is, no doubt, “inefficient.” It is what we expect – stereotypically, perhaps – teachers to do. (We can debate, I think, if it’s what we reward professors for doing. We can interrogate too whether all students receive care and support; some get “no excuses,” depending on race and class.)

What happens to affective teaching labor when it runs up against robots, against automation? Even the tasks that education technology purports to now be able to automate – teaching, testing, grading – are shot through with emotion when done by humans, or at least when done by a person who’s supposed to have a caring, supportive relationship with their students. Grading essays isn’t necessarily burdensome because it’s menial, for example; grading essays is burdensome because it is affective labor; it is emotionally and intellectually exhausting.

This is part of our conundrum: teaching labor is affective not simply intellectual. Affective labor is not valued. Intellectual labor is valued in research. At both the K12 and college level, teaching of content is often seen as menial, routine, and as such replaceable by machine. Intelligent machines will soon handle the task of cultivating human intellect, or so we’re told.

Of course, we should ask what happens when we remove care from education – this is a question about labor and learning. What happens to thinking and writing when robots grade students’ essays, for example? What happens when testing is standardized, automated? What happens when the whole educational process is offloaded to the machines – to “intelligent tutoring systems,” “adaptive learning systems,” or whatever the latest description may be? What sorts of signals are we sending students?

And what sorts of signals are the machines gathering in turn? What are they learning to do?
Often, of course, we do not know the answer to those last two questions, as the code and the algorithms in education technologies (most technologies, truth be told) are hidden from us. We are becoming, as law professor Frank Pasquale argues, a “black box society.” And the irony is hardly lost on me that one of the promises of massive collection of student data under the guise of education technology and learning analytics is to crack open the “black box” of the human brain.

We still know so little about how the brain works, and yet, we’ve adopted a number of metaphors from our understanding of that organ to explain how computers operate: memory, language, intelligence. Of course, our notion of intelligence – its measurability – has its own history, one wrapped up in eugenics and, of course, testing (and teaching) machines. Machines now both frame and are framed by this question of intelligence, with little reflection on the intellectual and ideological baggage that we carry forward and hard-code into them."



"We’re told by some automation proponents that instead of a future of work, we will find ourselves with a future of leisure. Once the robots replace us, we will have immense personal freedom, so they say – the freedom to pursue “unproductive” tasks, the freedom to do nothing at all even, except I imagine, to continue to buy things.
On one hand that means that we must address questions of unemployment. What will we do without work? How will we make ends meet? How will this affect identity, intellectual development?

Yet despite predictions about the end of work, we are all working more. As games theorist Ian Bogost and others have observed, we seem to be in a period of hyper-employment, where we find ourselves not only working numerous jobs, but working all the time on and for technology platforms. There is no escaping email, no escaping social media. Professionally, personally – no matter what you say in your Twitter bio that your Tweets do not represent the opinions of your employer – we are always working. Computers and AI do not (yet) mark the end of work. Indeed, they may mark the opposite: we are overworked by and for machines (for, to be clear, their corporate owners).

Often, we volunteer to do this work. We are not paid for our status updates on Twitter. We are not compensated for our check-ins on Foursquare. We don’t get kickbacks for leaving a review on Yelp. We don’t get royalties from our photos on Flickr.

We ask our students to do this volunteer labor too. They are not compensated for the data and content that they generate that is used in turn to feed the algorithms that run TurnItIn, Blackboard, Knewton, Pearson, Google, and the like. Free labor fuels our technologies: Forum moderation on Reddit – done by volunteers. Translation of the courses on Coursera and of the videos on Khan Academy – done by volunteers. The content on pretty much every “Web 2.0” platform – done by volunteers.

We are working all the time; we are working for free.

It’s being framed, as of late, as the “gig economy,” the “freelance economy,” the “sharing economy” – but mostly it’s the service economy that now comes with an app and that’s creeping into our personal not just professional lives thanks to billions of dollars in venture capital. Work is still precarious. It is low-prestige. It remains unpaid or underpaid. It is short-term. It is feminized.

We all do affective labor now, cultivating and caring for our networks. We respond to the machines, the latest version of ELIZA, typing and chatting away hoping that someone or something responds, that someone or something cares. It’s a performance of care, disguising what is the extraction of our personal data."



"Personalization. Automation. Management. The algorithms will be crafted, based on our data, ostensibly to suit us individually, more likely to suit power structures in turn that are increasingly opaque.

Programmatically, the world’s interfaces will be crafted for each of us, individually, alone. As such, I fear, we will lose our capacity to experience collectivity and resist together. I do not know what the future of unions looks like – pretty grim, I fear; but I do know that we must enhance collective action in order to resist a future of technological exploitation, dehumanization, and economic precarity. We must fight at the level of infrastructure – political infrastructure, social infrastructure, and yes technical infrastructure.

It isn’t simply that we need to resist “robots taking our jobs,” but we need to challenge the ideologies, the systems that loathe collectivity, care, and creativity, and that champion some sort of Randian individual. And I think the three strands at this event – networks, identity, and praxis – can and should be leveraged to precisely those ends.

A future of teaching humans not teaching machines depends on how we respond, how we design a critical ethos for ed-tech, one that recognizes, for example, the very gendered questions at the heart of the Turing Machine’s imagined capabilities, a parlor game that tricks us into believing that machines can actually love, learn, or care."
2015  audreywatters  education  technology  academia  labor  work  emotionallabor  affect  edtech  history  highered  highereducation  teaching  schools  automation  bfskinner  behaviorism  sexism  howweteach  alanturing  turingtest  frankpasquale  eliza  ai  artificialintelligence  robots  sharingeconomy  power  control  economics  exploitation  edwardthorndike  thomasedison  bobdylan  socialmedia  ianbogost  unemployment  employment  freelancing  gigeconomy  serviceeconomy  caring  care  love  loving  learning  praxis  identity  networks  privacy  algorithms  freedom  danagoldstein  adjuncts  unions  herbertsimon  kevinkelly  arthurcclarke  sebastianthrun  ellenlagemann  sidneypressey  matthewyglesias  karelčapek  productivity  efficiency  bots  chatbots  sherryturkle 
august 2015 by robertogreco
Thomas Edison and the Cult of Sleep Deprivation - The Atlantic
"Sleep loss is most common among older workers (ages 30 to 64), and among those who earn little and work multiple jobs. Still, about a quarter of people in the top income quintile report regularly being short on sleep, and sleep deprivation across all income groups has been rising over the years. A group of sleep researchers recently told the BBC that people are now getting one or two hours less shut-eye each night than they did 60 years ago, primarily because of the encroachment of work into downtime and the proliferation of blue-light emitting electronics.

"We are the supremely arrogant species; we feel we can abandon four billion years of evolution and ignore the fact that we have evolved under a light-dark cycle,” Oxford University Professor Russell Foster said. "And long-term, acting against the clock can lead to serious health problems."

These problems include well-documented correlations with heart disease, diabetes, obesity, and accidents. A March study published in the Journal of Neuroscience found that long-term sleep loss was associated with permanent brain damage in rats."
sleep  thomasedison  2014  health  insomnia 
october 2014 by robertogreco
Liz Danzico - Adding By Leaving Out: The Power of the Pause on Vimeo
"We tend to think of the pause as awkward. In speech, pauses connote uncomfortable silence, an issue at hand, and as communicators, we smooth over silence with fillers. We’re trained to deliver smooth speech, censoring “um” and “ah” out. As designers, as much as we value whitespace, we tend to fill it. This distaste for the pause — and the inverse seeking an always-on state — is a daily battle we face. We’re impatient with the pause, and as a result, we’re missing out on a great deal. What would happen if we become more comfortable with the pause? As it turns out, we can add by leaving out. From Edison to Underhill to web-based software, learn where the pause has power."

[Something very brief that I wrote about pause a few months before: http://robertogreco.tumblr.com/post/626105538/hustle-works-best-when-paired-with-pause-time ]
lizdanzico  pause  slow  slowness  design  webdesign  words  comments  collections  whitespace  impatience  patience  behavior  smoothness  wabi-sabi  fluency  speech  speaking  communication  understanding  thomasedison  toshare  classdieas  jonathansafranfoer  awkwardness  webdev 
december 2010 by robertogreco
The Louvre of the Industrial Age - O'Reilly Radar
"Under Marc's eye, we also saw the transformation of the machines from purely functional objects to things of beauty. We saw the advances in engineering - the materials, the workmanship, the design, over a hundred years of innovation. Visiting The Henry Ford, as they call it, is a truly humbling experience. I would never in a hundred years have thought of making a visit to Detroit just to visit this museum, but knowing what I know now, I will tell you confidently that it is as worth your while as a visit to Paris just to see the Louvre, to Rome for the Vatican Museum, to Florence for the Uffizi Gallery, to St. Petersburg for the Hermitage, or to Berlin for the Pergamon Museum. This is truly one of the world's great museums, and the world that it chronicles is our own."
henryford  henryfordmuseum  museums  timoreilly  industrialage  history  pilgrimages  detroit  tosee  thomasedison  lutherburbank 
july 2010 by robertogreco
