
robertogreco : teachingmachines   3

The Stories We Were Told about Education Technology (2018)
"It’s been quite a year for education news, not that you’d know that by listening to much of the ed-tech industry (press). Subsidized by the Chan Zuckerberg Initiative, some publications have repeatedly run overtly and covertly sponsored articles that hawk the future of learning as “personalized,” as focused on “the whole child.” Some of these attempt to stretch a contemporary high-tech vision of social emotional surveillance so it can map onto a strange vision of progressive education, overlooking no doubt how the history of progressive education has so often been intertwined with race science and eugenics.

Meanwhile this year, immigrant, refugee children at the United States border were separated from their parents and kept in cages, deprived of legal counsel, deprived of access to education, deprived in some cases of water.

“Whole child” and cages – it’s hardly the only jarring juxtaposition I could point to.

2018 was another year of #MeToo, when revelations about sexual assault and sexual harassment shook almost every section of society – the media and the tech industries, unsurprisingly, but the education sector as well – higher ed, K–12, and non-profits alike, as well as school sports, all saw major and devastating reports about cultures and patterns of sexual violence. These behaviors were, once again, part of the hearings and debates about a Supreme Court Justice nominee – a sickening déjà vu not only for those of us who remember Anita Hill’s testimony decades ago but for those of us who have experienced something similar at the hands of powerful people. And on and on and on.

And yet the education/technology industry (press) kept up with its rosy repetition that social equality is surely its priority, a product feature even – that VR, for example, a technology it has for so long promised is “on the horizon,” is poised to help everyone, particularly teachers and students, become more empathetic. Meanwhile, the founder of Oculus Rift is now selling surveillance technology for a virtual border wall between the US and Mexico.

2018 was a year in which public school teachers all over the US rose up in protest over pay, working conditions, and funding, striking in red states like West Virginia, Kentucky, and Oklahoma despite an anti-union ruling by the Supreme Court.

And yet the education/technology industry (press) was wowed by teacher influencers and teacher PD on Instagram, touting the promise of more income via a side-hustle like tutoring rather than by structural or institutional agitation. Don’t worry, teachers. Robots won’t replace you, the press repeatedly said. Unsaid: robots will just de-professionalize, outsource, or privatize the work. Or, as the AI makers like to say, robots will make us all work harder (and no doubt, with no unions, cheaper).

2018 was a year of ongoing and increased hate speech and bullying – racism and anti-Semitism – on campuses and online.

And yet the education/technology industry (press) still maintained that blockchain would surely revolutionize the transcript and help ensure that no one lies about who they are or what they know. Blockchain would enhance “smart spending” and teach financial literacy, the ed-tech industry (press) insisted, never once mentioning the deep entanglements between anti-Semitism and the alt-right and blockchain (specifically Bitcoin) backers.

2018 was a year in which hate and misinformation, magnified and spread by technology giants, continued to plague the world. Their algorithmic recommendation engines peddled conspiracy theories (to kids, to teens, to adults). “YouTube, the Great Radicalizer” as sociologist Zeynep Tufekci put it in a NYT op-ed.

And yet the education/technology industry (press) still talked about YouTube as the future of education, cheerfully highlighting (that is, spreading) its viral bullshit. Folks still retyped the press releases Google issued and retyped the press releases Facebook issued, lauding these companies’ (and their founders’) efforts to reshape the curriculum and reshape the classroom.

This is the ninth year that I’ve reviewed the stories we’re being told about education technology. Typically, this has been a ten (or more) part series. But I just can’t do it any more. Some people think it’s hilarious that I’m ed-tech’s Cassandra, but it’s not funny at all. It’s depressing, and it’s painful. And no one fucking listens.

If I look back at what I’ve written in previous years, I feel like I’ve already covered everything I could say about 2018. Hell, I’ve already written about the whole notion of the “zombie idea” in ed-tech – that bad ideas never seem to go away, that they just get rebranded and repackaged. I’ve written about misinformation and ed-tech (and ed-tech as misinformation). I’ve written about the innovation gospel that makes people pitch dangerously bad ideas like “Uber for education” or “Alexa for babysitting.” I’ve written about the tech industry’s attempts to reshape the school system as its personal job training provider. I’ve written about the promise to “rethink the transcript” and to “revolutionize credentialing.” I’ve written about outsourcing and online education. I’ve written about coding bootcamps as the “new” for-profit higher ed, with all the exploitation that entails. I’ve written about the dangers of data collection and data analysis, about the loss of privacy and the lack of security.

And yet here we are, with Mark Zuckerberg – education philanthropist and investor – blinking before Congress, promising that AI will fix everything, while the biased algorithms keep churning out bias, while the education/technology industry (press) continues to be so blinded by “disruption” it doesn’t notice (or care) what’s happened to desegregation, and with so many data breaches and privacy gaffes that they barely make headlines anymore.

Folks. I’m done.

I’m also writing a book, and frankly that’s where my time and energy is going.

There is some delicious irony, I suppose, in the fact that there isn’t much that’s interesting or “innovative” to talk about in ed-tech, particularly since industry folks want to sell us on the story that tech is moving faster than it’s ever moved before, so fast in fact that the ol’ factory model school system simply cannot keep up.

I’ve always considered these year-in-review articles to be mini-histories of sorts – history of the very, very recent past. Now, instead, I plan to spend my time taking a longer, deeper look at the history of education technology, with particular attention for the next few months, as the title of my book suggests, to teaching machines – to the promises that machines will augment, automate, standardize, and individualize instruction. My focus is on the teaching machines of the mid-twentieth century, but clearly there are echoes – echoes of behaviorism and personalization, namely – still today.

In his 1954 book La Technique (published in English a decade later as The Technological Society), the sociologist Jacques Ellul observes how education had become oriented towards creating technicians, less interested in intellectual development than in personality development – a new “psychopedagogy” that he links to Maria Montessori. “The human brain must be made to conform to the much more advanced brain of the machine,” Ellul writes. “And education will no longer be an unpredictable and exciting adventure in human enlightenment, but an exercise in conformity and apprenticeship to whatever gadgetry is useful in a technical world.” I believe today we call this “social emotional learning” and once again (and so insistently by the ed-tech press and its billionaire backers), Montessori’s name is invoked as the key to preparing students for their place in the technological society.

Despite scant evidence in support of the psychopedagogies of mindsets, mindfulness, wellness, and grit, the ed-tech industry (press) markets these as solutions to racial and gender inequality (among other things), as the psychotechnologies of personalization are now increasingly intertwined not just with surveillance and with behavioral data analytics, but with genomics as well. “Why Progressives Should Embrace the Genetics of Education,” a NYT op-ed piece argued in July, perhaps forgetting that education’s progressives (including Montessori) have been down this path before.

This is the only good grit:

[image of Gritty]

If I were writing a lengthier series on the year in ed-tech, I’d spend much more time talking about the promises made about personalization and social emotional learning. I’ll just note here that the most important “innovator” in this area this year (other than Gritty) was surely the e-cigarette maker Juul, which offered a mindfulness curriculum to schools – offered them the curriculum and $20,000, that is – to talk about vaping. “‘The message: Our thoughts are powerful and can set action in motion,’ the lesson plan states.”

The most important event in ed-tech this year might have occurred on February 14, when a gunman opened fire on his former classmates at Marjory Stoneman Douglas High School in Parkland, Florida, killing 17 students and staff and injuring 17 others. (I chose this particular school shooting because of the student activism it unleashed.)

Oh, I know, I know – school shootings and school security aren’t ed-tech, ed-tech evangelists have long tried to insist, an argument I’ve heard far too often. But this year – the worst year on record for school shootings (according to some calculations) – I think that argument started to shift a bit. Perhaps because there’s clearly a lot of money to be made in selling schools “security” products and services: shooting simulation software, facial recognition technology, metal detectors, cameras, social media surveillance software, panic buttons, clear backpacks, bulletproof backpacks, … [more]
audreywatters  education  technology  edtech  2018  surveillance  privacy  personalization  progressive  schools  quantification  gamification  wholechild  montessori  mariamontessori  eugenics  psychology  siliconvalley  history  venturecapital  highereducation  highered  guns  gunviolence  children  youth  teens  shootings  money  influence  policy  politics  society  economics  capitalism  mindfulness  juul  marketing  gritty  innovation  genetics  psychotechnologies  gender  race  racism  sexism  research  socialemotional  psychopedagogy  pedagogy  teaching  howweteach  learning  howwelearn  teachingmachines  nonprofits  nonprofit  media  journalism  access  donaldtrump  bias  algorithms  facebook  amazon  disruption  data  bigdata  security  jacquesellul  sociology  activism  sel  socialemotionallearning 
december 2018 by robertogreco
The Future of Education: Programmed or Programmable
"See I don’t want to overreach here and make an argument that the Web is some sort of technological or ed-tech utopia. Despite all the talk about “leveling the playing field” and disrupting old, powerful institutions, the Web replicates many pre-existing inequalities; it exacerbates others; it creates new ones. I think we have to work much harder to make the Web live up to the rhetoric of freedom and equality. That’s a political effort, not simply a technological one.

Let me repeat that, because it has pretty significant implications for ed-tech, which is so often developed and implemented at the whims of political decisions — decisions made by politicians, administrators, decisions influenced by budgets, vendor pitches, and the latest Thomas Friedman New York Times op-ed. Decisions like ending Pell Grants for prisoners, for example.

To transform education and education technology to make it “future-facing” means we do have to address what exactly we think education should look like now and in the future. Do we want programmed instruction? Do we want teaching machines? Do we want videotaped lectures? Do we want content delivery systems? Or do we want education that is more student-centered, more network-focused? Are we ready to move beyond “content” and even beyond “competencies”? Can we address the ed-tech practices that look more and more like carceral education — surveillance, predictive policing, control?"

See, these are political questions and they are philosophical questions. I don’t think it’s quite as simple as a choice between programmed instruction or the programmable web. And instead of acting as though ed-tech is free of ideology, we need to recognize that it is very much enmeshed in it.
audreywatters  2014  content  contentdelivery  edtech  technology  future  education  adomainofone'sown  politics  policy  democracy  surveillance  ideology  edreform  bfskinner  inequality  freedom  equality  teachingmachines 
november 2014 by robertogreco
Ed-Tech's Monsters #ALTC
[video here: ]

"No doubt, we have witnessed in the last few years an explosion in the ed-tech industry and a growing, a renewed interest in ed-tech. Those here at ALT-C know that ed-tech is not new by any means; but there is this sense from many of its newest proponents (particularly in the States) that ed-tech has no history; there is only now and the future.

Ed-tech now, particularly that which is intertwined with venture capital, is boosted by powerful forms of storytelling: a disruptive innovation mythology, entrepreneurs' hagiography, design fiction, fantasy.

A fantasy that wants to extend its reach into the material world.

Society has been handed a map, if you will, by the technology industry in which we are shown how these brave ed-tech explorers have and will conquer and carve up virtual and physical space.


We are warned of the dragons in dangerous places, the unexplored places, the over explored places, the stagnant, the lands of outmoded ideas — all the places where we should no longer venture. 

Hic Sunt Dracones. There be dragons.

Instead, I’d argue, we need to face our dragons. We need to face our monsters. We need to face the giants. They aren’t simply on the margins; they are, in many ways, central to the narrative."

"I’m in the middle of writing a book called Teaching Machines, a cultural history of the science and politics of ed-tech. An anthropology of ed-tech even, a book that looks at knowledge and power and practices, learning and politics and pedagogy. My book explores the push for efficiency and automation in education: “intelligent tutoring systems,” “artificially intelligent textbooks,” “robo-graders,” and “robo-readers.”

This involves, of course, a nod to “the father of computer science” Alan Turing, who worked at Bletchley Park of course, and his profoundly significant question “Can a machine think?”

I want to ask in turn, “Can a machine teach?”

Then too: What will happen to humans when (if) machines do “think"? What will happen to humans when (if) machines “teach”? What will happen to labor and what happens to learning?

And, what exactly do we mean by those verbs, “think” and “teach”? When we see signs of thinking or teaching in machines, what does that really signal? Is it that our machines are becoming more “intelligent,” more human? Or is it that humans are becoming more mechanical?

Rather than speculate about the future, I want to talk a bit about the past."

"To oppose technology or to fear automation, some like The Economist or venture capitalist Marc Andreessen argue, is to misunderstand how the economy works. (I’d suggest perhaps Luddites understand how the economy works quite well, thank you very much, particularly when it comes to questions of “who owns the machinery” we now must work on. And yes, the economy works well for Marc Andreessen, that’s for sure.)"

"But even without machines, Frankenstein is still read as a cautionary tale about science and about technology; and Shelley’s story has left an indelible impression on us. Its references are scattered throughout popular culture and popular discourse. We frequently use part of the title — “Franken” — to invoke a frightening image of scientific experimentation gone wrong. Frankenfood. Frankenfish. The monster, a monstrosity — a technological crime against nature.

It is telling, very telling, that we often confuse the scientist, Victor Frankenstein, with his creation. We often call the monster Frankenstein.

As the sociologist Bruno Latour has argued, we don’t merely mistake the identity of Frankenstein; we also mistake his crime. It “was not that he invented a creature through some combination of hubris and high technology,” writes Latour, “but rather that he abandoned the creature to itself.”

The creature — again, a giant — insists in the novel that he was not born a monster, but he became monstrous after Frankenstein fled the laboratory in horror when the creature opened his “dull yellow eye,” breathed hard, and convulsed to life.

“Remember that I am thy creature,” he says when he confronts Frankenstein, “I ought to be thy Adam; but I am rather the fallen angel, whom thou drivest from joy for no misdeed. Everywhere I see bliss, from which I alone am irrevocably excluded. I was benevolent and good — misery made me a fiend.”

As Latour observes, “Written at the dawn of the great technological revolutions that would define the 19th and 20th centuries, Frankenstein foresees that the gigantic sins that were to be committed would hide a much greater sin. It is not the case that we have failed to care for Creation, but that we have failed to care for our technological creations. We confuse the monster for its creator and blame our sins against Nature upon our creations. But our sin is not that we created technologies but that we failed to love and care for them. It is as if we decided that we were unable to follow through with the education of our children.”

Our “gigantic sin”: we failed to love and care for our technological creations. We must love and educate our children. We must love and care for our machines, lest they become monsters.

Indeed, Frankenstein is also a novel about education. The novel is structured as a series of narratives — Captain Walton’s story — a letter he sends to his sister as he explores the Arctic — which then tells Victor Frankenstein’s story, through which we hear the creature tell his own story, along with that of the De Lacey family and the arrival of Safie, “the lovely Arabian.” All of these are stories about education: some self-directed learning, some through formal schooling.

While typically Frankenstein is interpreted as a condemnation of science gone awry, the novel can also be read as a condemnation of education gone awry. The novel highlights the dangerous consequences of scientific knowledge, sure, but it also explores how knowledge — gained inadvertently, perhaps, gained surreptitiously, gained without guidance — might be disastrous. Victor Frankenstein, stumbling across the alchemists and then having their work dismissed outright by his father, stoking his curiosity. The creature, learning to speak by watching the De Lacey family, learning to read by watching Safie do the same, his finding and reading Volney's Ruins of Empires and Milton’s Paradise Lost."

"To be clear, my nod to the Luddites or to Frankenstein isn’t about rejecting technology; but it is about rejecting exploitation. It is about rejecting an uncritical and unexamined belief in progress. The problem isn’t that science gives us monsters, it's that we have pretended like it is truth and divorced from responsibility, from love, from politics, from care. The problem isn’t that science gives us monsters, it’s that it does not, despite its insistence, give us “the answer."

And that is the problem with ed-tech’s monsters. That is the problem with teaching machines.

In order to automate education, must we see knowledge in a certain way, as certain: atomistic, programmable, deliverable, hierarchical, fixed, measurable, non-negotiable? In order to automate that knowledge, what happens to care?"

"I’ll leave you with one final quotation, from Hannah Arendt who wrote,
"Education is the point at which we decide whether we love the world enough to assume responsibility for it and by the same token save it from that ruin which, except for renewal, except for the coming of the new and young, would be inevitable. And education, too, is where we decide whether we love our children enough not to expel them from our world and leave them to their own devices, nor to strike from their hands their chance of undertaking something new, something unforeseen by us, but to prepare them in advance for the task of renewing a common world.”

Our task, I believe, is to tell the stories and build the society that would place education technology in that same light: “renewing a common world.”

We in ed-tech must face the monsters we have created, I think. These are the monsters in the technologies of war and surveillance a la Bletchley Park. These are the monsters in the technologies of mass production and standardization. These are the monsters in the technologies of behavior modification a la BF Skinner.

These are the monsters ed-tech must face. And we must all consider what we need to do so that we do not create more of them."
audreywatters  edtech  technology  education  schools  data  monsters  dragons  frankenstein  luddites  luddism  neoluddism  alanturing  thomaspynchon  society  bfskinner  standardization  surveillance  massproduction  labor  hannaharendt  brunolatour  work  kevinkelly  technosolutionism  erikbrynjolfsson  lordbyron  maryshelley  ethics  hierarchy  children  responsibility  love  howwelearn  howweteach  teaching  learning  politics  policy  democracy  exploitation  hierarchies  progress  science  scientism  markets  aynrand  liberarianism  projectpigeon  teachingmachines  personalization  individualization  behavior  behaviorism  economics  capitalism  siliconvalley 
september 2014 by robertogreco
