
robertogreco : bigdata   46

The Stories We Were Told about Education Technology (2018)
"It’s been quite a year for education news, not that you’d know that by listening to much of the ed-tech industry (press). Subsidized by the Chan Zuckerberg Initiative, some publications have repeatedly run overtly and covertly sponsored articles that hawk the future of learning as “personalized,” as focused on “the whole child.” Some of these attempt to stretch a contemporary high-tech vision of social emotional surveillance so it can map onto a strange vision of progressive education, overlooking no doubt how the history of progressive education has so often been intertwined with race science and eugenics.

Meanwhile this year, immigrant, refugee children at the United States border were separated from their parents and kept in cages, deprived of legal counsel, deprived of access to education, deprived in some cases of water.

“Whole child” and cages – it’s hardly the only jarring juxtaposition I could point to.

2018 was another year of #MeToo, when revelations about sexual assault and sexual harassment shook almost every section of society – the media and the tech industries, unsurprisingly, but the education sector as well – higher ed, K–12, and non-profits alike, as well as school sports, all saw major and devastating reports about cultures and patterns of sexual violence. These behaviors were, once again, part of the hearings and debates about a Supreme Court Justice nominee – a sickening deja vu not only for those of us who remember Anita Hill’s testimony decades ago but for those of us who have experienced something similar at the hands of powerful people. And on and on and on.

And yet the education/technology industry (press) kept up with its rosy repetition that social equality is surely its priority, a product feature even – that VR, for example, a technology it has for so long promised is “on the horizon,” is poised to help everyone, particularly teachers and students, become more empathetic. Meanwhile, the founder of Oculus Rift is now selling surveillance technology for a virtual border wall between the US and Mexico.

2018 was a year in which public school teachers all over the US rose up in protest over pay, working conditions, and funding, striking in red states like West Virginia, Kentucky, and Oklahoma despite an anti-union ruling by the Supreme Court.

And yet the education/technology industry (press) was wowed by teacher influencers and teacher PD on Instagram, touting the promise for more income via a side-hustle like tutoring rather than by structural or institutional agitation. Don’t worry, teachers. Robots won’t replace you, the press repeatedly said. Unsaid: robots will just de-professionalize, outsource, or privatize the work. Or, as the AI makers like to say, robots will make us all work harder (and no doubt, with no unions, cheaper).

2018 was a year of ongoing and increased hate speech and bullying – racism and anti-Semitism – on campuses and online.

And yet the education/technology industry (press) still maintained that blockchain would surely revolutionize the transcript and help ensure that no one lies about who they are or what they know. Blockchain would enhance “smart spending” and teach financial literacy, the ed-tech industry (press) insisted, never once mentioning the deep entanglements between anti-Semitism and the alt-right and blockchain (specifically Bitcoin) backers.

2018 was a year in which hate and misinformation, magnified and spread by technology giants, continued to plague the world. Their algorithmic recommendation engines peddled conspiracy theories (to kids, to teens, to adults). “YouTube, the Great Radicalizer” as sociologist Zeynep Tufekci put it in a NYT op-ed.

And yet the education/technology industry (press) still talked about YouTube as the future of education, cheerfully highlighting (that is, spreading) its viral bullshit. Folks still retyped the press releases Google issued and retyped the press releases Facebook issued, lauding these companies’ (and their founders’) efforts to reshape the curriculum and reshape the classroom.

This is the ninth year that I’ve reviewed the stories we’re being told about education technology. Typically, this has been a ten (or more) part series. But I just can’t do it any more. Some people think it’s hilarious that I’m ed-tech’s Cassandra, but it’s not funny at all. It’s depressing, and it’s painful. And no one fucking listens.

If I look back at what I’ve written in previous years, I feel like I’ve already covered everything I could say about 2018. Hell, I’ve already written about the whole notion of the “zombie idea” in ed-tech – that bad ideas never seem to go away, that they just get rebranded and repackaged. I’ve written about misinformation and ed-tech (and ed-tech as misinformation). I’ve written about the innovation gospel that makes people pitch dangerously bad ideas like “Uber for education” or “Alexa for babysitting.” I’ve written about the tech industry’s attempts to reshape the school system as its personal job training provider. I’ve written about the promise to “rethink the transcript” and to “revolutionize credentialing.” I’ve written about outsourcing and online education. I’ve written about coding bootcamps as the “new” for-profit higher ed, with all the exploitation that entails. I’ve written about the dangers of data collection and data analysis, about the loss of privacy and the lack of security.

And yet here we are, with Mark Zuckerberg – education philanthropist and investor – blinking before Congress, promising that AI will fix everything, while the biased algorithms keep churning out bias, while the education/technology industry (press) continues to be so blinded by “disruption” it doesn’t notice (or care) what’s happened to desegregation, and with so many data breaches and privacy gaffes that they barely make headlines anymore.

Folks. I’m done.

I’m also writing a book, and frankly that’s where my time and energy is going.

There is some delicious irony, I suppose, in the fact that there isn’t much that’s interesting or “innovative” to talk about in ed-tech, particularly since industry folks want to sell us on the story that tech is moving faster than it’s ever moved before, so fast in fact that the ol’ factory model school system simply cannot keep up.

I’ve always considered these year-in-review articles to be mini-histories of sorts – history of the very, very recent past. Now, instead, I plan to spend my time taking a longer, deeper look at the history of education technology, with particular attention for the next few months, as the title of my book suggests, to teaching machines – to the promises that machines will augment, automate, standardize, and individualize instruction. My focus is on the teaching machines of the mid-twentieth century, but clearly there are echoes – echoes of behaviorism and personalization, namely – still today.

In his 1954 book La Technique (published in English a decade later as The Technological Society), the sociologist Jacques Ellul observes how education had become oriented towards creating technicians, less interested in intellectual development than in personality development – a new “psychopedagogy” that he links to Maria Montessori. “The human brain must be made to conform to the much more advanced brain of the machine,” Ellul writes. “And education will no longer be an unpredictable and exciting adventure in human enlightenment, but an exercise in conformity and apprenticeship to whatever gadgetry is useful in a technical world.” I believe today we call this “social emotional learning,” and once again (and so insistently by the ed-tech press and its billionaire backers), Montessori’s name is invoked as the key to preparing students for their place in the technological society.

Despite scant evidence in support of the psychopedagogies of mindsets, mindfulness, wellness, and grit, the ed-tech industry (press) markets these as solutions to racial and gender inequality (among other things), as the psychotechnologies of personalization are now increasingly intertwined not just with surveillance and with behavioral data analytics, but with genomics as well. “Why Progressives Should Embrace the Genetics of Education,” a NYT op-ed piece argued in July, perhaps forgetting that education’s progressives (including Montessori) have been down this path before.

This is the only good grit:

[image of Gritty]

If I were writing a lengthier series on the year in ed-tech, I’d spend much more time talking about the promises made about personalization and social emotional learning. I’ll just note here that the most important “innovator” in this area this year (other than Gritty) was surely the e-cigarette maker Juul, which offered a mindfulness curriculum to schools – offered them the curriculum and $20,000, that is – to talk about vaping. “‘The message: Our thoughts are powerful and can set action in motion,’ the lesson plan states.”

The most important event in ed-tech this year might have occurred on February 14, when a gunman opened fire on his former classmates at Marjory Stoneman Douglas High School in Parkland, Florida, killing 17 students and staff and injuring 17 others. (I chose this particular school shooting because of the student activism it unleashed.)

Oh, I know, I know – school shootings and school security aren’t ed-tech, ed-tech evangelists have long tried to insist, an argument I’ve heard far too often. But this year – the worst year on record for school shootings (according to some calculations) – I think that argument started to shift a bit. Perhaps because there’s clearly a lot of money to be made in selling schools “security” products and services: shooting simulation software, facial recognition technology, metal detectors, cameras, social media surveillance software, panic buttons, clear backpacks, bulletproof backpacks, … [more]
audreywatters  education  technology  edtech  2018  surveillance  privacy  personalization  progressive  schools  quantification  gamification  wholechild  montessori  mariamontessori  eugenics  psychology  siliconvalley  history  venturecapital  highereducation  highered  guns  gunviolence  children  youth  teens  shootings  money  influence  policy  politics  society  economics  capitalism  mindfulness  juul  marketing  gritty  innovation  genetics  psychotechnologies  gender  race  racism  sexism  research  socialemotional  psychopedagogy  pedagogy  teaching  howweteach  learning  howwelearn  teachingmachines  nonprofits  nonprofit  media  journalism  access  donaldtrump  bias  algorithms  facebook  amazon  disruption  data  bigdata  security  jacquesellul  sociology  activism  sel  socialemotionallearning 
december 2018 by robertogreco
Objectivity as standardization in data-scientific education policy, technology and governance: Learning, Media and Technology: Vol 0, No 0
"New data-driven technologies appear to promise a new era of accuracy and objectivity in scientifically-informed educational policy and governance. The data-scientific objectivity sought by education policy, however, is the result of practices of standardization and quantification deployed to settle controversies about the definition and measurement of human qualities by rendering them as categories and numbers. Focusing on the emerging policy agenda of ‘social and emotional learning and skills,’ this paper examines the practices of ‘objectivity-making’ underpinning this new field. Objectivity-making depends on three translations of (1) scientific expertise into standardized and enumerable definitions, (2) standardization into measurement technologies, and (3) the data produced through measurement technologies into objective policy-relevant knowledge, which consolidates a market in SEL technologies. The paper sheds light on knowledge-making practices in the era of big data and policy science, and their enduring reliance on the precarious construction of objectivity as a key legitimator of policy-relevant scientific knowledge and ‘evidence-based’ education governance."
data  education  policy  objectivity  evidence  schools  schooling  scientism  benwilliamson  nellipiattoeva  technology  quantification  measurement  bigdata  edtech  standardization  standards 
december 2018 by robertogreco
James Bridle on New Dark Age: Technology and the End of the Future - YouTube
"As the world around us increases in technological complexity, our understanding of it diminishes. Underlying this trend is a single idea: the belief that our existence is understandable through computation, and more data is enough to help us build a better world.

In his brilliant new work, leading artist and writer James Bridle surveys the history of art, technology, and information systems, and reveals the dark clouds that gather over our dreams of the digital sublime."
quantification  computationalthinking  systems  modeling  bigdata  data  jamesbridle  2018  technology  software  systemsthinking  bias  ai  artificialintelligent  objectivity  inequality  equality  enlightenment  science  complexity  democracy  information  unschooling  deschooling  art  computation  computing  machinelearning  internet  email  web  online  colonialism  decolonization  infrastructure  power  imperialism  deportation  migration  chemtrails  folkliterature  storytelling  conspiracytheories  narrative  populism  politics  confusion  simplification  globalization  global  process  facts  problemsolving  violence  trust  authority  control  newdarkage  darkage  understanding  thinking  howwethink  collapse 
september 2018 by robertogreco
lalitha vasudevan on Twitter: "Overhearing tutoring session between adult tutor & suburban hs student. I despair at the extensive focus on relatability (between student & text) as strategy for responding to comprehension questions and essay writing, where
"Overhearing tutoring session between adult tutor & suburban hs student. I despair at the extensive focus on relatability (between student & text) as strategy for responding to comprehension questions and essay writing, wherein to relate is to have personally experienced.

1/

Being able to relate, in and of itself, isn't the cause of my despair. It's the over-reliance on experience to the exclusion of other ways of creating conditions for understanding that worries me. This bent away from the traps of "cultural literacy" began w/good intentions;

2/

but this response -- understandably, in resistance to the hyper-testing mania that overtook and still dominates much of the schooling landscape -- may err too far in the direction of allowing some young people to never have to stray too far from their own thoughts.

3/

I want to know what young people think, what they notice and see, how they navigate and experience the world. AND, I want their insights on what others notice, see, conclude, design, and decide; for that, too, concerns young people --

4/

not only in their immediate, local, kinship networks, but about how they perceive others' perceptions of the they things they have noticed, or not. They are civic beings, active in their citizenry, and to deny this and allow otherwise is educational malpractice.

5/

I want young people to be seen and engaged as real interlocutors, not discursive window dressing to be written into curricula and grant proposals as the "participatory" element. I don't just want to hear what they think; I want to think with them, toward new questions.

6/

So, I return to a familiar, frustrating thought: My, how standardization, answer-driven teaching, & the greedy pursuit of efficiency-driven uniformity has royally screwed over kids & schools.
And (some) big data efforts want to help do more of the same.

7/7
#smalldatabigmoments"
lalithavasudevan  education  standardizedtesting  standardization  experience  relatability  teaching  learning  schools  schooliness  kinship  perception  culturalliteracy  howweteach  howwelearn  comprehension  essays  writing  howwewrite  teachingreading  teachingwriting  noticing  civics  citizenship  democracy  democratic  malpractice  participatory  participation  unschooling  deschooling  pedagogy  uniformity  efficiency  bigdata  testing 
august 2018 by robertogreco
Podcast, Nick Seaver: “What Do People Do All Day?” - MIT Comparative Media Studies/Writing
"The algorithmic infrastructures of the internet are made by a weird cast of characters: rock stars, gurus, ninjas, wizards, alchemists, park rangers, gardeners, plumbers, and janitors can all be found sitting at computers in otherwise unremarkable offices, typing. These job titles, sometimes official, sometimes informal, are a striking feature of internet industries. They mark jobs as novel or hip, contrasting starkly with the sedentary screenwork of programming. But is that all they do? In this talk, drawing on several years of fieldwork with the developers of algorithmic music recommenders, Seaver describes how these terms help people make sense of new kinds of jobs and their positions within new infrastructures. They draw analogies that fit into existing prestige hierarchies (rockstars and janitors) or relationships to craft and technique (gardeners and alchemists). They aspire to particular imaginations of mastery (gurus and ninjas). Critics of big data have drawn attention to the importance of metaphors in framing public and commercial understandings of data, its biases and origins. The metaphorical borrowings of role terms serve a similar function, highlighting some features at the expense of others and shaping emerging professions in their image. If we want to make sense of new algorithmic industries, we’ll need to understand how they make sense of themselves.

Nick Seaver is assistant professor of anthropology at Tufts University. His current research examines the cultural life of algorithms for understanding and recommending music. He received a master’s from CMS in 2010 for research on the history of the player piano."

[direct link to audio: https://soundcloud.com/mit-cmsw/nick-seaver-what-do-people-do-all-day ]

[via: https://twitter.com/allank_o/status/961382666573561856 ]
nickseaver  2016  work  labor  algorithms  bigdata  music  productivity  automation  care  maintenance  programming  computing  hierarchy  economics  data  datascience 
february 2018 by robertogreco
Frontier notes on metaphors: the digital as landscape and playground - Long View on Education
"I am concerned with the broader class of metaphors that suggest the Internet is an inert and open place for us to roam. Scott McLeod often uses the metaphor of a ‘landscape’: “One of schools’ primary tasks is to help students master the dominant information landscape of their time.”

McLeod’s central metaphor – mastering the information landscape – fits into a larger historical narrative that depicts the Internet as a commons in the sense of “communally-held space, one which it is specifically inappropriate for any single individual or subset of the community (including governments) to own or control.” Adriane Lapointe continues, “The internet is compared to a landscape which can be used in various ways by a wide range of people for whatever purpose they please, so long as their actions do not interfere with the actions of others.”

I suspect that the landscape metaphor resonates with people because it captures how they feel the Internet should work. Sarah T. Roberts argues that we are tempted to imagine the digital as “valueless, politically neutral and as being without material consequences.” However, the digital information landscape is an artifact shaped by capitalism, the US military, and corporate power. It’s a landscape that actively tracks and targets us, buys and sells our information. And it’s mastered only by the corporations, CEOs and venture capitalists.

Be brave? I have no idea what it would mean to teach students how to ‘master’ the digital landscape. The idea of ‘mastering’ recalls the popular frontier and pioneer metaphors that have fallen out of fashion since the 1990s as the Internet became ubiquitous, as Jan Rune Holmevik notes. There is of course a longer history of the “frontiers of knowledge” metaphor going back to Francis Bacon and passing through Vannevar Bush, and thinking this way has become, according to Gregory Ulmer, “ubiquitous, a reflex, a habit of mind that shapes much of our thinking about inquiry” – and one that needs to be rethought if we take the postcolonial movement seriously.

While we might worry about being alert online, we aren’t exposed to enough stories about the physical and material implications of the digital. It’s far too easy to think that the online landscape exists only on our screens, never intersecting with the physical landscape in which we live. Yet, the Washington Post reports that in order to pave the way for new data centers, “the Prince William County neighborhood [in Virginia] of mostly elderly African American homeowners is being threatened by plans for a 38-acre computer data center that will be built nearby. The project requires the installation of 100-foot-high towers carrying 230,000-volt power lines through their land. The State Corporation Commission authorized Dominion Virginia Power in late June to seize land through eminent domain to make room for the towers.” In this case, the digital is transforming the physical landscape with hostile indifference to the people that live there.

Our students cannot be digitally literate citizens if they don’t know stories about the material implications about the digital. Cathy O’Neil has developed an apt metaphor for algorithms and data – Weapons of Math Destruction – which have the potential to destroy lives because they feed on systemic biases. In her book, O’Neil explains that while attorneys cannot cite the neighborhood people live in as a reason to deny prisoners parole, it is permissible to package that judgment into an algorithm that generates a prediction of recidivism."



"When I talk to students about the implications of their searches being tracked, I have no easy answers for them. How can youth use the net for empowerment when there’s always the possibility that their queries will count against them? Yes, we can use google to ask frank questions about our sexuality, diet, and body – or any of the other ways we worry about being ‘normal’ – but when we do so, we do not wander a non-invasive landscape. And there are few cues that we need to be alert or smart.

Our starting point should not be the guiding metaphors of the digital as a playground where we need to practice safety or a landscape that we can master, but Shoshana Zuboff’s analysis of surveillance capitalism: “The game is selling access to the real-time flow of your daily life –your reality—in order to directly influence and modify your behavior for profit. This is the gateway to a new universe of monetization opportunities: restaurants who want to be your destination. Service vendors who want to fix your brake pads. Shops who will lure you like the fabled Sirens.”



So what do we teach students? I think that Chris Gilliard provides the right pedagogical insight to end on:
Students are often surprised (and even angered) to learn the degree to which they are digitally redlined, surveilled, and profiled on the web and to find out that educational systems are looking to replicate many of those worst practices in the name of “efficiency,” “engagement,” or “improved outcomes.” Students don’t know any other web—or, for that matter, have any notion of a web that would be different from the one we have now. Many teachers have at least heard about a web that didn’t spy on users, a web that was (theoretically at least) about connecting not through platforms but through interfaces where individuals had a significant amount of choice in saying how the web looked and what was shared. A big part of the teaching that I do is to tell students: “It’s not supposed to be like this” or “It doesn’t have to be like this.”
"
banjamindoxtdator  2017  landscapes  playgrounds  georgelakoff  markjohnson  treborscolz  digitalcitizenship  internet  web  online  mckenziewark  privacy  security  labor  playbor  daphnedragona  gamification  uber  work  scottmcleod  adrianelapointe  sarahroberts  janruneholmevik  vannevabush  gregoryulmer  francisbacon  chrisgilliard  pedagogy  criticalthinking  shoshanazuboff  surveillance  surveillancecapitalism  safiyanoble  google  googleglass  cathyo'neil  algorithms  data  bigdata  redlining  postcolonialism  race  racism  criticaltheory  criticalpedagogy  bias 
july 2017 by robertogreco
BBC Radio 4 - FutureProofing, The Future of the Future
"Does the accelerating pace of technology change the way we think about the future?

It's said that science fiction writers now spend more time telling stories about today than about tomorrow, because the potential of existing technology to change our world is so rich that there is no need to imagine the future - it's already here. Does this mean the future is dead? Or that we are experiencing a profound shift in our understanding of what the future means to us, how it arrives, and what forces will shape it?

Presenters Timandra Harkness and Leo Johnson explore how our evolving understanding of time and the potential of technological change are transforming the way we think about the future."
future  2017  mattnovak  sciencefiction  scifi  timandraharkness  leojohnson  time  technology  learning  howwelive  change  1960s  1950s  alexanerrose  prediction  bigdata  stability  flexibility  adaptability  astroteller  googlex  longnow  longnowfoundation  uncertainty  notknowing  simulation  generativedesign  dubai  museumofthefuture  agency  lawrenceorsini  implants  douglascoupland  belllabs  infrastructure  extremepresent  sfsh  classideas  present  past  history  connectivity  internet  web  online  futurism  futures  smartphones  tv  television  refrigeration  seancarroll 
may 2017 by robertogreco
Trumped Up Data
"I’ve started working on my annual review of the year in ed-tech, something I’ve done for the past six years. It’s an intensive project – I will write some 75,000 words between now and the end of December – that forces me to go back through all the events and announcements of the previous twelve months. I don’t do so to make predictions about the future. But rather I look for patterns so that I can better understand how the past might orient us towards certain futures. I listen closely to the stories that we have told ourselves about education and technology, about the various possible futures in which these two systems (these two sets of practices, these two sets of ideologies) are so deeply intertwined. I pay attention to who tells the stories, who shares the stories, who believes the stories. In thinking about the past, I am always thinking about the future; in thinking about the future, we are always talking about the past.

That’s what’s at the core of a slogan like “Make America Great Again,” of course. It invokes a nostalgic longing for a largely invented past as it gestures towards a future that promises “greatness” once again.

Last week – and it feels so long right now – I gave a talk titled “The Best Way to Predict the Future is to Issue a Press Release.” I argued there’s something frighteningly insidious about the ways in which predictions about the future of education and technology are formulated and spread. These predictions are predicated on a destabilization or disruption of our public institutions and an entrenchment of commodification and capitalism.

These predictions don’t have to be believable or right; indeed, they rarely are. But even when wrong, they push the future in a certain direction. And they reveal the shape that the storytellers want the future to take.

In my talk, I called these predictions a form of “truthiness.” I’d add to that, an observation that sociologist Nathan Jurgenson made last night about “factiness”:
On the right, they have what Stephen Colbert called “truthiness,” which we might define as ignoring facts in the name of some larger truth. The facts of Obama’s birthplace mattered less for them than their own racist “truth” of white superiority. Perhaps we need to start articulating a left-wing version of truthiness: let’s call it “factiness.” Factiness is the taste for the feel and aesthetic of “facts,” often at the expense of missing the truth. From silly self-help-y TED talks to bad NPR-style neuroscience science updates to wrapping ourselves in the misleading scientism of Fivethirtyeight statistics, factiness is obsessing over and covering ourselves in fact after fact while still missing bigger truths.

“Factiness” connects to a lot of what we saw in this election, to be sure – this faith, as Jurgenson points out, in polling despite polling being wrong repeatedly, all along. It connects to a lot of what we hear in technology circles too – that we can build intelligent systems that model and adapt and learn and predict complex human behaviors. And that, in turn, is connected to education’s long-standing obsession with data: that we can harness elaborate analytics and measurement tools to identify who’s learning and who’s not.

I don’t believe that answers are found in “data” (that is, in “data” as this pure objective essence of “fact” or “truth”). Rather, I believe answers – muddier and more mutable and not really answers at all – live in stories. It is, after all, in stories where we find what underpins and extends both “truthiness” and “factiness.” Stories are crafted and carried in different ways, no doubt, than “data,” even when they serve the same impulse – to control, to direct.

Stories are everywhere, and yet stories can be incredibly easy to dismiss.
We do not listen.

Sometimes I joke that I’ve been described as “ed-tech’s Cassandra.” Mostly, it’s unfunny – not much of a joke at all considering how things worked out for poor Cassandra. But I do listen closely to the stories being told about the future of education and technology, and all I can do is to caution people that these stories rely on some fairly dystopian motifs and outcomes.

I’m also a folklorist, an ethnographer. I approach education technology with that disciplinary training. I listen to the stories. I observe the practices. I talk to people.

I’m not sure how to move forward after last night’s election results. For now, all I have is this: I want to remind people of the importance of stories – that stories might be better to turn to for understanding the future people want, better than the data we’ve been so obsessed with watching as a proxy for actually talking or listening to them."
audreywatters  2016  data  elections  edtech  truthiness  factiness  listening  nathanjurgenson  ethnography  folklore  storytelling  stories  bigdata  predictions  understanding  truth  stephencolbert 
november 2016 by robertogreco
Critical Algorithm Studies: a Reading List | Social Media Collective
"This list is an attempt to collect and categorize a growing critical literature on algorithms as social concerns. The work included spans sociology, anthropology, science and technology studies, geography, communication, media studies, and legal studies, among others. Our interest in assembling this list was to catalog the emergence of “algorithms” as objects of interest for disciplines beyond mathematics, computer science, and software engineering.

As a result, our list does not contain much writing by computer scientists, nor does it cover potentially relevant work on topics such as quantification, rationalization, automation, software more generally, or big data, although these interests are well-represented in the reference sections of the essays themselves.

This area is growing in size and popularity so quickly that many contributions are popping up without reference to work from disciplinary neighbors. One goal for this list is to help nascent scholars of algorithms to identify broader conversations across disciplines and to avoid reinventing the wheel or falling into analytic traps that other scholars have already identified. We also thought it would be useful, especially for those teaching these materials, to try to loosely categorize it. The organization of the list is meant merely as a first-pass, provisional sense-making effort. Within categories the entries are offered in chronological order, to help make sense of these rapid developments.

In light of all of those limitations, we encourage you to see it as an unfinished document, and we welcome comments. These could be recommendations of other work to include, suggestions on how to reclassify a particular entry, or ideas for reorganizing the categories themselves. Please use the comment space at the bottom of the page to offer suggestions and criticism; we will try to update the list in light of these suggestions.

Tarleton Gillespie and Nick Seaver"
algorithms  bibliography  ethics  bigdata  tarletongillespie  nickseaver  2016  sociology  anthropology  science  technology  criticalalgorithmstudies  via:tealtan 
june 2016 by robertogreco
Data USA
"In 2014, Deloitte, Datawheel, and Cesar Hidalgo, Professor at the MIT Media Lab and Director of MacroConnections, came together to embark on an ambitious journey – to understand and visualize the critical issues facing the United States in areas like jobs, skills and education across industry and geography. And, to use this knowledge to inform decision making among executives, policymakers and citizens.

Our team, comprised of economists, data scientists, designers, researchers and business executives, worked for over a year with input from policymakers, government officials and everyday citizens to develop Data USA, the most comprehensive website and visualization engine of public US Government data. Data USA tells millions of stories about America. Through advanced data analytics and visualization, it tells stories about: places in America—towns, cities and states; occupations, from teachers to welders to web developers; industries – where they are thriving, where they are declining and their interconnectedness to each other; and education and skills, from where is the best place to live if you’re a computer science major to the key skills needed to be an accountant.

Data USA puts public US Government data in your hands. Instead of searching through multiple data sources that are often incomplete and difficult to access, you can simply point to Data USA to answer your questions. Data USA provides an open, easy-to-use platform that turns data into knowledge. It allows millions of people to conduct their own analyses and create their own stories about America – its people, places, industries, skill sets and educational institutions. Ultimately, accelerating society’s ability to learn and better understand itself.

How can Data USA be useful? If you are an executive, it can help you better understand your customers and talent pool. It can inform decisions on where to open or relocate your business or plant. You may also want to build on the Data USA platform using the API and integrate additional data. If you are a recent college graduate, Data USA can help you find locations with the greatest opportunities for the job you want and the major you have. If you are a policymaker, Data USA can be a powerful input to economic and workforce development programs. Or, you may be a public health professional and want to dive into behavioral disease patterns across the country. These are just a few examples of how an open data platform like Data USA can benefit everyday citizens, business and government.

About Deloitte
Deloitte refers to one or more of Deloitte Touche Tohmatsu Limited, a UK private company limited by guarantee (“DTTL”), its network of member firms, and their related entities. DTTL and each of its member firms are legally separate and independent entities. DTTL (also referred to as “Deloitte Global”) does not provide services to clients. Please see www.deloitte.com/about for a detailed description of DTTL and its member firms. Please see www.deloitte.com/us/about for a detailed description of the legal structure of Deloitte LLP and its subsidiaries. Certain services may not be available to attest clients under the rules and regulations of public accounting.

About Macro Connections
The Macro Connections group focuses on the development of analytical tools that can help improve our understanding of the world's macro structures in all of their complexity. By developing methods to analyze and represent networks—such as the networks connecting countries to the products they export, or historical characters to their peers—Macro Connections research aims to help improve our understanding of the world by putting together the pieces that our scientific disciplines have helped to pull apart. Click here to learn more.

About Datawheel
Datawheel is a small but mighty crew of programmers and designers with a passion for crafting data into predictive, decision-making, and storytelling tools. Every visualization platform they build is a tailored solution that marries the needs of users and the data supporting it. Click here to learn more.

About the Visualizations
The visualizations in Data USA are powered by D3plus, an open-source visualization engine that was created by members of the Datawheel team."
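The API mentioned in the excerpt can be exercised with a short query-builder sketch. The endpoint below follows Data USA's public API, but treat the exact parameter names ("drilldowns", "measures") as assumptions to verify against the current documentation.

```python
# Illustrative sketch of querying an open-data platform like Data USA.
# The endpoint follows Data USA's public API (https://datausa.io/api/data);
# parameter names are assumptions to check against the live docs.
from urllib.parse import urlencode

BASE = "https://datausa.io/api/data"

def build_query(drilldowns, measures):
    """Build a query URL, e.g. national population totals by year."""
    return BASE + "?" + urlencode({"drilldowns": drilldowns, "measures": measures})
```

For example, `build_query("Nation", "Population")` yields a URL that, when fetched, should return national population figures as JSON.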
us  data  visualization  via:shannon_mattern  analytics  opendata  bigdata  datausa 
april 2016 by robertogreco
'I Love My Label': Resisting the Pre-Packaged Sound in Ed-Tech
"I’ve argued elsewhere, drawing on a phrase by cyborg anthropologist Amber Case, that many of the industry-provided educational technologies we use create and reinforce a “templated self,” restricting the ways in which we present ourselves and perform our identities through their very technical architecture. The learning management system is a fine example of this, particularly with its “permissions” that shape who gets to participate and how, who gets to create, review, assess data and content. Algorithmic profiling now will be layered on top of these templated selves in ed-tech – the results, again: the pre-packaged student.

Indie ed-tech, much like the indie music from which it takes its inspiration, seeks to offer an alternative to the algorithms, the labels, the templates, the profiling, the extraction, the exploitation, the control. It’s a big task – an idealistic one, no doubt. But as the book Our Band Could Be Your Life, which chronicles the American indie music scene of the 1980s (and upon which Jim Groom drew for his talk on indie-ed tech last fall), notes, “Black Flag was among the first bands to suggest that if you didn’t like ‘the system,’ you should simply create one of your own.” If we don’t like ‘the system’ of ed-tech, we should create one of our own.

It’s actually not beyond our reach to do so.

We’re already working in pockets doing just that, with various projects to claim and reclaim and wire and rewire the Web so that it’s more just, more open, less exploitative, and counterintuitively perhaps less “personalized.” “The internet is shit today,” Pirate Bay founder Peter Sunde said last year. “It’s broken. It was probably always broken, but it’s worse than ever.” We can certainly say the same for education technology, with its long history of control, measurement, standardization.

We aren’t going to make it better by becoming corporate rockstars. This fundamental brokenness means we can’t really trust those who call for a “Napster moment” for education or those who hail the coming Internet/industrial revolution for schools. Indie means we don’t need millions of dollars, but it does mean we need community. We need a space to be unpredictable, for knowledge to be emergent not algorithmically fed to us. We need intellectual curiosity and serendipity – we need it from scholars and from students. We don’t need intellectual discovery to be trademarked, reduced to a tab that we click on to be fed the latest industry updates, what the powerful, well-funded people think we should know or think we should become."
2016  audreywatters  edupunk  edtech  independent  indie  internet  online  technology  napster  history  serendipity  messiness  curiosity  control  measurement  standardization  walledgardens  privacy  data  schools  education  highered  highereducation  musicindustry  jimgroom  ambercase  algorithms  bigdata  prediction  machinelearning  machinelistening  echonest  siliconvalley  software 
march 2016 by robertogreco
Personal and Personalized Learning ~ Stephen Downes
"We hear the phrase ‘personalized learning’ a lot these days, so much so that it has begun to lose its meaning. Wikipedia tells us that it is the “tailoring of pedagogy, curriculum and learning environments by learners or for learners in order to meet their different learning needs and aspirations.” i

Even this short definition provides us with several dimensions across which personalization may be defined. Each of these has been the subject of considerable debate in the field:
• Pedagogy – do we need to differentiate instruction according to student variables or ‘learning styles’, or is this all a big myth?
• Curriculum – should students study the same subjects in the same order, beginning with ‘foundational’ subjects such as reading or mathematics, or can we vary this order for different students?
• Learning environments – should students work in groups in a collaborative classroom, or can they learn on their own at home or with a computer?

In personalized learning today, the idea is to enable technology to make many of these decisions for us. For example, adaptive learning entails the presentation of different course content based on a student’s prior experience or performance in learning tasks.

What these approaches have in common, though, is that in all cases learning is something that is provided to the learner by some educational system, whether it be a school and a teacher, or a computer and adaptive learning software. And these providers work from a standard model of what should be provided and how it should be provided, and adapt and adjust it according to a set of criteria. These criteria are determined by measuring some aspect of the student’s performance.

This is why we read a lot today about ‘learning analytics’ and ‘big data’. The intent behind such systems is to use the data collected from a large number of students working in similar learning environments toward similar learning outcomes in order to make better recommendations to future students. The ‘optimized learning path’ for any given learner is found by analyzing the most successful path followed by the most similar students.

It’s an open question whether we improve learning employing such methods. Presumably, using trial and error, and employing a wide variety of pedagogical, curricular and environmental variables, we could come upon some statistically significant results. But the question is whether we should apply these methods, for two reasons.

First, individual variability outweighs statistical significance. We see this in medicine. While, statistically, a certain treatment might make the most sense, no doctor would prescribe such a treatment without first assessing the individual and making sure that the generalization actually applies, because in many cases it doesn’t, and the doctor is sworn to ‘do no harm’.

Second, and perhaps more importantly, it shouldn’t be up to the education system to determine what a person learns, how they learn it, and where. Many factors go into such decisions: individual preferences, social and parental expectations, availability of resources, or employability and future prospects. The best educational outcome isn’t necessarily the best outcome.

For these reasons, it may be preferable to embrace an alternative to personalized learning, which might be called personal learning. In the case of personal learning, the role of the educational system is not to provide learning, it is to support learning. Meanwhile, the decisions about what to learn, how to learn, and where to learn are made outside the educational system, and principally, by the individual learners themselves.

Personal learning often begins informally, on an ad hoc basis, driven by the need to complete some task or achieve some objective. The learning is a means to an end, rather than the end in itself. Curricula and pedagogy are selected pragmatically. If the need is short term and urgent, a simple learning resource may be provided. If the person wants to understand at a deep level, then a course might be the best option.

Personalized learning is like being served at a restaurant. Someone else selects the food and prepares it. There is some customization – you can tell the waiter how you want your meat cooked – but essentially everyone at the restaurant gets the same experience.

Personal learning is like shopping at a grocery store. You need to assemble the ingredients yourself and create your own meals. It’s harder, but it’s a lot cheaper, and you can have an endless variety of meals. Sure, you might not get the best meals possible, but you control the experience, and you control the outcome.

When educators and policy-makers talk about personalized learning, they frequently focus on the quality of the result. But this is like saying everybody should eat at restaurants in order to be sure they always get the healthiest meal possible. It may seem like the best option, but even the best restaurant can’t cater to the wide range of different tastes and nutritional needs, and no restaurant will help the person learn to cook for themselves.

Ultimately, if people are to become effective learners, they need to be able to learn on their own. They need to be able to find the resources they need, assemble their own curriculum, and forge their own learning path. They will not be able to rely on education providers, because their needs are too many and too varied. "
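The "optimized learning path" logic Downes describes, recommending the most successful path followed by the most similar past students, can be sketched as a nearest-neighbour lookup. The feature vectors and cosine similarity below are illustrative assumptions, not any vendor's actual algorithm.

```python
# Toy sketch of the "optimized learning path": recommend the path taken by
# the most similar successful past student. Features and similarity measure
# are illustrative assumptions only.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def recommend_path(student, past_students):
    """past_students: list of (features, succeeded, path) tuples; return the
    path of the most similar student who succeeded."""
    successful = [(f, p) for f, ok, p in past_students if ok]
    best_features, best_path = max(successful, key=lambda fp: cosine(student, fp[0]))
    return best_path
```

Both of Downes's objections apply directly to a sketch like this: the statistically "most similar" student may not resemble the individual at all, and the definition of "succeeded" is baked in by the system, not chosen by the learner.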
2016  education  teaching  learning  differentiation  personallearning  personalization  personalizedlearning  unschooling  deschooling  independence  schools  stephendowns  lcproject  openstudioproject  pedagogy  curriculum  adhoc  informallearning  decisionmaking  self-directed  self-directedlearning  tcsnmy  howwelearn  howweteach  data  bigdata  measurement  analytics  sfsh 
february 2016 by robertogreco
Digital Manifesto Archive
"This collection aggregates manifestos concerned with making as a subpractice of the digital humanities."



"This archive is an academic resource dedicated to aggregating and cataloging manifestos that fall under two basic criteria. 1) The Digital Manifesto Archive features manifestos that focus on the political and cultural dimensions of digital life. 2) The Digital Manifesto Archive features manifestos that are written, or are primarily disseminated, online.

The manifesto genre is, by definition, timely and politically focused. Further, it is a primary site of political, cultural, and social experimentation in our contemporary world. Manifestos that are created and disseminated online further this experimental ethos by fundamentally expanding the character and scope of the genre.

Each category listed on the archive is loosely organized by theme, political affiliation, and (if applicable) time period. While the political movements and affiliations of the manifestos archived in each category are not universal, each category does try to capture a broad spectrum of political moods and actions with regard to its topic.

This site is meant to preserve manifestos for future research and teaching. The opinions expressed by each author are their own.

This archive was created by Matt Applegate. Our database and website was created by Graham Higgins (gwhigs). It is maintained by Matt Applegate and Yu Yin (Izzy) To.
You can contact us at digitalmanifestoarchive@gmail.com.

This project is open source. You can see gwhigs' work for the site here: Digital Manifesto Archive @ Github.com"
manifestos  digital  digitalhumanities  archives  making  mattapplegate  yuyin  designfiction  criticalmaking  engineering  capitalism  feminism  hacking  hacktivism  digitalmarkets  digitaldiaspora  internetofthings  iot  cyberpunk  mediaecology  media  publishing  socialmedia  twitter  ethics  digitalculture  piracy  design  bigdata  transhumanism  utopianism  criticaltheory  mediaarchaeology  opensource  openaccess  technofeminism  gaming  digitalaesthetics  digitaljournalism  journalism  aesthetics  online  internet  web  technocracy  archaeology  education  afrofuturism  digitalart  art  blogging  sopa  aaronswartz  pipa  anarchism  anarchy 
february 2016 by robertogreco
What World Are We Building? — Data & Society: Points — Medium
"It’s easy to love or hate technology, to blame it for social ills or to imagine that it will fix what people cannot. But technology is made by people. In a society. And it has a tendency to mirror and magnify the issues that affect everyday life. The good, bad, and ugly."



"1. Inequity All Over Again

While social media was being embraced, I was doing research, driving around the country talking with teenagers about how they understood technology in light of everything else taking place in their lives. I watched teens struggle to make sense of everyday life and their place in it. And I watched as privileged parents projected their anxieties onto the tools that were making visible the lives of less privileged youth.

As social media exploded, our country’s struggle with class and race got entwined with technology. I will never forget sitting in small town Massachusetts in 2007 with a 14-year-old white girl I call Kat. Kat was talking about her life when she made a passing reference to why her friends had all quickly abandoned MySpace and moved to Facebook: because it was safer, and MySpace was boring. Whatever look I gave her at that moment made her squirm. She looked down and said,
‘I’m not really into racism, but I think that MySpace now is more like ghetto or whatever, and…the people that have Facebook are more mature… The people who use MySpace — again, not in a racist way — but are usually more like [the] ghetto and hip-hop/rap lovers group.’


As we continued talking, Kat became more blunt and told me that black people use MySpace and white people use Facebook.

Fascinated by Kat’s explanation and discomfort, I went back to my field notes. Sure enough, numerous teens had made remarks that, with Kat’s story in mind, made it very clear that a social division had unfolded between teens using MySpace and Facebook during the 2006–2007 school year. I started asking teens about these issues and heard many more accounts of how race affected engagement. "



"The techniques we use at Crisis Text Line are the exact same techniques that are used in marketing. Or personalized learning. Or predictive policing. Predictive policing, for example, involves taking prior information about police encounters and using that to make a statistical assessment about the likelihood of crime happening in a particular place or involving a particular person. In a very controversial move, Chicago has used such analytics to make a list of people most likely to be a victim of violence. In an effort to prevent crime, police officers approached those individuals and used this information in an effort to scare them to stay out of trouble. But surveillance by powerful actors doesn’t build trust; it erodes it. Imagine that same information being given to a social worker. Even better, to a community liaison. Sometimes, it’s not the data that’s disturbing, but how it’s used and by whom.

3. The World We’re Creating

Knowing how to use data isn’t easy. One of my colleagues at Microsoft Research — Eric Horvitz — can predict with startling accuracy whether someone will be hospitalized based on what they search for. What should he do with that information? Reach out to people? That’s pretty creepy. Do nothing? Is that ethical? No matter how good our predictions are, figuring out how to use them is a complex social and cultural issue that technology doesn’t solve for us. In fact, as it stands, technology is just making it harder for us to have a reasonable conversation about agency and dignity, responsibility and ethics.

Data is power. Increasingly we’re seeing data being used to assert power over people. It doesn’t have to be this way, but one of the things that I’ve learned is that, unchecked, new tools are almost always empowering to the privileged at the expense of those who are not.

For most media activists, unfettered Internet access is at the center of the conversation, and that is critically important. Today we’re standing on a new precipice, and we need to think a few steps ahead of the current fight.

We are moving into a world of prediction. A world where more people are going to be able to make judgments about others based on data. Data analysis that can mark the value of people as worthy workers, parents, borrowers, learners, and citizens. Data analysis that has been underway for decades but is increasingly salient in decision-making across numerous sectors. Data analysis that most people don’t understand.

Many activists will be looking to fight the ecosystem of prediction — and to regulate when and where prediction can be used. This is all fine and well when we’re talking about how these technologies are designed to do harm. But more often than not, these tools will be designed to be helpful, to increase efficiency, to identify people who need help. Their positive uses will exist alongside uses that are terrifying. What do we do?

One of the most obvious issues is the limited diversity of people who are building and using these tools to imagine our future. Statistical and technical literacy isn’t even part of the curriculum in most American schools. In our society where technology jobs are high-paying and technical literacy is needed for citizenry, less than 5% of high schools offer AP computer science courses. Needless to say, black and brown youth are much less likely to have access, let alone opportunities. If people don’t understand what these systems are doing, how do we expect people to challenge them?

We must learn how to ask hard questions of technology and of those making decisions based on data-driven tech. And opening the black box isn’t enough. Transparency of data, algorithms, and technology isn’t enough. We need to build assessment into any system that we roll out. You can’t just put millions of dollars of surveillance equipment into the hands of the police in the hope of creating police accountability, yet, with police body-worn cameras, that’s exactly what we’re doing. And we’re not even trying to assess the implications. This is probably the fastest roll-out of a technology out of hope, and it won’t be the last. How do we get people to look beyond their hopes and fears and actively interrogate the trade-offs?

Technology plays a central role — more and more — in every sector, every community, every interaction. It’s easy to screech in fear or dream of a world in which every problem magically gets solved. To make the world a better place, we need to start paying attention to the different tools that are emerging and learn to frame hard questions about how they should be put to use to improve the lives of everyday people.

We need those who are thinking about social justice to understand technology and those who understand technology to commit to social justice."
danahboyd  inequality  technology  2016  facebook  myspace  race  racism  prejudice  whiteflight  bigdata  indifference  google  web  online  internet  christinaxu  bias  diversity  socialjustice 
february 2016 by robertogreco
Tracing You (2015) -- by Benjamin Grosser
"computational surveillance system

Tracing You presents a website’s best attempt to see the world from its visitors’ viewpoints. By cross referencing visitor IP addresses with available online data sources, the system traces each visitor back through the network to its possible origin. The end of that trace is the closest available image that potentially shows the visitor’s physical environment. Sometimes what this image shows is eerily accurate; other times it is wildly dislocated. What can a computational system know of our environment based on the traces we leave behind? Why might it want to see where we are? How accurate are the system’s data sources and when might they improve? Finally, what does this site’s attempt to trace its visitors reveal about who (or what) is reading the web? By showing how far it sees in real-time, Tracing You provokes these questions and more.

How it Works
Every time you visit a website, the computer serving that site records data about the visit. One piece of that data is the visitor’s Internet Protocol (IP) address. A numerical string (e.g. 203.0.113.4), the IP address uniquely identifies the device used to view the site, whether it’s your phone, laptop, or tablet. Every IP address is registered with the Internet Assigned Numbers Authority, and thus has data associated with the registration. Tracing You starts with this IP address and follows the trail it leaves. First it looks up the IP address using ipinfo to obtain geolocation. This is represented as a latitude/longitude pair (e.g. 48.8631831,2.3629368) that identifies a precise location on the earth. The latitude/longitude is sent to Google, where it queries the Street View, Static Maps, and Javascript Maps data services. Using these services, Tracing You searches for the closest available match it can find, whether it’s a street image in front of the location, an interior image inside the location, or, if nothing else, a satellite image from above (e.g. many locations in China). Once found, this image is combined with text information from ipinfo and shown on the Tracing You interface.

These queries happen so quickly that when you look at the Tracing You interface you should see an image related to you. You will be the site’s most recent visitor at that moment. The image you see may be very close to your current location, or even photographed from within the building you are in at that moment. Alternatively, the image may be down the block, a few blocks over, or even further. How close it gets is very much dependent on how networks are built, configured, operated, and distributed where you are, which network you use, and the accuracy of the data associated with those networks. The more you look at the site, the more it looks back at you. Big data is continually refining its “picture” of the world. As that picture becomes more resolved, Tracing You will get more accurate. As new data sources become available, I will integrate them into the work."

[See also: http://bengrosser.com/projects/tracing-you/ ]
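The pipeline Grosser describes, IP address to ipinfo geolocation to Google Street View imagery, can be sketched in a few lines. The ipinfo and Street View Static API endpoints below are real services, but the response handling is simplified and "YOUR_KEY" is a placeholder.

```python
# Minimal sketch of the Tracing You pipeline: IP -> geolocation -> imagery.
# Endpoints are real (ipinfo.io, Street View Static API); error handling is
# omitted and "YOUR_KEY" is a placeholder API key.
import json
import urllib.request

IPINFO_URL = "https://ipinfo.io/{ip}/json"
STREETVIEW_URL = ("https://maps.googleapis.com/maps/api/streetview"
                  "?size=640x640&location={lat},{lng}&key={key}")

def geolocate(ip):
    """Look up an IP with ipinfo and return its (lat, lng) pair."""
    with urllib.request.urlopen(IPINFO_URL.format(ip=ip)) as resp:
        info = json.load(resp)
    lat, lng = info["loc"].split(",")  # e.g. "48.8631831,2.3629368"
    return float(lat), float(lng)

def streetview_url(lat, lng, key="YOUR_KEY"):
    """Build the Street View Static API request for the traced location."""
    return STREETVIEW_URL.format(lat=lat, lng=lng, key=key)
```

As the piece notes, the accuracy of the resulting image depends entirely on the quality of the registration data behind the IP address, which is exactly what makes the result sometimes eerie and sometimes wildly dislocated.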
2015  benjamingrosser  google  internet  ip  maps  mapping  googlestreetview  streetview  data  ipaddresses  bigdata  networks  online 
january 2016 by robertogreco
Smart Pipe | Infomercials | Adult Swim - YouTube
"Everything in our lives is connected to the internet, so why not our toilets? Take a tour of Smart Pipe, the hot new tech startup that turns your waste into valuable information and fun social connectivity."
adultswim  designfiction  2014  data  bigdata  privacy  smartcities  internetofthings  iot  information  connectivity 
december 2015 by robertogreco
Haunted By Data
[https://www.youtube.com/watch?v=GAXLHM-1Psk
https://www.oreilly.com/ideas/haunted-by-data ]

"You're thinking, okay Maciej, your twelve minutes of sophistry and labored analogies have convinced me that my entire professional life is a lie. What should I do about it?

I hope to make you believe data collection is a trade-off. It hurts the people whose data you collect, but it also hurts your ability to think clearly. Make sure that it's worth it!

I'm not claiming that the sponsors of this conference are selling you a bill of goods. I'm just heavily implying it.

Here's what I want you do specifically:

Don't collect it!

If you can get away with it, just don't collect it! Just like you don't worry about getting mugged if you don't have any money, your problems with data disappear if you stop collecting it.

Switch from the hoarder's mentality of 'keep everything in case it comes in handy' to a minimalist approach of collecting only what you need.

Your marketing team will love you. They can go tell your users you care about privacy!

If you have to collect it, don't store it!

Instead of stocks and data mining, think in terms of sampling and flows. "Sampling and flows" even sounds cooler. It sounds like hip-hop!

You can get a lot of mileage out of ephemeral data. There's an added benefit that people will be willing to share things with you they wouldn't otherwise share, as long as they can believe you won't store it. All kinds of interesting applications come into play.

If you have to store it, don't keep it!

Certainly don't keep it forever. Don't sell it to Acxiom! Don't put it in Amazon glacier and forget it.

I believe there should be a law that limits behavioral data collection to 90 days, not because I want to ruin Christmas for your children, but because I think it will give us all better data while clawing back some semblance of privacy.

Finally, don't be surprised. The current model of total surveillance and permanent storage is not tenable.

If we keep it up, we'll have our own version of Three Mile Island, some widely-publicized failure that galvanizes popular opinion against the technology.

At that point people who are angry, mistrustful, and may not understand a thing about computers will regulate your industry into the ground. You'll be left like those poor saps who work in the nuclear plants, who have to fill out a form in triplicate anytime they want to sharpen a pencil.

You don't want that. Even I don't want that.

We can have that radiant future but it will require self-control, circumspection, and much more concern for safety than we've been willing to show.

It's time for us all to take a deep breath and pull off those radium underpants.

Thank you very much for your time, and please enjoy the rest of your big data conference."
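Ceglowski's "sampling and flows" alternative to hoarding can be made concrete: keep running counts and a small uniform sample, and never retain the full event log. The sketch below uses standard reservoir sampling; the event shape and reservoir size are illustrative choices, not anything from the talk.

```python
# Sketch of "sampling and flows" instead of hoarding: running counts plus a
# small uniform reservoir sample, with no full event log ever stored.
import random

class SamplingFlow:
    def __init__(self, reservoir_size=100, seed=None):
        self.k = reservoir_size
        self.n = 0            # how many events were seen, not the events
        self.reservoir = []   # uniform sample of at most k events
        self.rng = random.Random(seed)

    def observe(self, event):
        """Reservoir sampling: each event ends up in the sample with
        probability k/n, so the sample stays uniform over the stream."""
        self.n += 1
        if len(self.reservoir) < self.k:
            self.reservoir.append(event)
        else:
            j = self.rng.randrange(self.n)
            if j < self.k:
                self.reservoir[j] = event
```

The point of the design is that aggregate questions (rates, distributions) stay answerable while the raw behavioral data that Ceglowski warns about simply never exists to be breached, subpoenaed, or sold.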
maciejceglowski  data  privacy  surveillance  bigdata  2015  storage  radioactivity  datacollection  maciejcegłowski 
october 2015 by robertogreco
Is It Time to Give Up on Computers in Schools?
"This is a version of the talk I gave at ISTE today on a panel titled "Is It Time to Give Up on Computers in Schools?" with Gary Stager, Will Richardson, Martin Levins, David Thornburg, and Wayne D'Orio. It was pretty damn fun.

Take one step into that massive shit-show called the Expo Hall and it’s hard not to agree: “yes, it is time to give up on computers in schools.”

Perhaps, once upon a time, we could believe ed-tech would change things. But as Seymour Papert noted in The Children’s Machine,
Little by little the subversive features of the computer were eroded away: … the computer was now used to reinforce School’s ways. What had started as a subversive instrument of change was neutralized by the system and converted into an instrument of consolidation.

I think we were naive when we ever thought otherwise.

Sure, there are subversive features, but I think the computers also involve neoliberalism, imperialism, libertarianism, and environmental destruction. They now involve high stakes investment by the global 1% – it’s going to be a $60 billion market by 2018, we’re told. Computers are implicated in the systematic de-funding and dismantling of a public school system and a devaluation of human labor. They involve the consolidation of corporate and governmental power. They involve scientific management. They are designed by white men for white men. They re-inscribe inequality.

And so I think it’s time now to recognize that if we want education that is more just and more equitable and more sustainable, that we need to get the ideologies that are hardwired into computers out of the classroom.

In the early days of educational computing, it was often up to innovative, progressive teachers to put a personal computer in their classroom, even paying for the computer out of their own pocket. These were days of experimentation, and as Seymour teaches us, a re-imagining of what these powerful machines could enable students to do.

And then came the network and, again, the mainframe.

You’ll often hear the Internet hailed as one of the greatest inventions of mankind – something that connects us all and that has, thanks to the World Wide Web, enabled the publishing and sharing of ideas at an unprecedented pace and scale.

What “the network” introduced in educational technology was also a more centralized control of computers. No longer was it up to the individual teacher to have a computer in her classroom. It was up to the district, the Central Office, IT. The sorts of hardware and software that were purchased had to meet those needs – the needs and desires of the administration, not the needs and desires of innovative educators, and certainly not the needs and desires of students.

The mainframe never went away. And now, virtualized, we call it “the cloud.”

Computers and mainframes and networks are points of control. They are tools of surveillance. Databases and data are how we are disciplined and punished. Quite to the contrary of Seymour’s hopes that computers would liberate learners, this will be how we are monitored and managed. Teachers. Students. Principals. Citizens. All of us.

If we look at the history of computers, we shouldn’t be that surprised. The computer’s origins are in war: Alan Turing, Bletchley Park, code-breakers and cryptography. IBM in Germany developed machines and databases that it sold to the Nazis in order to efficiently collect the identities and whereabouts of Jews.

The latter should give us great pause as we tout programs and policies that collect massive amounts of data – “big data.” The algorithms that computers facilitate drive more and more of our lives. We live in what law professor Frank Pasquale calls “the black box society.” We are tracked by technology; we are tracked by companies; we are tracked by our employers; we are tracked by the government, and “we have no clear idea of just how far much of this information can travel, how it is used, or its consequences.” When we compel the use of ed-tech, we are doing this to our students.

Our access to information is constrained by these algorithms. Our choices, our students’ choices are constrained by these algorithms – and we do not even recognize it, let alone challenge it.

We have convinced ourselves, for example, that we can trust Google with its mission: “To organize the world’s information and make it universally accessible and useful.” I call “bullshit.”

Google is at the heart of two things that computer-using educators should care deeply and think much more critically about: the collection of massive amounts of our personal data and the control over our access to knowledge.

Neither of these are neutral. Again, these are driven by ideology and by algorithms.

You’ll hear the ed-tech industry gleefully call this “personalization.” More data collection and analysis, they contend, will mean that the software bends to the student. To the contrary, as Seymour pointed out long ago, instead we find the computer programming the child. If we do not unpack the ideology, if the algorithms are all black-boxed, then “personalization” will be discriminatory. As Tressie McMillan Cottom has argued “a ‘personalized’ platform can never be democratizing when the platform operates in a society defined by inequalities.”

If we want schools to be democratizing, then we need to stop and consider how computers are likely to entrench the very opposite. Unless we stop them.

In the 1960s, the punchcard – an older piece of “ed-tech” – had become a symbol of our dehumanization by computers and by a system – an educational system – that was inflexible, impersonal. We were being reduced to numbers. We were becoming alienated. These new machines were increasing the efficiency of a system that was setting us up for a life of drudgery and that was sending us off to war. We could not be trusted with our data or with our freedoms or with the machines themselves, we were told, as the punchcards cautioned: “Do not fold, spindle, or mutilate.”

Students fought back.

Let me quote here from Mario Savio, speaking on the stairs of Sproul Hall at UC Berkeley in 1964 – over fifty years ago, yes, but I think still one of the most relevant messages for us as we consider the state and the ideology of education technology:
We’re human beings!

There is a time when the operation of the machine becomes so odious, makes you so sick at heart, that you can’t take part; you can’t even passively take part, and you’ve got to put your bodies upon the gears and upon the wheels, upon the levers, upon all the apparatus, and you’ve got to make it stop. And you’ve got to indicate to the people who run it, to the people who own it, that unless you’re free, the machine will be prevented from working at all!

We’ve upgraded from punchcards to iPads. But underneath, a dangerous ideology – a reduction to 1s and 0s – remains. And so we need to stop this ed-tech machine."
edtech  education  audreywatters  bias  mariosavio  politics  schools  learning  tressiemcmillancottom  algorithms  seymourpapert  personalization  data  security  privacy  howweteach  howwelearn  subversion  computers  computing  lms  neoliberalism  imperialism  environment  labor  publicschools  funding  networks  cloud  bigdata  google  history 
july 2015 by robertogreco
Lemming Suicide Is a Myth That Was Perpetuated by Disney
"We've all heard that lemmings jump into the sea every year, drowning themselves because they are just following the herd. Except they don't. That's actually a myth invented for a Disney wildlife documentary, and it has blinded us to the truth about the weird lives of lemmings for decades.

Lemmings are small, fluffy rodents that live mostly in the Arctic, thriving on the snowy tundra in places like Norway, northern Alaska, and Siberia. One of the great mysteries of lemmings is their odd population cycle. Like many rodents, their population expands every few years. But among lemmings, this explosion is dramatic — every few years, their population grows 100 to 1000 times larger in just one winter season.

These events are often called lemming outbreaks because the rodents will migrate all over the place looking for food, even swimming across rivers and lakes to find plants and mosses to eat. Occasionally, they fall off rocks or cliffs as they scramble to find sustenance. Then, just as abruptly, their population crashes into near-extinction.

For centuries, legends have formed around this odd cycle. Where do the lemming outbreaks come from? And what happens to all of the rodents afterwards? One popular myth was that they all just jumped into the ocean and died. Back in 1958, Disney was making a documentary about Arctic wildlife, White Wilderness, and decided that legends were as good as facts. So they brought in a truck full of lemmings to throw into the Arctic Ocean. First, they put a bunch of the lemmings on a big, snowy turntable to film them "running" toward some cliffs. Then they shoved hundreds of these poor little guys over the cliff, into the ocean, where they (not surprisingly) drowned after trying to swim.

The original clip from White Wilderness is below. Knowing that these lemmings were deliberately shoved over the cliff and drowned makes this pretty upsetting, so be warned before watching.

This film is what gave rise to many sayings about how lemmings follow the herd no matter what. And of course it's misled many people into thinking that lemmings commit mass suicide on a regular basis.

The reality is actually just as mysterious as the legends. Lemmings are one of the only true Arctic rodents, and they prefer to reproduce in winter. During especially cold winters, or at chilly high altitudes, they will have far more offspring. Lemming population booms, according to researchers' observations, are dependent on icy cold weather. There are likely a few reasons for this, but most important would be that they've adapted to cold weather systems — when there's a long, intense winter, these little guys breed like crazy under the snow.

As for what causes the lemmings' massive population crash — we still aren't completely sure. We know for certain that it's not mass suicide, but we also know that more adult lemmings die during outbreak years. So the population is huge, but a lot more of the animals are dying than during a typical year.

This high lemming death rate could be because the expanded population suffers food shortages, or it could be caused by predators chowing down on these tasty creatures that are suddenly everywhere underfoot. Another possibility is that there is a lot more infanticide because so many males want to mate with females — and killing a female's brood will make her ready to mate again.

Because the most intriguing part of the lemmings' lifecycle takes place under the Arctic ice, it's been hard to observe them and find out what's driving their population flux. But one thing is for certain. With the Arctic warming, there are likely to be fewer and fewer lemming outbreaks. And nobody is really sure what that will do to the typical lemming population.

For now, lemmings remain a strange and adorable mystery of the tundra."

[I never bookmarked this when I used it in January: "In 1958, Disney made a film about big data. https://www.youtube.com/watch?v=xMZlr5Gf9yY More info: http://www.snopes.com/disney/films/lemmings.asp + http://factually.gizmodo.com/lemmings-dont-commit-mass-suicide-disney-pushed-them-o-1614038696 "
https://twitter.com/rogre/status/556196381258293248 ]
lemmings  disney  1958  nature  animals  propaganda  data  bigdata  herdmentality  slander  arctic  tundra  annaleenewitz  2015 
april 2015 by robertogreco
Ideas About Education Reform: 22 Things We Do As Educators That Will Embarrass Us In 25 Years by Terry Heick
"22 Things We Do As Educators That Will Embarrass Us In 25 Years
by Terry Heick

Saw a picture today from the 1970s of a mother driving her car with her newborn baby in the passenger seat (no car seat). This, of course, got me thinking about education. What do we do now that in 25 years we’ll look back on and shake our heads? What are our “doctors smoking cigarettes while giving check ups” moments? I have a feeling we’re going to look back and be really confused by quite a bit. There’s probably a lot more than this, but I had to stop somewhere.

22 Things Education Does That Will Embarrass Us In 25 Years

1. We separated literacy from content.
And were confused when we couldn’t properly untangle them.

2. We metered progress by grade levels.
Right now, progress through academia is incremental, like inches on a ruler. These increments are marked by “grade levels,” which really have no meaning other than the artificial one schools have given them in the most self-justifying, circular argument ever.

3. We frowned upon crowdsourced content (e.g., Wikipedia)
Even though it has more updates and cross-checks than more traditional sources of info. It’s not perfect, but it’s the future. Err, present.

4. We gave vacations.
Why do we feel the need to provide months off at a time from learning to read, write, and think? We made school so bad that students couldn’t stand to do it without “vacations”? We cleaved it so cleanly from their daily lives that they “stopped” learning for months at a time?

5. We closed off schools from communities.
Which was the first (of many) errors. Then we let the media report on school progress under terms so artificially binary that we ended up dancing to the drum of newspaper headlines and political pressure.

6. We made it clumsy and awkward for teachers to share curriculum.
Seriously. How is there no seamless, elegant, and mobile way to do this?

7. We turned content into standards.
This makes sense until you realize that, by design, the absolute best this system will yield is students that know content.

8. We were blinded by data, research, and strategies….
..so we couldn’t see the communities, emotions, and habits that really drive learning.

9. We measured mastery once.
At the end of the year in marathon testing. And somehow this made sense? And performance on these tests gave us data that informed the very structures our schools were iterated with over time? Seriously? And we wonder why we chased our tails?

10. We spent huge sums of money on professional development.
While countless free resources floated around us in the digital ether. Silly administrators.

11. We reported progress with report cards.
Hey, I’ve tried other ways and parents get confused and downright feisty. We did a poor job helping parents understand what grades really meant, and so they insisted on the formats they grew up with.

12. We banned early mobile technology (in this case, smartphones).
And did so for entirely non-academic reasons.

13. We shoehorned technology into dated learning models.
Like adding rockets to a tractor. Why did we not replace the tractor first?

14. We measured mastery with endless writing prompts and multiple-choice tests.
Which, while effective in spots, totally missed the brilliant students who, for whatever reason, never could shine on them.

15. We had parent conferences twice a year.
What? And still only had 15% of parents show up? And we didn’t completely freak out? We must’ve been really sleepy.

16. We ignored apprenticeships.
Apprenticeship is a powerful form of personalized learning that completely marries “content,” performance, craft, and communities. But try having 900 apprentices in a school. So much for that.

17. We claimed to “teach students to think for themselves.”
LOL

18. We often put 1000 or more students in the same school.
And couldn’t see how the learning could possibly become industrialized.

19. We frowned on lectures.
Even though that’s essentially what TED Talks are. Instead of making them engaging and interactive multimedia performances led by adults that love their content, we turned passionate teachers into clinical managers of systems and data.

20. We ignored social learning.
And got learning that was neither personal nor social. Curious.

21. We tacked on digital citizenship.
The definition of digital citizenship is “the quality of actions, habits, and consumption patterns that impact the ecology of digital content and communities.” This is artificial to teach outside of the way students use these tools and places on a daily basis – which makes hanging a “digital citizenship” poster or teaching a “digital citizenship” lesson insufficient. Like literacy, it needs to be fully integrated into the learning experiences of students.

22. We turned to curriculum that was scripted and written by people thousands of miles away.
We panicked, and it was fool’s gold.

Bonus 23. We chewed teachers up and spit them out
We made teachers entirely responsible for planning, measuring, managing, and responding to both mastery and deficiency. And through peer pressure, a little brainwashing, and appealing to their pride, somehow convinced them they really were."
education  schools  teaching  howweteach  howwelearn  unschooling  deschooling  terryheick  literacy  content  curriculum  gradelevels  agesegregation  crowdsourcing  wikipedia  community  vacations  standards  standardization  professionaldevelopment  money  waste  bureaucracy  technology  edtech  mobile  phones  smartphones  criticalthinking  socialemotional  civics  citizenship  digitalcitizenship  social  learning  lectures  data  bigdata  quantification  apprenticeships  testing  standardizedtesting  assessment  fail  sharing  socialemotionallearning 
march 2015 by robertogreco
Mapping the Sneakernet – The New Inquiry
"Digital media travels hand to hand, phone to phone across vast cartographies invisible to Big Data"



"Indeed, the song was just one of many media files I saw on people’s phones: There were Chinese kung fu movies, Nigerian comedies, and Ugandan pop music. They were physically transferred, phone to phone, Bluetooth to Bluetooth, USB stick to USB stick, over hundreds of miles by an informal sneakernet of entertainment media downloaded from the Internet or burned from DVDs, bringing media that’s popular in video halls—basically, small theaters for watching DVDs—to their own villages and huts.

In geographic distribution charts of Carly Rae Jepsen’s virality, you’d be hard pressed to find impressions from this part of the world. Nor is this sneakernet practice unique to the region. On the other end of the continent, in Mali, music researcher Christopher Kirkley has documented a music trade using Bluetooth transfers that is similar to what I saw in northern Uganda. These forms of data transfer and access, though quite common, are invisible to traditional measures of connectivity and Big Data research methods. Like millions around the world with direct internet connections, young people in “unconnected” regions are participating in the great viral products of the Internet, consuming mass media files and generating and transferring their own media.

Indeed, the practice of sneakernets is global, with political consequences in countries that try to curtail Internet access. In China, I saw many activists trading media files via USB sticks to avoid stringent censorship and surveillance. As Cuba opens its borders to the world, some might be surprised that citizens have long been able to watch the latest hits from the United States, as this Guardian article notes. Sneakernets also apparently extend into North Korea, where strict government policy means only a small elite have access to any sort of connectivity. According to news reports, Chinese bootleggers and South Korean democracy activists regularly smuggle media on USB sticks and DVDs across the border, which may be contributing to increasing defections, as North Korean citizens come to see how the outside world lives.

Blum imagines the Internet as a series of rivers of data crisscrossing the globe. I find it a lovely visual image whose metaphor should be extended further. Like water, the Internet is vast, familiar and seemingly ubiquitous but with extremes of unequal access. Some people have clean, unfettered and flowing data from invisible but reliable sources. Many more experience polluted and flaky sources, and they have to combine patience and filters to get the right set of data they need. Others must hike dozens of miles of paved and dirt roads to access the Internet like water from a well, ferrying it back in fits and spurts when the opportunity arises. And yet more get trickles of data here and there from friends and family, in the form of printouts, a song played on a phone’s speaker, an interesting status update from Facebook relayed orally, a radio station that features stories from the Internet.

Like water from a river, data from the Internet can be scooped up and irrigated and splashed around in novel ways. Whether it’s north of the Nile in Uganda or south of Market St. in the Bay Area, policies and strategies for connecting the “unconnected” should take into account the vast spectrum of ways that people find and access data. Packets of information can be distributed via SMS and mobile 3G but also pieces of paper, USB sticks and Bluetooth. Solar-powered computer kiosks in rural areas can have simple capabilities for connecting to mobile phones’ SD cards for upload and download. Technology training courses can start with a more nuanced base level of understanding, rather than assuming zero knowledge of the basics of computing and network transfer. These are broad strokes, of course; the specifics of motivation and methods are complex and need to be studied carefully in any given instance. But the very channels that ferry entertainment media can also ferry health care information, educational material and anything else in compact enough form.

There are many maps for the world’s internet tubes and the electric wires that power them, but, like any map, they reflect an inherent bias, in this case toward a single user, binary view of connectivity. This view in turn limits our understanding of just how broad an impact the Internet has had on the world, with social, political and cultural implications that have yet to be fully explored. One critical addition to understanding the internet’s global impact is mapping the many sneakernets that crisscross the “unconnected” parts of the world. The next billion, we might find, are already navigating new cities with Google Maps, trading Korean soaps and Nigerian comedies, and rocking out to the latest hits from Carly Rae Jepsen."
access  africa  internet  online  connectivity  2015  anxiaomina  bigdata  digital  maps  mapping  cartography  bias  sneakernets  p2p  peer2peer  uganda  music  data  bluetooth  mobile  phones  technology  computing  networks  northkorea  christopherkirkley  sms  communication  usb  andrewblum  sneakernet 
march 2015 by robertogreco
The Total Archive.
[See also: http://www.crassh.cam.ac.uk/events/25660

"The Total Archive: Dreams of Universal Knowledge from the Encyclopaedia to Big Data
19 March 2015 - 20 March 2015



The complete system of knowledge is a standard trope of science fiction, a techno-utopian dream and an aesthetic ideal. It is Solomon’s House, the Encyclopaedia and the Museum. It is also an ideology – of Enlightenment, High Modernism and absolute governance.

Far from ending the dream of a total archive, twentieth-century positivist rationality brought it ever closer. From Paul Otlet’s Mundaneum to Mass-Observation, from the Unity of Science movement to Isaac Asimov’s Encyclopedia Galactica, from the Whole Earth Catalog to Wikipedia, the dream of universal knowledge dies hard. These projects triumphantly burst their own bounds, generating more archival material, more information, than can ever be processed. When it encounters well-defined areas – the sportsfield or the model organism – the total archive tracks every movement of every player, records every gene and mutation. Increasingly this approach is inverted: databases are linked; quantities are demanded where only qualities existed before. The Human Genome Project is the most famous, but now there are countless databases demanding ever more varied input. Here the question of what is excluded becomes central.

The total archive is a political tool. It encompasses population statistics, GDP, indices of the Standard of Living and the international ideology of UNESCO, the WHO, the free market and, most recently, Big Data. The information-gathering practices of statecraft are the total archive par excellence, carrying the potential to transfer power into the open fields of economics and law – or divest it into the hands of criminals, researchers and activists.

Questions of the total archive engage key issues in the philosophy of classification, the poetics of the universal, the ideology of surveillance and the technologies of information retrieval. What are the social structures and political dynamics required to sustain total archives, and what are the temporalities implied by such projects?

In order to confront the ideology and increasing reality of interconnected data-sets and communication technologies we need a robust conceptual framework – one that does not sacrifice historical nuance for the ability to speculate. This conference brings together scholars from a wide range of fields to discuss the aesthetics and political reality of the total archive."]
tumblr  classification  maps  knowledge  2015  tumblrs  archives  universality  collections  data  politics  bigdata  history  encyclopedias  paulotlet  mundaneum  isaacasimov  encyclopediagalactica  wholeearthcatalog  museums  ideology  highmodernism  sccifi  sciencefiction  humangenomeproject  libraries  wikipedia  universalknowledge 
march 2015 by robertogreco
Shoshanna Zuboff: Dark Google
"We witness the rise of a new absolute power. Google transfers its radical politics from cyberspace to reality. It will earn its money by knowing, manipulating, and controlling reality, cutting it into the tiniest pieces."



"If there is a single word to describe Google, it is „absolute.” The Britannica defines absolutism as a system in which „the ruling power is not subject to regularized challenge or check by any other agency.” In ordinary affairs, absolutism is a moral attitude in which values and principles are regarded as unchallengeable and universal. There is no relativism, context-dependence, or openness to change.

Six years ago I asked Eric Schmidt what corporate innovations Google was putting in place to ensure that its interests were aligned with its end users. Would it betray their trust? Back then his answer stunned me. He and Google’s founders control the super-voting class B stock. This allows them, he explained, to make decisions without regard to short-term pressure from Wall Street. Of course, it also insulates them from every other kind of influence. There was no wrestling with the creation of an inclusive, trustworthy, and transparent governance system. There was no struggle to institutionalize scrutiny and feedback. Instead Schmidt’s answer was the quintessence of absolutism: „trust me; I know best.” At that moment I knew I was in the presence of something new and dangerous whose effects reached beyond narrow economic contests and into the heart of everyday life."
ethics  google  surveillance  shoshanazuboff  2014  business  politics  data  evil  bigdata  power  control  innovation  absolutism  ericschmidt  finance  capitalism  nsa  colonization  self-determination  reality  raykurzweil  europe 
december 2014 by robertogreco
Yeah, We're Really Screwing This Up | Just Visiting @insidehighered
"It’s not coincidental that the sorts of places that allow students to make mistakes in the name of exploration, Harvard, Yale, Stanford, et al, will not be putting these algorithms to use because the university as paternalistic surveillance state isn’t consistent with genuine student welfare.

Freedom’s just another word for something that’s too expensive for anyone other than the wealthy to possess.

The chief beneficiaries of the surveillance state university will be the corporations (some of which will be housed inside the universities) that stand to make billions selling these tech-based “solutions” to problems we’ve created because we refuse to see education as a collective endeavor, because we refuse to see education as a public good.

To my eye, Blumenstyk describes a dystopia. That she seems to be urging us to move towards it makes me wonder if I’m inside a nightmare.

A significant portion of my students fear failure even when there are no stakes. I can’t imagine what happens when they receive their daily or hourly or even real-time alerts on their academic progress sent to their biofeedback sensors.

Will I have students jolting awake in class when those sensors detect a slowing of respiration and pulse?

Paging Professor Pavlov.

Maybe I too can be assessed using the data. We shall conduct class inside giant MRI machines that measure our brain activity. The brighter the lights on the scan, the better the instructor.

RateMyProfessor better get on this, lest they be left behind.

--
I have a different idea to the current crisis: Let’s lower the stakes.

Let’s make education affordable.

Let’s have a society where decent-paying jobs are available to people without four-year degrees.

Let’s pay all faculty a wage that allows them to do the kind of work that makes a difference in students’ lives.

I will never understand why we subject students to treatment we would never accept for ourselves.

And yet, here we are."

[via: http://tinyletter.com/audreywatters/letters/hack-education-weekly-newsletter-no-89 ]
2014  johnwarner  via:audreywatters  surveillance  highered  highereducation  freedom  inequality  control  quantification  highstakes  education  learning  howweteach  howwelearn  scriptedlearning  affordability  society  bigdata  goldieblumenstyk 
december 2014 by robertogreco
Convivial Tools in an Age of Surveillance
"What would convivial ed-tech look like?

The answer can’t simply be “like the Web” as the Web is not some sort of safe and open and reliable and accessible and durable place. The answer can’t simply be “like the Web” as though the move from institutions to networks magically scrubs away the accumulation of history and power. The answer can’t simply be “like the Web” as though posting resources, reference services, peer-matching, and skill exchanges — what Illich identified as the core of his “learning webs” — are sufficient tools in the service of equity, freedom, justice, or hell, learning.

“Like the Web” is perhaps a good place to start, don’t get me wrong, particularly if this means students are in control of their own online spaces — its content, its data, its availability, its publicness. “Like the Web” is convivial, or close to it, if students are in control of their privacy, their agency, their networks, their learning. We all need to own our learning — and the analog and the digital representations or exhaust from that. Convivial tools do not reduce that to a transaction — reduce our learning to a transaction, reduce our social interactions to a transaction.

I'm not sure the phrase "safe space" is quite the right one to build alternate, progressive education technologies around, although I do think convivial tools do have to be “safe” insofar as we recognize the importance of each other’s health and well-being. Safe spaces where vulnerability isn’t a weakness for others to exploit. Safe spaces where we are free to explore, but not to the detriment of those around us. As Illich writes, "A convivial society would be the result of social arrangements that guarantee for each member the most ample and free access to the tools of the community and limit this freedom only in favor of another member’s equal freedom.”

We can’t really privilege “safe” as the crux of “convivial” if we want to push our own boundaries when it comes to curiosity, exploration, and learning. There is risk associated with learning. There’s fear and failure (although I do hate how those are being fetishized in a lot of education discussions these days, I should note.)

Perhaps what we need to build are more compassionate spaces, so that education technology isn’t in the service of surveillance, standardization, assessment, control.

Perhaps we need more brave spaces. Or at least many educators need to be braver in open, public spaces -- not brave to promote their own "brands" but brave in standing with their students. Not "protecting them” from education technology or from the open Web but not leaving them alone, and not opening them to exploitation.

Perhaps what we need to build are more consensus-building, not consensus-demanding, tools. Mike Caulfield gets at this in a recent keynote about “federated education.” He argues that "Wiki, as it currently stands, is a consensus *engine*. And while that’s great in the later stages of an idea, it can be deadly in those first stages.” Caulfield relates the story of the Wikipedia entry on Kate Middleton’s wedding dress, which, 16 minutes after it was created, "someone – and in this case it probably matters that it was a dude – came and marked the page for deletion as trivial, or as they put it 'A non-notable article incapable of being expanded beyond a stub.’” Debate ensues on the entry’s “talk” page, until finally Jimmy Wales steps in with his vote: a “strong keep,” adding "I hope someone will create lots of articles about lots of famous dresses. I believe that our systemic bias caused by being a predominantly male geek community is worth some reflection in this context.”

Mike Caulfield has recently been exploring a different sort of wiki, also by Ward Cunningham. This one — called the Smallest Federated Wiki — doesn’t demand consensus like Wikipedia does. Not off the bat. Instead, entries — and this can be any sort of text or image or video, it doesn’t have to “look like” an encyclopedia — live on federated servers. Instead of everyone collaborating in one space on one server like a “traditional” wiki, the work is distributed. It can be copied and forked. Ideas can be shared and linked; it can be co-developed and co-edited. But there isn’t one “vote” or one official entry that is necessarily canonical.
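The federated model described above – pages that are copied and forked across servers, each carrying its provenance, with no single canonical version – can be sketched in a few lines. This is a hypothetical illustration, not the Smallest Federated Wiki's actual data model; the `Page` class, the server names, and the `fork` method are all invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Page:
    """A wiki page that lives on one server but can be forked to others.

    Hypothetical sketch of the federated-wiki idea: each server keeps its
    own copy, and `history` records where a copy was forked from.
    """
    title: str
    content: str
    server: str
    history: list = field(default_factory=list)

    def fork(self, new_server: str) -> "Page":
        """Copy this page to another server, recording provenance.
        Neither copy is canonical; each server may edit its own."""
        return Page(
            title=self.title,
            content=self.content,
            server=new_server,
            history=self.history + [self.server],
        )

# One author's entry on one server...
original = Page("Kate Middleton's wedding dress", "A description...", "alice.example")

# ...forked to another server and independently edited,
# with no "vote" and no single official version.
copy = original.fork("bob.example")
copy.content = "A description, plus more detail..."

print(copy.history)                      # ['alice.example']
print(original.content != copy.content)  # True
```

The design choice the sketch tries to show: where a traditional wiki forces every edit through one shared page (a consensus engine), federation makes divergence the default and leaves convergence, if any, to later.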

Rather than centralized control, conviviality. This distinction between Wikipedia and Smallest Federated Wiki echoes too what Illich argued: that we need to be able to identify when our technologies become manipulative. We need "to provide guidelines for detecting the incipient stages of murderous logic in a tool; and to devise tools and tool systems that optimize the balance of life, thereby maximizing liberty for all."

Of course, we need to recognize, those of us that work in ed-tech and adopt ed-tech and talk about ed-tech and tech writ large, that convivial tools and a convivial society must go hand-in-hand. There isn’t any sort of technological fix to make education better. It’s a political problem, that is, not a technological one. We cannot come up with technologies that address systematic inequalities — those created by and reinscribed by education — unless we are willing to confront those inequalities head on. Those radical education writers of the Sixties and Seventies offered powerful diagnoses about what was wrong with schooling. The progressive education technologists of the Sixties and Seventies imagined ways in which ed-tech could work in the service of dismantling some of the drudgery and exploitation.

But where are we now? Instead we find ourselves with technologies working to make that exploitation and centralization of power even more entrenched. There must be alternatives — both within and without technology, both within and without institutions. Those of us who talk and write and teach ed-tech need to be pursuing those things, and not promoting consumption and furthering institutional and industrial control. In Illich’s words: "The crisis I have described confronts people with a choice between convivial tools and being crushed by machines.""
toolforconviviality  ivanillich  audreywatters  edtech  technology  education  2014  seymourpapert  logo  alankay  dynabook  mikecaufield  wardcunningham  web  internet  online  schools  teaching  progressive  wikipedia  smallestfederatedwiki  wikis  society  politics  policy  decentralization  surveillance  doxxing  gamergate  drm  startups  venturecapital  bigdata  neilpostman  paulofreire  paulgoodman  datapalooza  knewton  computers  computing  mindstorms  control  readwrite  everettreimer  1960s  1970s  jonathankozol  disruption  revolution  consensus  safety  bravery  courage  equity  freedom  justice  learning 
november 2014 by robertogreco
The Pitfalls of Productivity - NYTimes.com
"There’s also the question of who really benefits when workers get more done. Mr. Poole writes critically of companies’ productivity initiatives:

“The latest wheeze is the Big Data field of ‘workforce science,’ in which everything — patterns of emails, the length of telephone calls — may be measured and consigned to a comparative database to create a perfect management panopticon. It is tempting to suspect that the ambition thus to increase ‘worker productivity’ is aimed at getting more work out of each employee for the same (or less) money.”

And while workers who get more productive may initially see raises or promotions, the labor historian Nelson Lichtenstein told Op-Talk, companies will soon come to expect that higher level of productivity from everybody: “over time, and not very much time, the corporation will say ‘this is the new work norm.’” This has already happened, he added, with the expectation that workers be reachable around the clock. A better approach, he said, would be to improve job protections and stability, since workers are actually more productive when their employment is more secure.

For Mr. Bailey, though, productivity doesn’t necessarily mean working more at your job: “I think everybody has a different reason for wanting to become more productive, and I think you should figure that out before you invest in your productivity,” he said. “I think of productivity as a way to accomplish more meaningful things in a short amount of time, so you can make more time for the things that are actually important to you.”

And Dr. Gregg suggested that the systems we use to organize our work could be used to bring us together rather than to drive us apart. “I would like to encourage a kind of mindfulness that is less individual and more collective,” she said. Her hope for productivity apps and other technologies is that “they’ll allow us to have a better conversation about collective work practices, and what are the conditions that individuals feel that they need to get done what’s being asked of them in the workplace.”

“Mindfulness can also mean being mindful of others,” she said, “and that’s really the collective labor tradition that I would like to see continue.”"
gtd  gettingthingsdone  productivity  busyness  2014  annanorth  chrisbailey  stevenpoole  frederickwinslowtaylor  efficiency  melissagregg  slow  taylorism  jessicalamb-shapiro  bigdata  nelsonlichtenstein  mindfulness  labor  work  capitalism  industrialization 
october 2014 by robertogreco
Smart Stories: Matt Adams (Future of StoryTelling 2014) on Vimeo
"Artist Matt Adams’s work playfully explores the storytelling potential of new technologies. His present fascination is big data. How will stories be influenced by our ability to learn personal details about our audiences? What are the limits of personalization?"

[Also on YouTube: https://www.youtube.com/watch?v=JxxEy78EqYI ]

[See also: http://www.blasttheory.co.uk/news-item/matt-adams-will-be-speaking-at-the-fost-summit/
http://www.blasttheory.co.uk/?wysija-page=1&controller=email&action=view&email_id=53&wysijap=subscriptions&user_id=5691 ]
mattadams  storytelling  technology  blastheory  her  karen  data  bigdata  mobile  2014  interactivefiction  games  play  gaming  personalization  culture  psychology  interactive  if 
october 2014 by robertogreco
Shoshana Zuboff on “Big Data” as Surveillance Capitalism
"VII. HOW TO CONSTRUCT A FUTURE THAT WE CAN CALL HOME

Why is it that the declaration of surveillance capitalism has met so little resistance? Searle’s reasoning is a good guide. Agreement? Yes, there were and are plenty of people who think surveillance capitalism is a reasonable business model. (We’ll have to leave why they think so to another discussion.) Authority? Yes. The tech leaders have been imbued with the authority of expertise and idolized as entrepreneurs. Persuasion? Absolutely. All the neoliberal buzzwords of entrepreneurialism, creative destruction, disruption, etc. persuaded many that these developments were right and necessary. A quid pro quo? Yes, powerfully so. The new free services of search and connection were exactly what we needed and have become essential to social participation. When Facebook went down last month, a lot of Americans called 911 (emergency services).

Was there any use of force or other means to foreclose alternatives? No military force was needed. Instead, as the new logic became the dominant business model for online companies and start-ups, it spawned millions of related institutionalized facts — ancillary and intermediary business services, professional specializations, new language, IPOs, tons of cash, network effects, unprecedented concentrations of information power. All these limit our sense that there can be any alternative. And finally, how about a lack of understanding? This is the most salient reason of all. Most people did not and could not appreciate the extent to which the new “facts” depended upon surveillance. This colossal asymmetry of understanding helps explain why Edward Snowden was necessary. Somebody had to be Ed Snowden.

What kind of resistance has been offered and why has it failed to stop the spread of surveillance capitalism? Here I depart from Searle in order to introduce two distinct varieties of declaration that I think can help us understand more about how the future unfolds. I suggest that the kind of resistance that has been offered so far takes the form of what I call the “counter-declaration.” A counter-declaration is defensive. It addresses the institutional facts asserted by the declaration. The process of countering seeks to impose constraints or achieve compromise, but it does not annihilate the contested fact. In addressing those facts, it invariably increases their power. Negotiation inevitably legitimates the other. This is why many governments refuse to negotiate with terrorists. As Searle noted, even talking about something or referring to it increases its reality by treating it as a thing that is already real. It’s a classic quicksand situation in that the more you fight it, the more it sucks you in.

What are examples of counter-declarations? Google and other Internet companies have been the targets of many privacy-related lawsuits. Some of these efforts have imposed real constraints, such as prohibiting Google Street View cars from extracting personal data from computers inside homes, or the class action that resulted in Facebook’s suspension of its invasive “Beacon” program. Legal actions like these can limit certain practices for a time, but they do not topple the institutionalized facts of surveillance capitalism in the target or other companies. Encryption is another counter-declaration. When we encrypt, we acknowledge the reality of the thing we are trying to evade. Rather than undoing that reality, encryption ignites an arms race with the very thing it disputes. Privacy tools like “opt out” or “do not track” are another example. When I click on “do not track,” what I am really saying is “do not track me.” My choice does not stop the company from tracking everyone else.

I want to be clear that I am not critical of counter-declarations. They are necessary and vital. We need more of them. But the point I do want to make is that counter-declarations alone will not stop this train. They run a race that they can never win. They may lead to a balance of power, but they will not in and of themselves construct an alternative to surveillance capitalism.

What will enable us to move forward in a new way? As I see it, we will have to move on to a new kind of declaration that I am calling a “synthetic declaration.” By this I mean a declaration that synthesizes the opposing facts of declaration and counter-declaration. It arises from — and draws to it — new and deeper wellsprings of collective intentionality. It asserts an original vision. If the counter-declaration is check, the synthetic declaration is checkmate.

Does information capitalism have to be based on surveillance? No. But surveillance capitalism has emerged as a leading version of information capitalism. We need new synthetic declarations to define and support other variants of information capitalism that participate in the social order, value people, and reflect democratic principles. New synthetic declarations can provide the framework for a new kind of double movement appropriate to our time.

Are there examples? There are glimmers. The past year brought us Ed Snowden, who asserted a new reality at great personal sacrifice by claiming this to be a world in which the information he provided should be shared information. Wikileaks has also operated in this spirit. The EU Court’s decision on the right to be forgotten points in the direction of a synthetic declaration by establishing new facts for the online world. (In my view, it also faltered, perhaps inadvertently, by also establishing new facts that grant Google inappropriate new powers.)

Mathias Doepfner’s open letter to Google chairperson Eric Schmidt, published in FAZ last spring, called for a synthetic declaration in the form of a unique European narrative of the digital, one that is not subjugated to the institutional facts asserted by the Internet giants.

Indeed, I think it can be said that the German people are now drawing on their unique historical experience to produce their own synthetic declaration that insists on a different kind of digital future. Note that The Economist just published an article titled “Googlephobia in Germany.” The aim of such language is to suggest that it’s neurotic and therefore irrational to oppose Google’s practices. It’s a classic counter-declaration that reveals the powerful effect of Germany’s new thinking. The real fear is that Germany might produce a synthetic declaration that opens a space for alternative forms of information capitalism to flourish.

I am mindful of a long list of demands that were damned as “neurotic” and unreasonable in America a century ago, as the contest over 20th century capitalism accelerated: labor unions, a living wage, business regulation, racial equality, women’s right to vote, a high school education…. For anyone who thinks Germany’s concerns are “phobic,” one need only recall the revelations less than a year ago that the NSA was spying on Joaquin Almunia, the EU official who presides over the Google antitrust case. Or the recently published emails that provide fresh glimpses of the collaborative relationship between the NSA and Google. And should we mention that Google’s chairperson, Schmidt, also sits on the board of the Economist Group?

Our world sorely needs more — and more comprehensive — synthetic declarations that point us in a wholly new direction. We need new facts that assert the primacy of humanity, the dignity of the person, the bonds of democratic community strengthened by individual empowerment and knowledge, and the well being of our planet. This does not mean that we should construct utopias. Rather, it means that we should draw upon the authentic promise of the digital — the promise that we grasped before Ed Snowden entered history.

In the shadow and gloom of today’s institutional facts, it has become fashionable to mourn the passing of the democratic era. I say that democracy is the best our species has created so far, and woe to us if we abandon it now. The real road to serfdom is to be persuaded that the declarations of democracy we have inherited are no longer relevant to a digital future. These have been inscribed in our souls, and if we leave them behind— we abandon the best part of ourselves. If you doubt me, try living without them, as I have done. That is the real wasteland, and we should fear it."
soshanazuboff  via:steelemaley  2014  bigdata  declarations  internet  web  online  edwardsnowden  joaquinalmunia  hannaharendt  hamesburnham  frankschirrmacher  germany  europe  advertising  capitalism  surveillancecapitalism  surveillance  privacy  democracy  counterdeclarations  feedom  courage  law  legal  dataexhaust  data  datamining  google 
september 2014 by robertogreco
PLAY Stories: An Interview With John Marshall
"John Marshall's latest project (with rootoftwo) is a weather vane built for the 21st Century: a headless chicken that tracks and responds to Internet “fear levels”. Five of these Whithervanes are installed on the highest points of five buildings in Folkestone, UK for the 2014 Folkestone Triennial (30 August – 2 November)."



"The chickens are four feet tall and made of polyurethane foam coated in polyester resin. Each is controlled by a credit-card sized computer that connects to the Internet and listens in real-time to news reports uploaded by journalists from around the world.

When a report comes in, the computer reads it and works out the GPS coordinates where the event happened. It then calculates the direction and distance of the event from Folkestone. The computer then reads the rest of the report, cross-checking the text with the list of keywords and phrases the Department of Homeland Security uses to monitor social networking sites for terrorist threats. The computer also looks for keywords and phrases gathered in a series of workshops we did with the people of Folkestone about what they are afraid of. The keyword list includes threats as diverse as: race riots, gastro tourists, unemployment and dog poo.

The intensity of fear is indicated by changing colored lighting and the number of spins each chicken makes. There are five levels of fear: 1. Low (Green), 2. Guarded (Blue), 3. Elevated (Yellow), 4. High (Orange) and 5. Severe (Red) - the same as the Homeland Security National Terrorism Advisory System. The five chickens revolve away from the location of each news story."



"Every Whithervane has the same list of keywords and phrases, but each has a unique "score" associated with the terms that reflect the aggregate values of the people that live in each neighborhood where the chickens are located. The "scores" have been weighted using marketing tools based on UK census data that are typically used for targeting junk mail. The computer does a calculation that considers the level of fear in the story for the local population and the distance of the event from Folkestone. For example, the same story about immigration from the European Union will have a different level of fear for different neighborhoods. Folkestone is the first point of entry in the UK for visitors arriving via the Channel Tunnel - this makes for some very complicated local opinions.

The public can also influence the individual Whithervanes by Tweeting to @whithervanes #keepcalm (to reduce) or #skyfalling (to increase) the ambient fear level in the system. If they don't have a Twitter account we have built a website where you can submit a Tweet by clicking a button. There are public access terminals in the Triennial visitor's center in Folkestone."
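The scoring pipeline described above (keyword matching, a per-neighborhood weighting, a distance falloff, and a mapping onto five Homeland Security-style bands) could be sketched roughly as follows. Everything concrete here is invented for illustration: the keyword weights, the falloff formula, and the banding math are hypothetical, since the installation's actual lists and calculations are not given in this excerpt.

```python
# Hypothetical sketch of the Whithervanes fear-scoring logic.
# Weights, falloff, and thresholds are invented for illustration.

FEAR_LEVELS = [
    (1, "Low", "Green"),
    (2, "Guarded", "Blue"),
    (3, "Elevated", "Yellow"),
    (4, "High", "Orange"),
    (5, "Severe", "Red"),
]


def fear_level(story_text, keyword_weights, distance_km):
    """Score one news story for one chicken's neighborhood."""
    text = story_text.lower()
    # Sum the neighborhood-specific weights of every matching keyword.
    raw = sum(w for kw, w in keyword_weights.items() if kw in text)
    # Nearer events weigh more heavily (illustrative falloff only).
    score = raw / (1.0 + distance_km / 1000.0)
    # Clamp the continuous score onto the five advisory bands.
    band = min(5, max(1, 1 + int(score)))
    return FEAR_LEVELS[band - 1]


weights = {"riot": 2.5, "unemployment": 1.0, "dog poo": 0.5}  # invented
print(fear_level("Race riot fears as unemployment rises", weights, 120))
# → (4, 'High', 'Orange')
```

A real deployment would feed `fear_level` from the live news stream and then drive the lights and motor from the returned band.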
johnmarshall  art  weathervanes  internet  fear  2014  news  twitter  rootoftwo  whitervanes  python  raspberrypi  projectideas  bigdata 
august 2014 by robertogreco
The Problem with “Personalization”
"What are the repercussions of radically “personalizing” education through computers? What do we gain? What do we lose?

There’s a very powerful strain of American individualism — and California exceptionalism — that permeates technology: an emphasis on personal responsibility, self-management, autonomy. All that sounds great when and if you frame new technologies in terms of self-directed learning.

But how do we reconcile that individualism with the social and political and community development that schools are also supposed to support? How do we address these strains of individualism and increasingly libertarianism as they permeate the classroom?

What do we do about the communal goals of education, for example — to produce good citizens, if nothing else — if we become maniacally focused on personal goals of education instead? What happens to meaningful moments to collaborate? What happens to discussion? What happens to debate? What happens to the idea that we must work through ideas together — not just in the classroom, but as part of our work and civic responsibilities?

And who gets the “personalized” education delivered to them via adaptive technology? And who gets the “personalization” that we hope a student-centered, progressive education would offer?

This image from a PBS documentary about Rocketship Education haunts me.

The chain of charter schools boasts personalization — “Rocketship uses the most adaptive and personalized programs available, and continues to push Silicon Valley vendors and others to create even more adaptive learning tools,” its website boasts.

So the problem with personalization via adaptive software isn’t simply that “it doesn’t work.” It’s that it might work — work to obliterate meaningful and powerful opportunities for civics, for connection, for community. Work to obliterate agency for students. And work not so much to accelerate learning, but to accelerate educational inequalities."

[Accompanies: "What Should School Leaders Know About Adaptive Learning?" https://modernlearners.com/what-should-school-leaders-know-about-adaptive-learning/ ]

[See also: http://thesprouts.org/blog/rendering-learners-legible ]
rocketshipschools  audreywatters  education  personalization  bigdata  legibility  autonomy  personallearning  learning  schools  policy  adaptivelearningtechnology  data  datacollection  adaptivelearning  adaptivetechnology 
june 2014 by robertogreco
Jeremy Rifkin: "The Zero Marginal Cost Society" | Authors at Google - YouTube
"In The Zero Marginal Cost Society, New York Times bestselling author Jeremy Rifkin describes how the emerging Internet of Things is speeding us to an era of nearly free goods and services, precipitating the meteoric rise of a global Collaborative Commons and the eclipse of capitalism.

Rifkin uncovers a paradox at the heart of capitalism that has propelled it to greatness but is now taking it to its death—the inherent entrepreneurial dynamism of competitive markets that drives productivity up and marginal costs down, enabling businesses to reduce the price of their goods and services in order to win over consumers and market share. (Marginal cost is the cost of producing additional units of a good or service, if fixed costs are not counted.) While economists have always welcomed a reduction in marginal cost, they never anticipated the possibility of a technological revolution that might bring marginal costs to near zero, making goods and services priceless, nearly free, and abundant, and no longer subject to market forces.

Now, a formidable new technology infrastructure—the Internet of things (IoT)—is emerging with the potential of pushing large segments of economic life to near zero marginal cost in the years ahead. Rifkin describes how the Communication Internet is converging with a nascent Energy Internet and Logistics Internet to create a new technology platform that connects everything and everyone. Billions of sensors are being attached to natural resources, production lines, the electricity grid, logistics networks, recycling flows, and implanted in homes, offices, stores, vehicles, and even human beings, feeding Big Data into an IoT global neural network. Prosumers can connect to the network and use Big Data, analytics, and algorithms to accelerate efficiency, dramatically increase productivity, and lower the marginal cost of producing and sharing a wide range of products and services to near zero, just like they now do with information goods.

Rifkin concludes that capitalism will remain with us, albeit in an increasingly streamlined role, primarily as an aggregator of network services and solutions, allowing it to flourish as a powerful niche player in the coming era. We are, however, says Rifkin, entering a world beyond markets where we are learning how to live together in an increasingly interdependent global Collaborative Commons. --macmillan.com

About the Author: Jeremy Rifkin is the bestselling author of twenty books on the impact of scientific and technological changes on the economy, the workforce, society, and the environment. He has been an advisor to the European Union for the past decade.

Mr. Rifkin also served as an adviser to President Nicolas Sarkozy of France, Chancellor Angela Merkel of Germany, Prime Minister Jose Socrates of Portugal, Prime Minister Jose Luis Rodriguez Zapatero of Spain, and Prime Minister Janez Janša of Slovenia, during their respective European Council Presidencies, on issues related to the economy, climate change, and energy security.

Mr. Rifkin is a senior lecturer at the Wharton School's Executive Education Program at the University of Pennsylvania where he instructs CEOs and senior management on transitioning their business operations into sustainable Third Industrial Revolution economies.

Mr. Rifkin holds a degree in economics from the Wharton School of the University of Pennsylvania, and a degree in international affairs from the Fletcher School of Law and Diplomacy at Tufts University."
socialcommons  cooperatives  2014  jeremyrifkin  internetofthings  zeromarginalcostsociety  society  economics  sharing  sharingeconomy  consumers  prosumers  marginalcosts  markets  collaborativecommons  collaboration  capitalism  bigdata  analytics  efficiency  technology  abundance  commons  exchange  networks  qualityoflife  climatechange  google  geopolitics  biosphereconsciousness  cyberterrorism  biosphere  iot 
april 2014 by robertogreco
This Woman's Online Heartbeat Will Make You Think About Big Data And The Quantified Self | Co.Exist | ideas + impact
"While we look for different ways to access our own data, whether it be in the form of wearables or different mobile apps, we also might not have control of how that information is being used. Lowe takes issue with the fact that a lot of conversations about those ethical standards are limited to the academy, or other silos of privilege. By posting her heartbeat online, Lowe feels she's taking ownership of her information in a small, but hopefully significant, way.

"I'm lucky that the Internet gives me the power to make some little thing that's different from the rest. If we're going to talk about alternative models of how we might interact with our data, it helps to have examples," she writes. "I'm really motivated by getting people to question what data we record, and who records it, and why.""
2014  jenlowe  data  bigdata  onehumanheartbeat 
april 2014 by robertogreco
Usman Haque: 'Messiness will inevitably arise' in spite of smart cities' (Wired UK)
"In the smart-city equivalent -- "Grub City" -- I see citizens mocking the homogenising of static urban data infrastructures and rejecting their bids to handle cities' "super wicked" messes through reductivist approaches to data. What we decide to measure, how we decide to measure, and why we decide to measure -- these questions are vital for Grub City citizens, who craft and perform data "badly" and "messily", because that enables invention unanticipated by planners. 

Grub City citizens recognise it's through the activity of measurement, not passive interpreting of data, that we understand our environment; that we build up intuitions about how we affect it; and through which we develop our own standards of evidence. It's the ensuing heterogeneity of understandings, explanations and attempts to control (as well as the heterogeneity of goals implied) that is essential for any sustainable model of city-making. New technologies help us do this not "better" but "differently". We will see contradictions, for even collaboration does not need consensus. But no matter what attempts are made to impose order and predictability on cities of the near future, a messiness will inevitably arise. 

Long live Grub City!"
usmanhaque  cities  smartcities  urbanism  bigdata  measurement  urban  data  messiness  grubcity  planning  unplanning 
july 2013 by robertogreco
Strata Summit 2011: Marc Goodman, "The Business Of Illegal Data..." - YouTube
"While businesses around the world struggle to understand how to profit from the information revolution, one class of enterprise has successfully mastered the challenge—international organized crime. Globally, crime groups are rapidly transforming themselves into consumers of big data. Lessons in how organized crime and terrorists are innovatively consuming both illegal and open source data will be presented."
data  bigdata  crime  organizedcrime  2011  via:timcarmody  information  marcgoodman  datamining 
july 2013 by robertogreco
You Can't Just Hack Your Way to Social Change - Jake Porway - Harvard Business Review
"Any data scientist worth their salary will tell you that you should start with a question, NOT the data. Unfortunately, data hackathons often lack clear problem definitions. Most companies think that if you can just get hackers, pizza, and data together in a room, magic will happen. This is the same as if Habitat for Humanity gathered its volunteers around a pile of wood and said, "Have at it!" By the end of the day you'd be left with half of a sunroom with 14 outlets in it."
data  bigdata  jakeporway  2013  hacking  hackathons  problemsolving  framing  questionasking  askingquestions 
march 2013 by robertogreco
How Facebook could get you arrested | Technology | The Observer
"The promise of predictive policing might be real, but so are its dangers. The solutionist impulse needs to be restrained. Police need to subject their algorithms to external scrutiny and address their biases. Social networking sites need to establish clear standards for how much predictive self-policing they'll actually do and how far they will go in profiling their users and sharing this data with police. While Facebook might be more effective than police in predicting crime, it cannot be allowed to take on these policing functions without also adhering to the same rules and regulations that spell out what police can and cannot do in a democracy. We cannot circumvent legal procedures and subvert democratic norms in the name of efficiency alone."
facebook  police  surveillance  ethics  bigdata  legal  law  democracy  justice  policing  solutionism  security  2013  evgenymorozov  socialnetworking  technology  internet  web 
march 2013 by robertogreco
a beginners guide to streamed data from Twitter (tecznotes)
"This is a brief guide on using the Twitter live streaming API, extracting useful data from it, and converting that data into a spreadsheet-ready text form, all using tools available on Mac OS X by default. There’s also a brief Python tutorial for scrubbing basic data buried in here someplace."
2012  michalmigurski  howto  bigdata  streaming  data  python  api  twitter 
september 2012 by robertogreco
Maps, Maps And MOAR Maps At The Society Of Cartographers And Expedia | Gary's Bloggage
"History has a habit of repeating itself and so does the map. From primitive scratchings, through ever more sumptuous pieces of art, through to authoritative geographical representations, the map changes throughout history. Maps speak of the hopes, dreams and prejudices of their creators and audience alike, and with the advent of neogeography and neocartography, maps are again as much art as they are geographical information.

... will that do?"
noaa  bigdata  data  exploration  aaronstraupcope  flickr  googlemaps  bingmaps  agi  osm  openstreetmap  yahoo  nokia  geography  stamen  mattbiddulph  garygale  2012  history  neocartography  mapping  maps 
september 2012 by robertogreco
Mapmaker, Artist, or Programmer? - Arts & Lifestyle - The Atlantic Cities
"Ultimately, almost everything I have been making tries to take the dim, distant glimpse of the real world that we can see through data and magnify some aspect of it in an attempt to understand something about the structure of cities," he says. "I don't know if that comes through at all in the actual products, but it is what they are all building toward."

The 39-year-old Fischer, who lives in Oakland, developed his cartographic interest while at the University of Chicago, when he came across the Windy City's 1937 local transportation plan. (It was a "clearly insane plan" to replace the transit system with a massive freeway network, he recalls.) Until a few weeks ago Fischer worked as a programmer at Google, gathering the data that guides his projects in his spare time.
twitter  flickr  exploratorium  chicago  sanfrancisco  transportation  dataviz  transit  bigdata  urbanism  urban  discovery  geolocation  geotagging  ericjaffe  cities  google  datavisualization  datavis  data  interviews  2012  mapping  maps  ericfischer 
august 2012 by robertogreco
Everything You've Heard About Failing Schools Is Wrong
"Overall, the last 10 years have revealed that while Big Data can make our questions more sophisticated, it doesn't necessarily lead to Big Answers. The push to improve scores has left behind traditional assessments that, research indicates, work better to gauge performance…

Even the godfather of standardized testing, the cognitive psychologist Robert Glaser [26], warned in 1987 about the dangers of placing too much emphasis on test scores. He called them "fallible and partial indicators of academic achievement" and warned that standardized tests would find it "extremely difficult to assess" the key skills people should gain from a good education: "resilience and courage in the face of stress, a sense of craft in our work, a commitment to justice and caring in our social relationships, a dedication to advancing the public good.""

"A look at Maria's schoolwork, on the other hand, is a glimpse at a learner's progress. …"
standardization  standards  commoncore  publicschools  history  tfa  wendykopp  billgates  michellerhee  latinos  immigration  learning  sanfrancisco  missionhigh  bigdata  education  policy  robertglaser  assessment  standardizedtesting  rttt  nclb  kristinarizga  2012  teachforamerica 
august 2012 by robertogreco
The Danger of Big Data: Social Media as Computational Social Science
"This paper begins with a consideration of the nature and risks of computational social science, followed by a focus on social media platforms as social science tools. We then discuss the aggregation of data and the expansion of computational social science along both horizontal and vertical axes. We consider the problems aggregation has raised in past social science research, as well as the potential problems raised by the use of social media as computational social science by business customers, government, platform providers and platform users; this discussion includes consideration of consumer protection, ethical codes, and civil liberty impacts. We end by highlighting the richness of social media data for computational social science research and the need to ensure this data is used ethically and the public is protected from abuse. The danger today is that computational social science is being used opaquely and near ubiquitously, without recognition or regard for the past debate on ethical social science experimentation."
digitalhumanities  bloggable  via:ayjay  bigdata  socialscience 
july 2012 by robertogreco
To Know, but Not Understand: David Weinberger on Science and Big Data - David Weinberger - Technology - The Atlantic
"Model-based knowing has many well-documented difficulties, especially when we are attempting to predict real-world events subject to the vagaries of history; a Cretaceous-era model of that era's ecology would not have included the arrival of a giant asteroid in its data, and no one expects a black swan. Nevertheless, models can have the predictive power demanded of scientific hypotheses. We have a new form of knowing.

This new knowledge requires not just giant computers but a network to connect them, to feed them, and to make their work accessible. It exists at the network level, not in the heads of individual human beings."
modeling  modelessinnovation  models  2012  understanding  technology  epistemology  davidweinberger  knowledge  complexity  bigdata  data  science 
january 2012 by robertogreco
Of Data Scientists, Big Data, the City and Dancers « Rev Dan Catt's Blog
"Lefebvre…talks about rhythm of cities…flow of people, morning coffee routine, lunchtime decisions…

How people shape the city, the pulse as agents gather together to form a temporary autonomous zone before collapsing back to being shaped by the city. To be not just in the city, but of the city.

I’m not a fan of cities…can’t design for cities as I don’t understand them.

Lefebvre’s writing suggests that to analyze a city you need to have been consum/mat/ed by it.

…same as Big Data. You can’t have someone who’s a “Data Scientist” just turn up & apply tools, clusters & statistics…haven’t been in-it enough…can’t have someone who’s w/in company, understands & feels flow of data everyday, unless…they know how to separate themselves…get outside. When people grow w/ a company, love it, understand everything that it could be, getting outside it is a hard won skill. “Scientist” needs to be able to remove self & apply clear analytical skill, but w/ fundamental understanding of subject."
revdancatt  cities  data  henrilefebvre  understanding  urban  urbanism  empathy  objectivity  bigdata  datascience  statistics  programming  2011 
june 2011 by robertogreco