
robertogreco : sivavaidhyanathan   3

Aeon Ideas - Siva Vaidhyanathan on Has "innovation"...
"Progress was such a strong part of 18th century Enlightenment thought that the drafters of the U.S. Constitution instructed Congress “To promote the Progress of Science and useful Arts” via copyright and patent law. Today, those who advocate stronger intellectual property protection do so in the name of “innovation,” as do advocates of weaker intellectual property protection. Neither side argues for progress.

Progress is out-of-fashion. It’s almost embarrassing to invoke it. The grand mega-projects of the 20th century – dams, highways, national parks, public universities, and the occasional concentration camp – seem as distant from our current collective imaginations as pyramids and the Taj Mahal. The 1893 World’s Fair in Chicago was devoted to marking the technological progress since Columbus landed in the New World, and its grand exhibit featured the power of electricity. The 1939 World’s Fair in New York described “the World of Tomorrow.” And the 1964 World’s Fair in New York boasted of the imminent coming of world peace through “Man’s Achievement on a Shrinking Globe in an Expanding Universe.” We are still waiting for that."



"Innovation differs from progress in many ways. Innovation lacks a normative claim of significant betterment. It emerges from many small moves rather than grand, top-down schemes. Innovation does not contain an implication of a grand path or a grand design of a knowable future. It makes no claim on the future, except that it always exists in that future, just out of reach of the now. And innovation always seems to come from the distributed commercial world rather than from grand, planned policies engineered from a strong central state. States are now encouraged to innovate rather than solve big problems or correct for market failures. The ultimate goal of innovation seems to be more innovation.

“Innovation” is everywhere today. You can’t peruse a copy of the Harvard Business Review or Inc. Magazine without sliding by multiple uses of “innovation.” Universities large and small boast of new “innovation centers” and programs devoted to unleashing “innovation” in their students as well as their libraries and laboratories. Everyone is expected to innovate. Those who raise questions about the wisdom of a policy or technology are quickly dismissed as anti-innovation.

The use of “innovation” in published books, as measured by the Google NGrams project, surged in 1994, just as the Internet entered public life and the dot-com boom started. Harvard Business Professor Clayton Christensen’s use of the term “disruptive innovation” has dominated debates about management in both public and private sectors since the 1997 publication of his book The Innovator’s Dilemma. Even though historian Jill Lepore dismantled the poor logic and methodology of Christensen’s book in a 2014 New Yorker article, its influence has not waned. Innovation, especially the disruptive kind, has become a religious concept, immune to criticism."



"Modesty is a virtue, of course. And in many ways the focus on innovation instead of progress is refreshing. A belief that progress is definable and inevitable became hard to maintain by the turn of the 21st century as we confronted the efficient dehumanizing brutality of slavery, the Gulag, the Killing Fields, and the Final Solution.

Like innovation, progress could be the locus of debate for two completely opposed policies. Historian David Brion Davis has written that appeals to progress were used by both advocates of expanding slavery and those who fought it. Davis identifies three aspects of the ideology of progress: a belief that historical change is not governed by chance but that historical events mesh in a meaningful pattern that reflects the principles of the natural sciences; a belief that the overall trajectory of history bends toward the better; and a prediction that the future will necessarily improve on the past, either through faith in human reason or divine providence.

Ecology has dealt perhaps the strongest blow to the ideology of progress. Ecological historians have charted the ways that humans have burned through their habitat, rendering much of it denatured and fragile, and improving crop yields through fossil-fuel-burning machinery and fossil-fuel-based fertilizers. So progress for some, such as an escape from the Malthusian trap of population outstripping food production, could mean devastation for all at some later date as the entire agricultural system collapses like an abandoned coal mine.

So perhaps the small-bore appeals to innovation are slightly preferable to the grand boasts of progress. But innovation can’t be our goal as a species. We actually do face grand problems like climate change, widespread abuse of people, threats to political liberty, and communicable diseases. It’s not unhealthy or even intellectually invalid to agree that we can and should make progress (with a very small “p”) against Ebola, HIV, rape, racism, and the melting of the polar ice caps.

Somewhere between the tiny vision of innovation and the arrogance of grand progress lies a vision of collective destiny and confidence that with the right investments, a strong consensus, and patience we can generate a more just and stable world. We passed right by that point in the rush from the 20th to the 21st centuries. We must reclaim it."
innovation  progress  via:anne  2015  sivavaidhyanathan 
june 2015 by robertogreco
We need to ditch generational labels – Rebecca Onion – Aeon
"Generational thinking is seductive and confirms preconceived prejudices, but it’s a bogus way to understand the world"



"But in real life, I find generational arguments infuriating. Overly schematised and ridiculously reductive, generation theory is a simplistic way of thinking about the relationship between individuals, society, and history. It encourages us to focus on vague ‘generational personalities’, rather than looking at the confusing diversity of social life. Since I’m a ‘Gen-X’er born in 1977, the conventional wisdom is that I’m supposed to be adaptable, independent, productive, and to have a good work/life balance. Reading these characteristics feels like browsing a horoscope. I see myself in some of these traits, and can even feel a vague thrill of belonging when I read them. But my ‘boomer’ mother is intensely productive; my ‘Greatest Generation’ grandmother still sells old books online at age 90, in what I consider to be the ultimate show of adaptability and independence.

Generational thinking doesn’t frustrate everyone. Indeed, there is a healthy market for pundits who can devise grand theories of generational difference. Neil Howe and William Strauss, authors of Generations: The History of America’s Future, 1584-2069 (1991) and founders of the consulting firm LifeCourse Associates in Virginia, have made a fine living out of generational assessments, but their work reads like a deeply mystical form of historical explanation. (Strauss died in 2007; Howe continues to run the consultancy LifeCourse.) The two have conceived an elaborate and totalising theory of the cycle of generations, which they argue come in four sequential and endlessly repeating archetypes.

In the Strauss-Howe schema, these distinct groups of archetypes follow each other throughout history thus: ‘prophets’ are born near the end of a ‘crisis’; ‘nomads’ are born during an ‘awakening’; ‘heroes’ are born after an ‘awakening’, during an ‘unravelling’; and ‘artists’ are born after an ‘unravelling’, during a ‘crisis’. Strauss and Howe select prominent individuals from each generation, pointing to characteristics that define them as archetypal – heroes are John F Kennedy and Ronald Reagan; artists: Theodore Roosevelt, Woodrow Wilson; prophets: John Winthrop, Abraham Lincoln; nomads: John Adams, Ulysses Grant. Each generation has a common set of personal characteristics and typical life experiences.


The archetypal scheme is also a theory of how historical change happens. The LifeCourse idea is that the predominance of each archetype in a given generation triggers the advent of the next (as the consultancy’s website puts it: ‘each youth generation tries to correct or compensate for what it perceives as the excesses of the midlife generation in power’). Besides having a very reductive vision of the universality of human nature, Strauss and Howe are futurists; they predict that a major crisis will occur once every 80 years, restarting the generational cycle. While the pair’s ideas seem far-fetched, they have currency in the marketplace: LifeCourse Associates has consulted for brands such as Nike, Cartoon Network, Viacom and the Ford Motor Company; for universities including Arizona State, Dartmouth, Georgetown and the University of Texas, and for the US Army, too.

The commercial success of this pseudoscientific mumbo-jumbo is irritating, but also troubling. The dominant US thinkers on the generational question tend to flatten social distinctions, relying on cherry-picked examples and reifying a vision of a ‘society’ that’s made up mostly of the white and middle-class. In an article in The Chronicle of Higher Education in 2009 on the pundits and consultants who market information about ‘millennials’ to universities, Eric Hoover described Howe and Strauss’s influential book about that generation, Millennials Rising: The Next Great Generation (2000), as a work ‘based on a hodgepodge of anecdotes, statistics, and pop-culture references’ with the only new empirical evidence being a body of around 600 interviews of high-school seniors, all living in wealthy Fairfax County, Virginia.

Hoover interviewed several people in higher education who voiced their doubts about the utility of Howe and Strauss’s approach. Their replies, informed by their experience teaching college students from across the socioeconomic spectrum, show how useless the schematic understanding of ‘millennials’ looks when you’re working with actual people. Palmer H Muntz, then the director of admissions of Lincoln Christian University in Illinois, noticed that plenty of kids he encountered on visits to less-privileged schools weren’t intensely worried about grades or planning, like the stereotypical millennial. Fred A Bonner II, now at Prairie View A & M University in Texas, pointed out that many of the supposed ‘personality traits’ of coddled and pressured millennials were unrecognisable to his black or Hispanic students, or those who grew up with less money. Siva Vaidhyanathan, a cultural historian and media scholar at the University of Virginia, told Hoover: ‘Generational thinking is just a benign form of bigotry.’"



"Ryder had harsh words for the theorists he called ‘generationists’. He argued that thinkers about generation on a large scale had made illogical leaps when theorising the relationship between generations and social change. ‘The fact that social change produces intercohort differentiation and thus contributes to inter-generational conflict,’ he argued, ‘cannot justify a theory that social change is produced by that conflict.’ There was no way to prove causality. The end result, he wrote, was that grand generational theories tended toward ‘arithmetical mysticism.’"



"As the French historian Pierre Nora wrote in 1996, the careful analyst trying to talk about generations will always struggle: ‘The generational concept would make a wonderfully precise instrument if only its precision didn’t make it impossible to apply to the unclassifiable disorder of reality.’ The problem with transferring historical and sociological ways of thinking about generational change into the public sphere is that ‘unclassifiability’ is both terrifying and boring. Big, sweeping explanations of social change sell. Little, careful studies of same-age cohorts, hemmed in on all sides by rich specificity, do not.

Perhaps the pseudoscientific use of supposed ‘generations’ would irk less if it weren’t so often used to demean the young. Millennials, consultants advise prospective employers, feel entitled to good treatment even in entry-level jobs, because they’ve been overpraised their whole lives. Millennials won’t buckle down and buy cars or houses, economists complain; millennials are lurking in their parents’ basements, The New Yorker cartoon stereotype runs, tweeting and texting and posting selfies and avoiding responsibility."



"Popular millennial backlash against the stereotyping of their generation makes use of the same arguments against generational thinking that sociologists and historians have spent years developing. By drawing attention to the effects of the economic situation on their lives, pointing out that human experience isn’t universal and predictable, and calling upon adults to abandon broad assessments in favour of specific understanding, millennials prove the point: generational thinking is seductive, and for some of us it confirms our preconceived prejudices, but it’s fatally flawed as a mode of understanding the world. Real life is not science fiction."
rebeccaonion  generationalthinking  generations  age  ageism  complexity  humans  society  adaptability  independence  history  individuals  neilhowe  williamstrauss  stereotypes  lifecourse  palmermuntz  sivavaidhyanathan  agesegregation  millenials  genx  generationx  generationy  erichoover  karlmannheim  augustecomte  gottfriedleibniz  normanryder  sociology  causality  robertwohl  pierrenora  bigotry  generationalwarfare  malcolmharris  digitalnatives  hypocrisy  via:ayjay 
may 2015 by robertogreco
Education’s war on millennials: Why everyone is failing the “digital generation” - Salon.com
"Both reformers and traditionalists view technology as a way to control students — and they're getting it very wrong"



"In addressing the hundreds of thousands who watch such videos, students aren’t the only ones in the implied audience. These videos appeal to many nonacademic viewers who enjoy watching, from a remove, the hacking of obstreperous or powerful systems as demonstrated in videos about, for instance, fooling electronic voting booths, hacking vending machines, opening locked cars with tennis balls, or smuggling contraband goods through airport x-ray devices. These cheating videos also belonged to a broader category of YouTube videos for do-it-yourself (DIY) enthusiasts— those who liked to see step-by-step execution of a project from start to finish. YouTube videos about crafts, cooking, carpentry, decorating, computer programming, and installing consumer technologies all follow this same basic format, and popular magazines like Make have capitalized on this sub-culture of avid project-based participants. Although these cultural practices may seem like a relatively new trend, one could look at DIY culture as part of a longer tradition of exercises devoted to imitatio, or the art of copying master works, which have been central to instruction for centuries."



"Prior to the release of this report, Mia Consalvo had argued that cheating in video games is expected behavior among players and that cheaters perform important epistemological work by sharing information about easy solutions on message boards, forums, and other venues for collaborations.

Consalvo also builds on the work of literacy theorist James Paul Gee, who asserts that video game narratives often require transgression to gain knowledge and that, just as passive obedience rarely produces insight in real classrooms, testing boundaries by disobeying the instructions of authority figures can be the best way to learn. Because procedural culture is ubiquitous, however, Ian Bogost has insisted that defying rules and confronting the persuasive powers of certain architectures of control only brings other kinds of rules into play, since we can never really get outside of ideology and act as truly free agents, even when supposedly gaming the system.

Ironically, more traditional ideas about fair play might block key paths to upward mobility and success in certain high-tech careers. For example, Betsy DiSalvo and Amy Bruckman, who have studied Atlanta-area African-American teens involved in service learning projects with game companies, argue that the conflict between the students’ own beliefs in straightforward behavior and the ideologies of hacker culture makes participation in the informal gateway activities for computer science less likely. Thus, urban youth who believe in tests of physical prowess, basketball-court egalitarianism, and a certain paradigm of conventional black masculinity that is coded as no-nonsense or—as Fox Harrell says—“solid” might be less likely to take part in forms of “geeking out” that involve subverting a given set of rules. Similarly, Tracy Fullerton has argued that teenagers from families unfamiliar with the norms of higher education may also be hobbled by their reluctance to “strategize” more opportunistically about college admissions. Fullerton’s game “Pathfinder” is intended to help such students learn to game the system by literally learning to play a game about how listing the right kinds of high-status courses and extracurricular activities will gain them social capital with colleges."



"However, Gee would later argue in “The Anti-Education Era” that gamesmanship that enables universal access and personal privilege may actually be extremely counterproductive. Hacks that “make the game easier or advantage the player” can “undermine the game’s design and even ruin the game by making it too easy.” Furthermore, “perfecting the human urge to optimize” can go too far and lead to fatal consequences on a planet where resources can be exhausted too quickly and weaknesses can be exploited too frequently. Furthermore, Gee warns that educational systems that focus on individual optimization create cultures of “impoverished humans” in which learners never “confront challenge and frustration,” “acquire new styles of learning,” or “face failure squarely.”"



"What’s striking about the ABC coverage is that it lacked any of the criticism of the educational status quo that became so central for a number of readers of the earlier Chronicle of Higher Education story—those who were asking as educators either (1) what’s wrong with the higher education system that students can subvert conventional tests so easily, or (2) what’s right with YouTube culture that encourages participation, creativity, institutional subversion, and satire."



"This attitude reflects current research on so-called distributed cognition and how external markers can help humans to problem solve by both making solutions clearer and freeing up working memory that would otherwise be tied up in reciting basic reminders. Many of those commenting on the article also argued that secrecy did little to promote learning, a philosophy shared by Benjamin Bratton, head of the Center for Design and Geopolitics, who actually hands out the full text of his final examination on the first day of class so that students know exactly what they will be tested on."



"This book explores the assumption that digital media deeply divide students and teachers and that a once covert war between “us” and “them” has turned into an open battle between “our” technologies and “their” technologies. On one side, we—the faculty—seem to control course management systems, online quizzes, wireless clickers, Internet access to PowerPoint slides and podcasts, and plagiarism-detection software. On the student side, they are armed with smart phones, laptops, music players, digital cameras, and social network sites. They seem to be the masters of these ubiquitous computing and recording technologies that can serve as advanced weapons allowing either escape to virtual or social realities far away from the lecture hall or—should they choose to document and broadcast the foibles of their faculty—exposure of that lecture hall to the outside world.

Each side is not really fighting the other, I argue, because both appear to be conducting an incredibly destructive war on learning itself by emphasizing competition and conflict rather than cooperation. I see problems both with using technologies to command and control young people into submission and with the utopian claims of advocates for DIY education, or “unschooling,” who embrace a libertarian politics of each-one-for-himself or herself pedagogy and who, in the interest of promoting totally autonomous learning in individual private homes, seek to defund public institutions devoted to traditional learning collectives. Effective educators should be noncombatants, I am claiming, neither champions of the reactionary past nor of the radical future. In making the argument for becoming a conscientious objector in this war on learning, I am focusing on the present moment.

Both sides in the war on learning are also promoting a particular causal argument about technology of which I am deeply suspicious. Both groups believe that the present rupture between student and professor is caused by the advent of a unique digital generation that is assumed to be quite technically proficient at navigating computational media without formal instruction and that is likely to prefer digital activities to the reading of print texts. I’ve been a public opponent of casting students too easily as “digital natives” for a number of reasons. Of course, anthropology and sociology already supply a host of arguments against assuming preconceived ideas about what it means to be a native when studying group behavior.

I am particularly suspicious of this type of language about so-called digital natives because it could naturalize cultural practices, further a colonial othering of the young, and oversimplify complicated questions about membership in a group. Furthermore, as someone who has been involved with digital literacy (and now digital fluency) for most of my academic career, I have seen firsthand how many students have serious problems with writing computer programs and how difficult it can be to establish priorities among educators—particularly educators from different disciplines or research tracks—when diverse populations of learners need to be served."



"Notice not only how engagement and interactivity are praised and conflated, but also how the rhetoric of novelty in consumer electronics and of short attention spans also comes into play."
education  technology  edtech  control  reform  policy  power  2014  traditionalism  traditionalists  plagiarism  pedagogy  learning  schools  cheating  multitasking  highered  highereducation  politics  elizabethlosh  mimiito  ianbogost  jamespaulgee  homago  betsydisalvo  amybruckman  foxharrell  geekingout  culture  play  constraints  games  gaming  videogames  mckenziewark  janemcgonigal  gamesmanship  internet  youtube  secrecy  benjaminbratton  unschooling  deschooling  collaboration  cooperation  agesegregation  youth  teens  digitalnatives  marshallmcluhan  othering  sivavaidhyanathan  digital  digitalliteracy  attention  engagement  entertainment  focus  cathydavidson 
june 2014 by robertogreco
