
robertogreco : adelinekoh   2

Peripetatic Humanities - YouTube
"A lecture about Mark Sample's "Notes Toward a Deformed Humanities," featuring ideas by Lisa Rhody, Matt Kirschenbaum, Steve Ramsay, Barthes, Foucault, Bakhtin, Brian Croxall, Dene Grigar, Roger Whitson, Adeline Koh, Natalia Cecire, and Ian Bogost & the Oulipo, a band opening for The Carpenters."
kathiinmanberens  performance  humanities  deformity  marksample  lisarhody  mattkirchenbaum  steveramsay  foucault  briancroxall  denegrigar  rogerwhitson  adelinekoh  ianbogost  oulipo  deformance  humptydumpty  repair  mikhailbakhtin  linearity  alinear  procedure  books  defamiliarization  reading  howweread  machines  machinereading  technology  michelfoucault  rolandbarthes  nataliacecire  disruption  digitalhumanities  socialmedia  mobile  phones  making  computation  computing  hacking  nonlinear 
february 2018 by robertogreco
Can Algorithms Replace Your English Professor? — Who’s Afraid of Online Education? — Medium
"Algorithms are quickly becoming our new tastemakers and gatekeepers. Social media feeds are increasingly the most immediate source of news for many people, which means we are becoming more and more beholden to algorithms. Social media algorithms have been a popular topic of discussion lately, with people undertaking experiments on what happens when you “like” everything on Facebook, or when you refrain from “liking” anything. The Facebook algorithm is being held up as the primary reason why the #Ferguson protests are not showing up on users’ Facebook feeds, in comparison to Twitter, which is the only network that shows you what you choose to follow, rather than what its algorithm thinks you should. (Note that this may also be changing.)

Algorithms are becoming our curators. They show us—based on a secret, proprietary formula—what they think we want to see. In this experiment, Tim Herrera demonstrates that Facebook’s algorithm prefers to show its users older, more popular content than new content that has not been engaged with. Despite trying to consume his entire Facebook feed for an entire day, he realized that he only saw 29% of new content produced by his network—and that for most users, that percentage is probably a lot lower. On Facebook there isn’t a way to bypass this algorithm, even if you select “most recent” posts rather than “most popular” posts in your settings (interestingly enough, I’ve heard reports that Facebook tends to secretly reset your settings back to “most popular” no matter what you do).

There’s a lot of controversy over the power that we are giving algorithms to display and represent our world to us. But these critiques miss an important point: we’ve never not had curators and filters. Before we had algorithms, we had “experts”, “authorities”, tastemakers—we had (or have) professors and academics, we had (have) institutions that studied things and told us what was important or unimportant about the world, we had (have) editors and publishers who decided what was “good” enough to be shared with the world. But the importance and reliability of these authorities and tastemakers is coming under serious fire because of the impact of some social media; for example, in the reporting on Ferguson on major news networks versus Twitter. Furthermore, if you take the work of postcolonial studies critics like Edward Said seriously, much of our humanistic and scientific forms of research inquiry are hardly free of cultural prejudice, and are in fact informed and dictated by these modes of thinking.

Given all of this, I have two thoughts:

One. How is algorithmic selection actually similar to older modes of tastemaking and gatekeeping (i.e. experts and authorities who tell us what to value and what not to)? How is it different? Does either mode entertain the feedback of those whom they serve (i.e., can you help train an algorithm to show you more of what you want, or can you have an impact on your “experts” by having them study what you think is important?)

Two. A great deal of virtual ink has been spilled on whether educators are going to be replaced by online courses such as MOOCs. Less has been said, however, about the replacement of the tastemaking function of educators/researchers—especially in the humanities, where our goal has been to train students to find value in what they otherwise might not, to make legible to our students modes of seeing and doing which depart from their own. Can an algorithm replace that tastemaking function? Put another way: instead of having the “best” news and information filtered to you by “experts” (your teachers, your professors, editors and publishers, etc.), what happens when an algorithm starts taking over this process? Is this necessarily good, bad, or neither? And how similar is this filtering of information to previous modes of filtering? In other words—can an algorithm become smart enough to replace your English literature professor? And what would be the result of such a scenario?"
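[The feed behavior described in the excerpt—older, heavily engaged posts outranking brand-new ones—can be illustrated with a minimal sketch. This is a toy popularity-weighted ranker, not Facebook's actual algorithm: the Post fields, the weights, and the scoring formula are all invented for illustration.]

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    age_hours: float
    likes: int
    comments: int

def engagement_score(post: Post) -> float:
    """Toy score: accumulated engagement is weighted far more heavily
    than freshness, so an older, widely liked post outranks a brand-new
    post with no reactions. Weights are arbitrary, for illustration only."""
    engagement = post.likes + 2 * post.comments   # comments count double
    freshness = 1.0 / (1.0 + post.age_hours)      # decays toward zero
    return 0.9 * engagement + 0.1 * freshness

def rank_feed(posts: list[Post]) -> list[Post]:
    """Return posts ordered from highest to lowest score."""
    return sorted(posts, key=engagement_score, reverse=True)

# A two-day-old popular post beats a half-hour-old post with no engagement:
old_hit = Post("a", age_hours=48, likes=120, comments=30)
fresh = Post("b", age_hours=0.5, likes=0, comments=0)
feed = rank_feed([fresh, old_hit])
```

[Under this kind of scoring, a chronological "most recent" ordering and the algorithmic ordering diverge sharply—which is exactly the gap the excerpt describes.]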

[via (great thread follows): https://twitter.com/Jessifer/status/502632112261169152 ]
adelinekoh  2014  algorithms  facebook  twitter  education  curation  curators  gatekeepers  tastemakers  trendsetters  mooc  moocs  tastemaking  experts  authority  authorities  humanism  humanities  power  control  academia  highereducation  highered  feeds  filters 
august 2014 by robertogreco
