
laurakalbag : algorithms (49)

Another tax on the poor: Surrendering privacy for survival
Undocumented immigrants, day laborers, homeless people, and those with criminal convictions suffer from another data extreme: living beyond the reach of the data collection systems needed to thrive in society, they gain so much “privacy” that they become increasingly invisible. Living in this surveillance gap can be as damaging as living under constant surveillance, and is often a reaction to it.
systemicdiscrimination  privacy  algorithms  indie  radar 
march 2019 by laurakalbag
Study finds a potential risk with self-driving cars: failure to detect dark-skinned pedestrians - Vox
In addition to worrying about how safe they are, how they’d handle tricky moral trade-offs on the road, and how they might make traffic worse, we also need to worry about how they could harm people of color.

If you’re a person with dark skin, you may be more likely than your white friends to get hit by a self-driving car, according to a new study out of the Georgia Institute of Technology. That’s because automated vehicles may be better at detecting pedestrians with lighter skin tones.
algorithms  ai  recognition  selfdrivingcars  indie  radar 
march 2019 by laurakalbag
Don’t look now: why you should be worried about machines reading your emotions | Technology | The Guardian
But it is precisely this type of algorithmic judgment based on markers like ethnicity that worries Whittaker most about emotion detection technology, suggesting a future of automated physiognomy. In fact, there are already companies offering predictions for how likely someone is to become a terrorist or pedophile, as well as researchers claiming to have algorithms that can detect sexuality from the face alone.

Several studies have also recently shown that facial recognition technologies reproduce biases that are more likely to harm minority communities.
algorithms  facialrecognition  emotion  machinelearning  surveillance  racism 
march 2019 by laurakalbag
Happy 21st Century! - Charlie's Diary
Shorter version is: there will be much dying: even more so than during the worst conflicts of the 20th century. But rather than conventional wars ("nation vs nation") it'll be "us vs them", where "us" and "them" will be defined by whichever dehumanized enemy your network filter bubble points you at—Orwell was ahead of the game with the Two Minute Hate, something with which all of us who use social media are now uncomfortably, intimately, familiar.
future  dystopia  algorithms  ai  politics 
june 2018 by laurakalbag
Palantir: the ‘special ops’ tech giant that wields as much real-world power as Google | World news | The Guardian
Data merely becomes a new way of reinforcing old prejudices. Critics of these analytics argue that from the moment a police officer with the pre-crime mindset that you are a criminal steps out of their patrol car to confront you, your fate has been sealed.
palantir  surveillance  bigdata  algorithms  analytics  indie  radar 
july 2017 by laurakalbag
This Is How Your Fear and Outrage Are Being Sold for Profit
The digital rabbit hole you just tumbled down is funded by advertising, aimed at you. Almost every “free” app or service you use depends on this surreptitious process of unconsciously turning your eyeballs into dollars, and they have built sophisticated methods of reliably doing it. You don’t pay money for using these platforms, but make no mistake, you are paying for them — with your time, your attention, and your perspective.
media  press  clickbait  journalism  algorithms 
july 2017 by laurakalbag
Facebook wants to own our news and communications. Here's why it has to fail.
“As a society, we tend to raise a skeptical eyebrow at nations with state-controlled media. But a world where one major entity controls the distribution of news, not just nationally but internationally, is arguably more harmful. I don't believe that Facebook as a company has nefarious intent here, but the precedent is worrying. News and information are important to democracy; informed voters are better voters. The implications of a single algorithm controlling how a population gets its news goes far beyond a web company trying to increase its market share.” By Ben Werdmüller
facebook  news  algorithms  influence  indie  indieroundup  27mar2015 
march 2015 by laurakalbag
An Important Notice from Amazon
“Dear Amazon Preferred Customer,

As you know, twelve years ago you registered for 1-Click, entrusting to us your name, address, and credit-card information. In return, you experienced the convenience of purchasing almost any item that we sell with a single mouse-click. The savings that you have enjoyed, in terms of clicks, have been substantial.” Satire by Ellis Weiner
indieroundup  indie  amazon  algorithms  tracking  corporatesurveillance  notincluded 
march 2015 by laurakalbag
The Politics of Trending
“Through this lens, trending is merely visibility granted by the algorithms of a closed, private corporation… Indeed, Twitter is not designed as public even as it fundamentally derives from public input and data, and parades as common grounds.” By Eunsong Kim on Model View Culture
trending  trends  algorithms  politics  privatisation  publicspaces  indie  indieroundup  20mar15 
march 2015 by laurakalbag
Computer-based personality judgments are more accurate than those made by humans
“This study compares the accuracy of personality judgment—a ubiquitous and important social-cognitive activity—between computer models and humans. Using several criteria, we show that computers’ judgments of people’s personalities based on their digital footprints are more accurate and valid than judgments made by their close others or acquaintances (friends, family, spouse, colleagues, etc.). Our findings highlight that people’s personalities can be predicted automatically and without involving human social-cognitive skills.” Scientific study
study  algorithms  facebook  likes  bigdata  indie  corporatesurveillance 
march 2015 by laurakalbag
Facebook knows you better than members of your own family
“Cambridge University researchers have developed software based on Facebook 'likes' which can predict human personality types better than close family members” By Sarah Knapton on The Telegraph
study  facebook  likes  bigdata  algorithms  indie  corporatesurveillance 
march 2015 by laurakalbag
Your Facebook ‘Likes’ May Be More Revealing Than You Think
“A study shows that what you 'like' on Facebook can predict, with remarkable accuracy, everything from your race to your sexual orientation, political affiliation and personality type.” By Maia Szalavitz
indie  algorithms  bigdata  corporatesurveillance  tracking  facebook  likes 
march 2015 by laurakalbag
Facebook Likes help algorithms get to know you like family
“Switch over to present day reality, where a new study published today in the journal PNAS, reveals that by mining Facebook Likes, computers can suss out your personality traits better than your nearest and dearest. Not quite an operating systems love story, but surreal enough.” By Emiko Jozuka on Wired UK
algorithms  facebook  bigdata  corporatesurveillance  tracking  indie 
march 2015 by laurakalbag
How algorithms shape our world
“We live in a world run by algorithms, computer programs that make decisions or solve problems for us. In this riveting, funny talk, Kevin Slavin shows how modern algorithms determine stock prices, espionage tactics, even the movies you watch. But, he asks: If we depend on complex algorithms to manage our daily decisions — when do we start to lose control?” TED Talk by Kevin Slavin
data  bigdata  algorithms  indie  indieroundup  20mar15 
march 2015 by laurakalbag
Is Differential Privacy practical?
“Energy data might not be the first thing that comes to mind when you think of privacy concerns. Yet, it is well understood that smart meter data can be highly revealing. There is a large body of research devoted to drawing inferences from smart meter readings.” By Moritz Hardt on Moody Rd
privacy  bigdata  data  indie  indieroundup  13mar15  algorithms  statistics 
march 2015 by laurakalbag
How big data is unfair
“I don’t mean to suggest that machine learning is inevitably unfair, but rather that there are powerful forces that can render decision making that depends on learning algorithms unfair.” By Moritz Hardt on Medium
bigdata  data  algorithms  diversity  indie  indieroundup  13mar15 
march 2015 by laurakalbag
