
jerryking : cathy_o’neil   14

When algorithms reinforce inequality
FEBRUARY 9, 2018 | FT | Gillian Tett.

Virginia Eubanks, a political science professor in New York, has focused her academic research on digital innovation and welfare claims. ... Last month, she published Automating Inequality, a book that explores how computers are changing the provision of welfare services in three US regions: Indiana, Los Angeles and Pittsburgh. It focuses on public sector services, rather than private healthcare insurance, but the message is the same: as institutions increasingly rely on predictive algorithms to make decisions, peculiar — and often unjust — outcomes are being produced. And while well-educated, middle-class people will often fight back, most poor or less educated people cannot; nor will they necessarily be aware of the hidden biases that penalise them. ... The result, Eubanks concludes, is that digital innovation is reinforcing, rather than reducing, inequality. ... What made the suffering doubly painful when the computer programs got it wrong was that the victims found it almost impossible to work out why the algorithms had gone against them, or to find a human caseworker to override the decision — and much of this could be attributed to a lack of resources. ... A similar pattern is described by the mathematician Cathy O’Neil in her book Weapons of Math Destruction. “Ill-conceived mathematical models now micromanage the economy, from advertising to prisons,” she writes. “They’re opaque, unquestioned and unaccountable and they ‘sort’, target or optimise millions of people . . . exacerbating inequality and hurting the poor.” ... Is there any solution? O’Neil and Eubanks suggest that one option would be to require technologists to sign something equivalent to the Hippocratic oath, to “first do no harm”. A second — more costly — idea would be to force institutions using algorithms to hire plenty of human caseworkers to supplement the digital decision-making.

A third idea would be to ensure that the people who create and run the computer programs are forced to think about culture, in its broadest sense. ... Until now, digital nerds at university have often had relatively little to do with social science nerds — and vice versa.

Computing has long been perceived to be a culture-free zone — this needs to change. But change will only occur when policymakers and voters understand the true scale of the problem. This is hard when we live in an era that likes to celebrate digitisation — and where the elites are usually shielded from the consequences of those algorithms.
Gillian_Tett  Cathy_O’Neil  algorithms  inequality  biases  books  dark_side  Pittsburgh  poverty  low-income 
february 2018 by jerryking
The Ivory Tower Can’t Keep Ignoring Tech
NOV. 14, 2017 | The New York Times | By Cathy O’Neil. O’Neil is a data scientist and author of the book “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.” Follow her on Twitter at @mathbabedotorg.

We urgently need an academic institute focused on algorithmic accountability.

First, it should provide comprehensive ethical training for future engineers and data scientists at the undergraduate and graduate levels, with case studies taken from real-world algorithms that are choosing winners and losers. Lecturers from humanities, social sciences and philosophy departments should weigh in.

Second, this academic institute should offer a series of workshops, conferences and clinics focused on the intersection of different industries with the world of A.I. and algorithms. These should include experts in the content areas, lawyers, policymakers, ethicists, journalists and data scientists, and they should be tasked with poking holes in our current regulatory framework — and imagining a more relevant one.

Third, the institute should convene a committee charged with reimagining the standards and ethics of human experimentation in the age of big data, in ways that can be adopted by the tech industry.

There’s a lot at stake when it comes to the growing role of algorithms in our lives. The good news is that a lot could be explained and clarified by professional and uncompromised thinkers who are protected within the walls of academia by the freedom of academic inquiry and expression. If only they would scrutinize the big tech firms rather than stand by waiting to be hired.
algorithms  accountability  Cathy_O’Neil  Colleges_&_Universities  data_scientists  ethics  inequality  think_tanks  Big_Tech 
november 2017 by jerryking
Algorithms Aren’t Biased, But the People Who Write Them May Be - WSJ
By JO CRAVEN MCGINTY
Oct. 14, 2016

A provocative new book called “Weapons of Math Destruction” has inspired some charged headlines. “Math Is Racist,” one asserts. “Math Is Biased Against Women and the Poor,” declares another.

But author Cathy O’Neil’s message is more subtle: Math isn’t biased. People are biased.

Dr. O’Neil, who received her Ph.D. in mathematics from Harvard, is a former Wall Street quant who quit after the housing crash, joined the Occupy Wall Street movement and now publishes the mathbabe blog.
algorithms  mathematics  biases  books  Cathy_O’Neil  Wall_Street  PhDs  quants  Occupy_Wall_Street  Harvard  value_judgements 
october 2016 by jerryking
Do we really want elite youth to get more elite? | mathbabe
December 16, 2013 Cathy O'Neil,

Finally, let me just take one last swipe at this idea from the perspective of “it’s meritocratic therefore it’s ok”. It’s just plain untrue that test-taking actually exposes talent. It’s well established that you can get better at these tests through practice, and that richer kids practice more. So the idea that we’re going to establish a level playing field and find minority kids to elevate this way is rubbish. If we do end up focusing more on the high end of test-takers, it will be completely dominated by the usual suspects.

In other words, this is a plan to make elite youth even more elite. And I don’t know about you, but my feeling is that’s not going to help our country overall.
education  PISA  elitism  meritocratic  Cathy_O’Neil  compounded  self-perpetuation  Matthew_effect  opportunity_gaps  privilege  high-end  cumulative  unfair_advantages 
december 2013 by jerryking
Minorities possibly unfairly disqualified from opening bank accounts | mathbabe
August 7, 2013 Cathy O'Neil,

New York State Attorney General Eric T. Schneiderman is investigating possibly unfair practices by big banks that use opaque and sometimes erroneous databases to disqualify people from opening accounts.

Not much hard information is given in the article, but we know that negative reports stemming from the databases have effectively banished more than a million lower-income Americans from the financial system, and we know that the number of “underbanked” people in this country has grown by 10% since 2009. Underbanked people are shut out of the normal banking system and have to rely on its underbelly, including check-cashing stores and payday lenders. ... The second, more interesting point – at least to me – is this: we care about, and defend ourselves against, having our constitutional rights taken away, but we have much less energy to defend ourselves against good things not happening to us.

In other words, it’s not written into the constitution that we all deserve a good checking account, nor a good college education, nor good terms on a mortgage, and so on. Even so, in a large society such as ours, such things are basic ingredients for a comfortable existence. Yet these services are rare if not nonexistent for a huge and swelling part of our society, resulting in a degradation of opportunity for the poor.

The overall effect is heinous, and at some point does seem to rise to the level of a constitutional right to opportunity, but I’m no lawyer.

In other words, instead of only worrying about the truly bad things that might happen to our vulnerable citizens, I personally spend just as much time worrying about the good things that might not happen to our vulnerable citizens, because from my perspective lots of good things not happening add up to bad things happening: they all narrow future options.
visible_minorities  discrimination  data  data_scientists  banks  banking  unbanked  equality  equality_of_opportunity  financial_system  constitutional_rights  payday_lenders  Cathy_O’Neil  optionality  opportunity_gaps  low-income 
december 2013 by jerryking
Bill Gates is naive, data is not objective | mathbabe
January 29, 2013 Cathy O'Neil,

Don’t be fooled by the mathematical imprimatur: behind every model and every data set is a political process that chose that data and built that model and defined success for that model.
billgates  naivete  data  Cathy_O’Neil  value_judgements  datasets  biases 
december 2013 by jerryking
Open data is not a panacea | mathbabe
December 29, 2012 Cathy O'Neil,
And it’s not just about speed. You can have hugely important, rich, and large data sets sitting in a lump on a publicly available website like Wikipedia, and if you don’t have fancy parsing tools and algorithms, you’re not going to be able to make use of them.

When important data goes public, the edge goes to the most sophisticated data engineer, not the general public. The Goldman Sachses of the world will always know how to make use of “freely available to everyone” data before the average guy.

Which brings me to my second point about open data. It’s general wisdom that we should hope for the best but prepare for the worst. My feeling is that as we move towards open data we are doing plenty of the hoping part but not enough of the preparing part.

If there’s one thing I learned working in finance, it’s not to be naive about how information will be used. You’ve got to learn to think like an asshole to really see what to worry about. It’s a skill which I don’t regret having.

So, if you’re giving me information on where public schools need help, I’m going to imagine using that information to cut off credit for people who live nearby. If you tell me where environmental complaints are being served, I’m going to draw a map and see where they aren’t being served so I can take my questionable business practices there.
open_data  unintended_consequences  preparation  skepticism  naivete  no_regrets  Goldman_Sachs  tools  algorithms  Cathy_O’Neil  thinking_tragically  slight_edge  sophisticated  unfair_advantages  smart_people  data_scientists  gaming_the_system  dark_side 
december 2013 by jerryking
