
charlesarthur : race   8

I was reported to police as an agitated black male — for simply walking to work • Medium
Reginald Andrade:
<p>on September 14, campus police were waiting for me when I arrived at the reception desk at Whitmore. I had no idea why but I knew it couldn’t be good. My heart started pounding.

Two university detectives sat me down in an office and closed the door. Bewildered, I asked what was happening. They refused to answer, as they peppered me with questions.
“What time did you wake up?” “What were you doing at the campus recreation center?” “Did you come into the building agitated?” I felt confused, powerless, and scared, but made sure to maintain my composure. I remembered that even unarmed Black people disproportionately get killed during police encounters, and it was incumbent on me as an innocent Black man to show that I wasn’t a threat. It wasn’t until the end of their interrogation that they revealed why I was being questioned.

Someone had called the university’s anonymous tip line, reporting that they had seen an “agitated Black male” who was carrying “a heavy backpack that is almost hitting the ground” as he approached the Whitmore Administration Building. I — the “agitated Black male” — apparently posed such a threat that police put the entire building on lockdown for half an hour.

I have no idea how the caller came to the conclusion that I was “agitated,” considering they hadn’t interacted with me. I do know that Black people are often stereotyped as angry, armed, or dangerous.

I’ve had to answer to the police before for being a Black man at UMass Amherst.</p>

Sometimes America's problems feel intractable. Another story going around on Thursday: "Georgia woman calls police on black man babysitting white kids: Corey Lewis, who runs a youth mentoring program, was followed by a white woman from a Walmart to his mother's home."
race  america 
october 2018 by charlesarthur
AI facial recognition works better for white skin - because it's being trained that way • World Economic Forum
Larry Hardesty:
<p>Three commercially released facial-analysis programs from major technology companies demonstrate both skin-type and gender biases, according to a new paper that researchers from MIT and Stanford University will present later this month at the Conference on Fairness, Accountability, and Transparency.

In the researchers’ experiments, the three programs’ error rates in determining the gender of light-skinned men were never worse than 0.8%. For darker-skinned women, however, the error rates ballooned — to more than 20% in one case and more than 34% in the other two.

The findings raise questions about how today’s neural networks, which learn to perform computational tasks by looking for patterns in huge data sets, are trained and evaluated. For instance, according to the paper, researchers at a major US technology company claimed an accuracy rate of more than 97% for a face-recognition system they’d designed. But the data set used to assess its performance was more than 77% male and more than 83% white.

“What’s really important here is the method and how that method applies to other applications,” says Joy Buolamwini, a researcher in the MIT Media Lab’s Civic Media group and first author on the new paper. “The same data-centric techniques that can be used to try to determine somebody’s gender are also used to identify a person when you’re looking for a criminal suspect or to unlock your phone. And it’s not just about computer vision. I’m really hopeful that this will spur more work into looking at [other] disparities.”</p>

Would love to know which big American company that was.
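The core of the paper's method is simple: instead of reporting one aggregate accuracy figure over a skewed test set, break the error rate out per demographic subgroup. A minimal sketch of that disaggregated evaluation, using invented classifier results whose error rates roughly echo the disparity the paper reports (the group labels and numbers here are illustrative, not the paper's data):

```python
# Disaggregated error-rate evaluation: compute the error rate separately
# for each subgroup rather than one overall number. All data is invented.
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, predicted, actual) tuples."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical gender-classifier output illustrating the skew:
results = (
    [("lighter-skinned men", "M", "M")] * 996
    + [("lighter-skinned men", "F", "M")] * 4      # 0.4% error
    + [("darker-skinned women", "F", "F")] * 660
    + [("darker-skinned women", "M", "F")] * 340   # 34% error
)

for group, rate in error_rates_by_group(results).items():
    print(f"{group}: {rate:.1%}")
```

A system evaluated only on the aggregate of these two groups would look far more accurate than it is for the worst-served group - which is exactly the 97%-on-a-77%-male-test-set problem the paper flags.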
race  ai  gender  facialrecognition 
february 2018 by charlesarthur
Black people in tech are still paid less than white people, according to Hired • TechCrunch
Megan Rose Dickey:
<p>Pay discrimination and discrepancies based on gender and race are nothing new. Unfortunately, it seems that little has changed over the years.

In the tech industry, white people on average make $136,000 a year, about $6,000 more than black people with the same level of expertise earn. It also turns out white tech workers ask for more money, according to Hired’s data, which is based on its marketplace of over 69,000 people and 10,000 companies.

“The racial gap may be partially a result of black and hispanic tech workers undervaluing their skills, which is symptom of being underpaid in previous roles,” Hired CEO Mehul Patel said in a blog post. “Black and hispanic candidates on the Hired platform set their preferred salaries lowest ($124K). Ultimately though, Hispanic candidates are offered $1K more than their black counterparts. For comparison, white tech workers ask for an average of $130K and Asian tech workers ask for an average of $127K.”

It also turns out people who identify as multiracial receive less than people who identify as one race.</p>
race  discrimination 
february 2018 by charlesarthur
Want to fix gun violence in America? Go local • The Guardian
Aliza Aufrichtig, Lois Beckett, Jan Diehm and Jamiles Lartey:
<p>Half of America's gun homicides in 2015 were clustered in just 127 cities and towns, according to a new geographic analysis by the Guardian, even though they contain less than a quarter of the nation’s population.

Even within those cities, violence is further concentrated in the tiny neighborhood areas that saw two or more gun homicide incidents in a single year.

<img src="https://interactive.guim.co.uk/atoms/2016/12/local-guns/v/1484059933/files/images/USmap-Artboard_2.png" width="100%" />

Four and a half million Americans live in areas of these cities with the highest numbers of gun homicide, which are marked by intense poverty, low levels of education, and racial segregation. Geographically, these neighborhood areas are small: a total of about 1,200 neighborhood census tracts, which, laid side by side, would fit into an area just 42 miles wide by 42 miles long.

The problem they face is devastating. Though these neighborhood areas contain just 1.5% of the country’s population, they saw 26% of America’s total gun homicides.

Gun control advocates say it is unacceptable that Americans overall are "25 times more likely to be murdered with a gun than people in other developed countries". People who live in these neighborhood areas face an average gun homicide rate about 400 times higher than the rate across those high-income countries.</p>


Amazing piece of data journalism, digging down to the neighbourhood level: gun murder is far more common where poverty, lack of education and racial segregation are high.
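The arithmetic behind the Guardian's headline numbers is a straightforward concentration calculation: rank census tracts by homicide count, take the top slice, and compare its share of homicides with its share of population. A sketch with invented toy data (the real analysis used ~1,200 tracts out of tens of thousands):

```python
# Concentration calculation: how much of the national total do the
# highest-homicide tracts account for, versus their population share?
# The tract data below is invented for illustration.

def concentration(tracts, top_n):
    """tracts: list of (population, homicides) tuples.
    Returns (pop_share, homicide_share) for the top_n tracts
    ranked by homicide count."""
    ranked = sorted(tracts, key=lambda t: t[1], reverse=True)
    top = ranked[:top_n]
    total_pop = sum(p for p, _ in tracts)
    total_hom = sum(h for _, h in tracts)
    pop_share = sum(p for p, _ in top) / total_pop
    hom_share = sum(h for _, h in top) / total_hom
    return pop_share, hom_share

# Toy data: 100 tracts of 4,000 people each; 5 tracts hold most of the violence.
tracts = [(4000, 12)] * 5 + [(4000, 1)] * 20 + [(4000, 0)] * 75
pop_share, hom_share = concentration(tracts, 5)
print(f"top tracts: {pop_share:.1%} of population, {hom_share:.1%} of homicides")
```

The same pattern as the Guardian's finding: a tiny population share carrying a wildly disproportionate homicide share.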
maps  crime  race  guns  america 
january 2018 by charlesarthur
Facebook still lets landlords discriminate by race and disability in apartment ads • Gizmodo
Matt Novak:
<p>ProPublica purchased a number of different housing ads last week, but asked that they be <a href="https://www.propublica.org/article/facebook-advertising-discrimination-housing-race-sex-national-origin?token=azi~Txt72PLEjyR99sF_pSH~fRuIBRsj">unavailable to certain segments of the population</a>. Ads that would exclude Jews, black people, and Americans originally from Argentina were reportedly all approved within minutes. Facebook’s targeting also allows advertisers to exclude other groups, such as people interested in wheelchair access and parents with high school-aged kids. These ads were approved quickly thanks to Facebook’s algorithmic approval process.

According to ProPublica, just one type of ad took longer than mere minutes for approval and that was a test ad excluding people who were interested in Islam. That ad was ultimately approved in just 22 minutes.

Obviously, all of these ads are in direct violation of US Fair Housing laws. But Facebook appears to have done nothing to ensure that it’s in compliance, despite swearing that it would set up safeguards. ProPublica asked Facebook about the ads and the company blamed it on a “technical failure,” rather than a systematic and inexcusable disinterest in adhering to US law.</p>


OK, and now wait for Facebook's response. What do you think it will be?
<p>“This was a failure in our enforcement and we’re disappointed that we fell short of our commitments,” a Facebook spokesperson told Gizmodo. "…The rental housing ads purchased by ProPublica should have but did not trigger the extra review and certifications we put in place due to a technical failure,” the spokesperson said.</p>
facebook  race  advertising 
november 2017 by charlesarthur
The perpetual line-up: unregulated police face recognition in America • Center on Privacy and Technology
Clare Garvie, Alvaro Bedoya and Jonathan Frankle, in a substantial report:
<p>Human vision is biased: We are good at identifying members of our own race or ethnicity, and by comparison, bad at identifying almost everyone else.[214] Yet many agencies using face recognition believe that machine vision is immune to human bias. In the words of one Washington police department, face recognition simply “does not see race.”[215]

The reality is far more complicated. Studies of racial bias in face recognition algorithms are few and far between. The research that has been done, however, suggests that these systems do, in fact, show signs of bias. The most prominent study, co-authored by an FBI expert, found that several leading algorithms performed worse on African Americans, women, and young adults than on Caucasians, men, and older people, respectively.[216] In interviews, we were surprised to find that two major face recognition companies did not test their algorithms for racial bias.[217]

Racial bias intrinsic to an algorithm may be compounded by outside factors. African Americans are disproportionately likely to come into contact with—and be arrested by—law enforcement.[218] This means that police face recognition may be overused on the segment of the population on which it underperforms. It also means that African Americans will likely be overrepresented in mug shot-based face recognition databases. Finally, when algorithms search these databases, the task of selecting a final match is often left to humans, even though this may only add human bias back into the system.</p>


People keep forgetting the basic rule: these systems can only learn from what you teach them.
ai  data  ethics  crime  race  bias 
march 2017 by charlesarthur
Did a study really find there aren’t racial disparities in police shootings? Not so fast. • Vox
German Lopez:
<p>Harvard economist <a href="http://www.nber.org/papers/w22399">Roland Fryer’s new study</a>… analyzed data from several police departments across the country to measure racial differences in police use of force. Quoctrung Bui and Amanda Cox reported:

A new study confirms that black men and women are treated differently in the hands of law enforcement. They are more likely to be touched, handcuffed, pushed to the ground or pepper-sprayed by a police officer, even after accounting for how, where and when they encounter the police.

But when it comes to the most lethal form of force — police shootings — the study finds no racial bias.

But diving deeper into the study, those conclusions are based on some fairly shaky ground. Specifically, the data the study uses only looks at racial biases after a police officer engages with a suspect. That excludes a key driver of racial biases in policing: that police are more likely to stop black people in the first place, producing far more situations in which someone is likely to be shot. The study also looks at a fairly limited number of police departments, meaning its findings may not apply nationwide.</p>


It's good that there is available data; it's bad that the topic has to be addressed. In the UK there have been similar complaints about "stop and search" as being racially driven - and, sometimes, leading to deaths in custody.
race  police 
july 2016 by charlesarthur
Whites earn more than blacks — even on eBay • The Washington Post
Ana Swanson:
<p>In a <a href="http://onlinelibrary.wiley.com/doi/10.1111/1756-2171.12115/abstract">study published in October by the RAND Journal of Economics</a>, Ian Ayres and Christine Jolls of Yale Law School and Mahzarin Banaji of Harvard looked at how the race of the seller affected 394 auctions of baseball cards on eBay.

Some of the postings were accompanied by a photo of the card held by a light-skinned hand, and some with the card held by a dark-skinned hand, as in the photos above. The study shows that the cards held by an African-American hand sold for around 20 percent less than the cards held by Caucasian sellers.

In addition, the cards that were held by the African-American hand actually ended up being worth more, suggesting they should have sold for more than the other batch. That is, when the researchers added up how much they had originally paid for all of the cards sold by the black hand versus the white hand, the first total was larger.</p>


Clever experiment design. Depressing result. Clear lesson: hide your hand in eBay photos.
ebay  race  racism 
december 2015 by charlesarthur
