How Algorithms Can Become as Prejudiced as People


We're in an era when major decisions are being made by algorithms poring over large datasets. They regulate everything from the stocks we trade to where we put police officers in cities, and sometimes these algorithms suffer from the exact same prejudices that people do.


In a fascinating essay for Next City, Alexis Stephens writes about how algorithms and big data are not the objective measures of reality we hope they are. The way data is gathered and analyzed can wind up reproducing very human problems of racism and discrimination against the poor. Ultimately, we still desperately need human analysts to look at the ethics of how algorithms are being used, and to see whether they are, in fact, providing us with unbiased data.


On Next City, Stephens writes about how researchers have found several examples of apps that reinforce existing human prejudices:

[Princeton researcher Solon] Barocas' report cites Boston's Street Bump as an example. When smartphone users drive over Boston potholes, the widely acclaimed app reports the location to the city. While the app is inventive, differences in smartphone ownership across Boston's populations might cause it to unintentionally underserve the infrastructural needs of poorer communities.

"Historically disadvantaged communities tend to be simultaneously over-surveilled — if you are a part of the welfare system, you have a lot of information being collected by the state at all times — and severely underrepresented, because you might not be an attractive consumer," says Barocas ...

The questions that data miners ask and the way that the results are categorized are extremely important. Barocas brings up an anecdote about Evolv, a San Francisco startup that develops hiring models. In searching for predictors of employee retention, the company found that employees who live farther from call centers were more likely to quit. But because the results also could have an unintentional link to race, Evolv declined to use that information, out of caution about violating equal opportunity laws.

"You can use data mining to do something completely different," Barocas points out. "You could ask 'If I adjust workplace policies or workplace conditions, might I be able to recruit or retain different people?'" Rather than blindly using data that might unintentionally discriminate, employers can intentionally reverse prior hiring practices that have adversely affected job candidates based on their race, ethnicity, gender and income.

The more we understand algorithms, the more obvious it becomes that they are just as fallible as human beings.
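As one example of what that human analysis might look like in practice, here is a minimal sketch of the kind of screening the Evolv anecdote describes: checking whether a candidate feature, such as commute distance, correlates so strongly with a protected attribute that it could act as a proxy for it. The data and the 0.4 threshold are fabricated for illustration; a real audit would involve proper statistical tests and legal review.

```python
# A minimal sketch, in the spirit of the Evolv anecdote, of a check an
# analyst might run before feeding a feature like commute distance into a
# hiring model. All data is fabricated; the 0.4 threshold is an arbitrary
# assumption, not a legal standard.
from statistics import correlation  # Python 3.10+

# Hypothetical per-employee records: commute distance in miles, and
# membership in a protected group encoded as 0/1 (illustrative only).
commute_miles   = [2.0, 3.5, 5.0, 8.0, 12.0, 15.0, 18.0, 22.0]
protected_group = [0,   0,   0,   1,   0,    1,    1,    1]

def flag_proxy(feature, protected, threshold=0.4):
    """Flag a candidate feature whose correlation with a protected
    attribute suggests it may act as a proxy for that attribute."""
    r = correlation(feature, [float(x) for x in protected])
    return r, abs(r) >= threshold

r, is_proxy = flag_proxy(commute_miles, protected_group)
print(f"correlation with protected attribute: {r:.2f}")
if is_proxy:
    print("Flag: feature may proxy for a protected class; review before use.")
```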

Read the full story at Next City




DISCUSSION

CoSineBlue

And with this article, Annalee Newitz and io9 avert the artificial intelligence singularity ;)

This issue is already urgent. Education reformers, school boards, and superintendents whose jobs are on the line all utilize and cite data in ALL their policy decisions. As a former teacher who has seen how assumptions made with data translate into policy that impacts the lives of inner-city students, this article turned on a light switch for me. Many times programs are implemented in response to low test score data and ethnic and socioeconomic cohorts. The response by most out-of-touch administrators to bad data is always to react intensely, with great pressure on those who have to do the actual job: the data says there's a problem there, so we need to go and fix it. The schools with the most alarming data (test scores) get meddled with so badly that the opposite of what was intended happens, and test scores get worse. I'm sure people in other industries would agree. Data is prompting quicker action than is wise.

Data also does not communicate context or perspective. Or, as I like to call it, the humanity in the situation. To education reformers, the past ten to fifteen years of national educational data looks alarming, but to educators who have been in the profession for a long time, there has always been a natural progression and improvement as time goes on.