What Makes Artificial Intelligence Racist And Sexist?

Artificial intelligence is infiltrating our daily lives, with applications that curate your phone pics, manage your email, and translate text from any language into another. Google, Facebook, Apple and Microsoft are all heavily researching how to integrate AI into their major services. Soon you’ll likely interact with an AI (or its output) every time you pick up your phone. Should you trust it? Not always.

AI can analyse data more quickly and accurately than humans, but it can also inherit our biases. To learn, it needs massive quantities of data, and the easiest way to gather that data is to feed the AI text from the internet. But the internet contains some extremely biased language. A Princeton study found that an internet-trained AI associated stereotypically white names with positive words such as “love”, and stereotypically black names with negative words such as “failure” and “cancer”.
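To see how that kind of association is actually measured, here is a minimal sketch of a simplified word-embedding association test, in the spirit of the one used in that study. The tiny three-dimensional “embeddings” and the names below are invented for illustration; a real test would use vectors learned from billions of words of web text:

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two word vectors."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Toy vectors standing in for embeddings a real model would learn
# from web text; the values here are made up for illustration only.
vectors = {
    "emily":   np.array([0.9, 0.1, 0.2]),
    "jamal":   np.array([0.1, 0.9, 0.2]),
    "love":    np.array([0.8, 0.2, 0.1]),
    "failure": np.array([0.2, 0.8, 0.1]),
}

def association(word, pleasant, unpleasant):
    """How much closer a word sits to pleasant terms than to
    unpleasant ones. Positive means 'reads as more pleasant'."""
    return (np.mean([cosine(vectors[word], vectors[p]) for p in pleasant])
            - np.mean([cosine(vectors[word], vectors[u]) for u in unpleasant]))

for name in ("emily", "jamal"):
    print(name, round(association(name, ["love"], ["failure"]), 3))
```

With these toy numbers, one name scores strongly positive and the other strongly negative, even though nothing about either name justifies it. That is the whole problem in miniature: the bias lives in the geometry of the vectors.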

Luminoso Chief Science Officer Rob Speer oversees the open-source data set ConceptNet Numberbatch, which is used as a knowledge base for AI systems. He tested one of Numberbatch’s data sources and found obvious problems with its word associations. When fed the analogy question “Man is to woman as shopkeeper is to…”, the system filled in “housewife”. It similarly associated women with sewing and cosmetics.
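If you want to reproduce that kind of query yourself, this is roughly how analogy questions are posed against off-the-shelf word embeddings, here using pretrained GloVe vectors via the gensim library. This isn’t Speer’s exact setup, just the standard vector-arithmetic trick, and the words returned will depend on the model you load:

```python
# Standard embedding-analogy query: add the "woman" direction,
# subtract the "man" direction, and look for the nearest word.
import gensim.downloader as api

model = api.load("glove-wiki-gigaword-50")  # downloads on first use

# "Man is to woman as shopkeeper is to ...?"
print(model.most_similar(positive=["woman", "shopkeeper"],
                         negative=["man"], topn=3))
```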

While these associations might be appropriate for certain applications, they would cause problems in common AI tasks such as evaluating job applicants. An AI doesn’t know which associations are problematic, so it would have no problem ranking a woman’s résumé lower than an identical résumé from a man. Similarly, when Speer tried building a restaurant review algorithm, it rated Mexican food lower because it had learned to associate “Mexican” with negative words such as “illegal”.
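Here’s a toy sketch of that failure mode. If a review’s score is assembled from per-word sentiment learned from biased text, a neutral word such as “Mexican” drags the whole review down. The weights below are invented for illustration, not taken from any real model:

```python
# Made-up per-word sentiment weights, standing in for what a
# classifier might learn from biased web text. Note how "mexican"
# has absorbed negativity from words it co-occurs with online.
learned_sentiment = {
    "great": 1.8, "food": 0.1,
    "mexican": -0.6,   # bias leaked in from the training text
    "italian": 0.2,
}

def review_score(text):
    """Score a review as the average learned sentiment of its words."""
    words = [w for w in text.lower().split() if w in learned_sentiment]
    return sum(learned_sentiment[w] for w in words) / len(words)

print(review_score("great mexican food"))   # scores lower...
print(review_score("great italian food"))   # ...than this one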

So Speer went in and de-biased ConceptNet. He identified inappropriate associations and adjusted them to zero, while maintaining appropriate associations such as “man/uncle” and “woman/aunt”. He did the same with words related to race, ethnicity and religion. To fight human bias, it took a human.
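One widely used de-biasing technique in this same spirit projects the “gender direction” out of words that shouldn’t be gendered (Bolukbasi et al., 2016). Speer’s actual procedure for Numberbatch differs in its details; this simplified sketch, with invented toy vectors, just shows the core idea of zeroing an inappropriate association while leaving appropriate ones alone:

```python
import numpy as np

# Toy vectors, invented for illustration. The first dimension here
# happens to carry the man/woman contrast.
vectors = {
    "man":        np.array([ 1.0, 0.0, 0.3]),
    "woman":      np.array([-1.0, 0.0, 0.3]),
    "uncle":      np.array([ 0.9, 0.1, 0.2]),   # appropriately gendered
    "aunt":       np.array([-0.9, 0.1, 0.2]),
    "shopkeeper": np.array([ 0.4, 0.8, 0.1]),   # should be neutral
}

# The bias axis: the direction separating "man" from "woman".
g = vectors["man"] - vectors["woman"]
g = g / np.linalg.norm(g)

def debias(v):
    """Zero out the component of v that lies along the gender axis."""
    return v - np.dot(v, g) * g

# Neutralise the occupation word; leave "uncle"/"aunt" untouched,
# since those associations are legitimate.
vectors["shopkeeper"] = debias(vectors["shopkeeper"])
print(vectors["shopkeeper"])  # gender component is now ~0
```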

Numberbatch is the only semantic database with built-in de-biasing, Speer says in an email. He’s glad to have the competitive advantage, but he hopes other knowledge bases will follow suit:

This is the threat of AI in the near term. It’s not some sci-fi scenario where robots take over the world. It’s AI-powered services making decisions we don’t understand, where the decisions turn out to hurt certain groups of people.

The scariest thing about this bias is how invisibly it can take over. According to Speer, “some people [will] go through life not knowing why they get fewer opportunities, fewer job offers, more interactions with the police or the TSA…” Of course, he points out, racism and sexism are baked into society, and technological advances, even those explicitly designed to counteract those biases, often end up amplifying them. There’s no such thing as an objective tool built on subjective data. So AI developers bear a huge responsibility to find the flaws in their AI and address them.

“There should be more understanding of what’s real and what’s hype,” Speer says. “It’s easy to overhype AI because most people don’t have the right metaphors to understand it yet, and that stops people from being appropriately sceptical.

“There’s no AI that works like the human brain,” he says. “To counter the hype, I hope we can stop talking about brains and start talking about what’s actually going on: It’s mostly statistics, databases and pattern recognition. Which shouldn’t make it any less interesting.”

