Racial, sexist bias may sneak into artificial intelligence systems: study

Researchers have demonstrated how machines can be reflections of us, in potentially problematic ways

Press Trust of India Washington
Last Updated : Apr 14 2017 | 5:42 PM IST
Artificial Intelligence systems can acquire our cultural, racial or gender biases when trained with ordinary human language available online, scientists including one of Indian origin have found.

In debates over the future of artificial intelligence, many experts think of the new systems as coldly logical and objectively rational.

However, researchers have demonstrated how machines can be reflections of us, their creators, in potentially problematic ways.

Common machine learning programs, when trained with ordinary human language available online, can acquire cultural biases embedded in the patterns of wording, researchers found.

These biases range from the morally neutral, like a preference for flowers over insects, to objectionable views of race and gender.

Identifying and addressing possible bias in machine learning will be critically important as we increasingly turn to computers for processing the natural language humans use to communicate, for instance in doing online text searches, image categorisation and automated translations.

"Questions about fairness and bias in machine learning are tremendously important for our society," said Arvind Narayanan, assistant professor at Princeton University in the US.

"We have a situation where these artificial intelligence systems may be perpetuating historical patterns of bias that we might find socially unacceptable and which we might be trying to move away from," said Narayanan.

The researchers used GloVe, an algorithm that represents the co-occurrence statistics of words in, say, a 10-word window of text. Words that often appear near one another have a stronger association than words that seldom do.
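The windowed co-occurrence statistic the article describes can be sketched as follows. This is a minimal toy illustration of counting word pairs within a 10-word window, not the GloVe algorithm itself, which goes on to train vector representations from such counts; the corpus here is a made-up example sentence.

```python
from collections import Counter

def cooccurrence_counts(tokens, window=10):
    """Count how often each unordered pair of words appears within `window` tokens."""
    counts = Counter()
    for i, word in enumerate(tokens):
        # Look only ahead, so each pair is counted once per occurrence.
        for neighbour in tokens[i + 1 : i + 1 + window]:
            counts[tuple(sorted((word, neighbour)))] += 1
    return counts

corpus = "the nurse helped the patient while the engineer fixed the machine".split()
counts = cooccurrence_counts(corpus, window=10)
```

Words that co-occur often end up with high counts, which is the raw signal embedding models like GloVe compress into word vectors.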

Researchers applied GloVe to a huge trawl of content from the World Wide Web containing 840 billion words.

Within this large sample of written human culture, researchers then examined sets of target words, like "programmer, engineer, scientist" and "nurse, teacher, librarian" alongside two sets of attribute words, such as "man, male" and "woman, female," looking for evidence of the kinds of biases humans can unwittingly possess.
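The comparison of target words against attribute sets can be illustrated with a short sketch. This is a simplified, hypothetical version of the kind of association test the study describes: the 2-d vectors below are invented toy numbers standing in for learned word embeddings, and the bias score is just the mean cosine similarity to one attribute set minus the other.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def association(word_vec, attr_a, attr_b):
    """Mean similarity to attribute set A minus mean similarity to attribute set B."""
    return (np.mean([cosine(word_vec, a) for a in attr_a])
            - np.mean([cosine(word_vec, b) for b in attr_b]))

# Toy vectors standing in for real embeddings (illustrative only).
vecs = {
    "engineer": np.array([0.9, 0.1]),
    "nurse":    np.array([0.1, 0.9]),
    "man":      np.array([1.0, 0.0]),
    "woman":    np.array([0.0, 1.0]),
}

male, female = [vecs["man"]], [vecs["woman"]]
print(association(vecs["engineer"], male, female))  # positive: leans toward "male"
print(association(vecs["nurse"], male, female))     # negative: leans toward "female"
```

A positive score means the target word sits closer to the first attribute set in the embedding space; run over real embeddings and many word sets, scores like this are what reveal the biases the study measured.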

In the results, innocent, inoffensive biases, such as the preference for flowers over insects, showed up, but so did examples along lines of gender and race.

The Princeton machine learning experiment replicated the broad findings of earlier research on human bias.

For instance, the machine learning program associated female names more with familial attribute words, like "parents" and "wedding," than male names.

In turn, male names had stronger associations with career attributes, like "professional" and "salary."

Such learned bias about occupations can end up having pernicious, sexist effects.

The study was published in the journal Science.

First Published: Apr 14 2017 | 5:31 PM IST