Gender bias seen in AI-generated content on leadership, new research finds

Analysing AI-generated content about what made a 'good' or 'bad' leader, researchers found that men were consistently depicted as strong, courageous, and competent, while women were often portrayed as emotional and ineffective

Press Trust of India, New Delhi
Last Updated: Sep 30, 2023 | 2:55 PM IST

New research has revealed an inherent gender bias in the content - text, images, and other media - generated by artificial intelligence (AI).

Analysing AI-generated content about what made a 'good' and 'bad' leader, men were consistently depicted as strong, courageous, and competent, while women were often portrayed as emotional and ineffective, researchers at the University of Tasmania, Australia, and Massey University, New Zealand, found.

Thus, AI-generated content can preserve and perpetuate harmful gender biases, they said in their study published in the journal Organizational Dynamics.

"Any mention of women leaders was completely omitted in the initial data generated about leadership, with the AI tool providing zero examples of women leaders until it was specifically asked to generate content about women in leadership.

"Concerningly, when it did provide examples of women leaders, they were proportionally far more likely than male leaders to be offered as examples of bad leaders, falsely suggesting that women are more likely than men to be bad leaders," said Toby Newstead, the study's corresponding author.

Generative AI learns the patterns in the data on which it is trained and then creates new content with similar characteristics, relying on machine-learning techniques to do so.

These generative AI technologies are trained on vast amounts of data from the internet, with human intervention used to reduce harmful or biased outputs.
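The kind of audit the researchers describe can be approximated with a simple counting script. The sketch below is illustrative only: it assumes a hypothetical `generate()` function standing in for whatever text-generation API is being audited (the study's own tooling is not described in this report), and it tallies gendered references in responses to 'good leader' and 'bad leader' prompts.

```python
import re
from collections import Counter

def generate(prompt: str) -> str:
    # Placeholder: swap in a call to the text-generation API you are auditing.
    return "He was a decisive, competent leader; she was seen as emotional."

# Simple lexical markers of gendered references in generated text.
GENDERED_TERMS = {
    "masculine": re.compile(r"\b(he|him|his|man|men|male)\b", re.IGNORECASE),
    "feminine": re.compile(r"\b(she|her|hers|woman|women|female)\b", re.IGNORECASE),
}

def audit_prompt(prompt: str, runs: int = 20) -> Counter:
    """Generate text `runs` times and tally gendered references in the output."""
    counts: Counter = Counter()
    for _ in range(runs):
        text = generate(prompt)
        for label, pattern in GENDERED_TERMS.items():
            counts[label] += len(pattern.findall(text))
    return counts

# Compare gendered references across 'good' and 'bad' leadership prompts.
for prompt in ("Describe a good leader.", "Describe a bad leader."):
    print(prompt, dict(audit_prompt(prompt)))
```

A real audit would, of course, need many more prompts and a more robust way of attributing descriptions to individuals than keyword counting, but the comparison across prompt types mirrors the study's basic approach.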

Therefore, AI-generated content needs to be monitored to ensure it does not propagate harmful biases, said study author Bronwyn Eager, adding that the findings highlighted the need for further oversight and investigation into AI tools as they become part of daily life.

"Biases in AI models have far-reaching implications beyond just shaping the future of leadership. With the rapid adoption of AI across all sectors, we must ensure that potentially harmful biases relating to gender, race, ethnicity, age, disability, and sexuality aren't preserved," she said.

"We hope that our research will contribute to a broader conversation about the responsible use of AI in the workplace," said Eager.

(Only the headline and picture of this report may have been reworked by the Business Standard staff; the rest of the content is auto-generated from a syndicated feed.)
