A new study has revealed how deeply ingrained stereotypes can be perpetuated by AI, impacting recruitment, pay equity, and career aspirations.
Prompts for CEOs, doctors, engineers, carpenters, electricians, manufacturing workers, and salespeople produced exclusively male results. By contrast, the job titles housekeeper, HR manager, marketing assistant, receptionist, and nurse generated only images of women.
The only job for which the AI programmes repeatedly generated both men and women was teacher. However, when asked to produce content about a headteacher, the outputs reverted to men only.
The study, conducted by employee rewards, recognition and benefits platform Rippl, found that AI is not conforming to harmful gender stereotypes alone: race and age also clearly shaped both the images and the text it generated.
Generated CEOs were exclusively white, middle-aged men, while the job title of marketing assistant produced young white women. In deskless industries, manufacturing workers were depicted solely as young men of colour, and housekeepers were all young women, some of mixed race.
Chris Brown, CEO at Rippl, says: “AI can be a great tool when it is set up correctly and teams are trained in its proper use. However, we also know AI is susceptible to bias, and this study demonstrates just how widespread this problem is when it comes to workplace stereotypes.
“The impact these enduring biases can have in the workplace is huge. If AI is used to screen CVs or identify potential candidates, its bias can lead to qualified individuals, particularly women and minorities, being overlooked. This can perpetuate existing pay inequities, where women and minorities are often paid less for the same work. Studies have also shown that stereotyping still shapes children’s career aspirations.
“While AI undeniably has its uses in the workplace, supporting and streamlining working processes, this experiment is a stark reminder of its limitations. Its inherent bias and lack of regulation mean it simply cannot replace humans, and people practices must remain human-led to ensure all talent is accurately represented, seen and heard, and celebrated for their contribution.”