Ageism, sexism, classism and more: 7 examples of bias in AI-generated images
AI image generators can produce everything from the naturalistic (think a soccer player’s headshot) to the surreal (think a dog in space).

At the same time, however, these outputs can reproduce biases and deepen inequalities, as our latest research shows.
How do AI image generators work?
AI-based image generators use machine-learning models that take a text input and produce one or more images matching the description. Although Midjourney is opaque about the exact way its algorithms work, most AI image generators use a process called diffusion. Diffusion models start from random noise and progressively refine it, step by step, into an image that matches the text prompt.
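For readers who want to see the moving parts, here is a minimal sketch of text-to-image generation using the open-source Hugging Face diffusers library as a stand-in for Midjourney (whose system is not publicly accessible). The model name, prompt and settings are illustrative assumptions, not details from our study.

```python
# Minimal text-to-image sketch using the open-source diffusers library.
# A stand-in for the general diffusion approach; Midjourney's internals
# are not public. The model name and prompt are illustrative.
import torch
from diffusers import StableDiffusionPipeline

# Load a pretrained diffusion pipeline (weights download on first run).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# Diffusion begins with random noise and denoises it over many steps,
# steering each step toward an image that matches the prompt.
image = pipe("a soccer player's headshot", num_inference_steps=30).images[0]
image.save("headshot.png")
```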
How does bias happen?
Beyond responding to the specific prompt, however, the model will also have a default tendency to return certain kinds of outputs. This is usually the result of how the underlying algorithm is designed, or a lack of diversity in the training data.
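The second cause is easy to see in miniature: if some groups dominate the images and captions a model learns from, its default outputs reproduce that imbalance. The sketch below uses invented caption counts purely for illustration.

```python
import random

# Hypothetical caption counts from an imaginary training set.
# These numbers are invented for illustration only.
caption_counts = {
    "a young man at work": 800,
    "a young woman at work": 600,
    "an older man at work": 300,
    "an older woman at work": 50,  # under-represented in the data
}

# A model trained on this data will, by default, generate outputs in
# roughly these proportions: the skew carries straight through.
captions = list(caption_counts)
weights = list(caption_counts.values())
sample = random.choices(captions, weights=weights, k=1000)

for caption in captions:
    share = sample.count(caption) / len(sample)
    print(f"{caption}: {share:.0%}")
```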
Six months later, to see if anything had changed over time, we generated additional sets of images for the same prompts.
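In code terms, an audit like this boils down to a simple loop: run the same prompts through the generator at two points in time, then compare the labelled batches. The sketch below reuses the hypothetical `pipe` from the earlier example as a stand-in for Midjourney; the prompt list and file layout are our own illustrative choices, not the study's actual setup.

```python
from pathlib import Path

# Job-title prompts of the kind used in bias audits. This particular
# list is illustrative, not the study's actual prompt set.
PROMPTS = ["a journalist", "a reporter", "a news presenter", "a news analyst"]
IMAGES_PER_PROMPT = 10

def generate_set(pipe, batch_label: str) -> None:
    """Generate a labelled batch of images for every prompt, so batches
    made months apart can be compared side by side."""
    for prompt in PROMPTS:
        out_dir = Path("audit") / batch_label / prompt.replace(" ", "_")
        out_dir.mkdir(parents=True, exist_ok=True)
        for i in range(IMAGES_PER_PROMPT):
            image = pipe(prompt).images[0]
            image.save(out_dir / f"{i:02d}.png")

# Run once at the start of the audit, then again months later:
# generate_set(pipe, "2022-08")
# generate_set(pipe, "2023-02")
```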
1 and 2. Ageism and sexism
For non-specialised job titles, Midjourney returned images of only younger men and women. For specialised roles, both younger and older people were shown – but the older people were always men.
3. Racial bias
All the images returned for terms such as “journalist”, “reporter” or “correspondent” exclusively featured light-skinned people. This trend of assuming whiteness by default is evidence of racial hegemony built into the system.
4 and 5. Classism and conservatism
The people depicted were also uniformly conventional in appearance. For instance, none had tattoos, piercings, unconventional hairstyles, or any other attribute that could distinguish them from conservative mainstream depictions. Many also wore formal clothing such as buttoned shirts and neckties, which are markers of class expectation.
6. Urbanism
Without specifying any location or geographic context, the AI placed all the figures in urban environments with towering skyscrapers and other large city buildings. This is despite only slightly more than half the world’s population living in cities. This kind of bias has implications for how we see ourselves, and our degree of connection with other parts of society.
7. Anachronism
Contemporary technologies were conspicuously absent from the images. Instead, technologies from a distinctly different era – including typewriters, printing presses and oversized vintage cameras – filled the samples. Since many professionals look similar these days, the AI seemed to be drawing on more distinct technologies (including historical ones) to make its representations of the roles more explicit.
So if you use these tools, do so with care and a critical eye. Otherwise you might unintentionally reinforce the same harmful stereotypes society has spent decades trying to unlearn.