Design

Auditing AI-generated design imagery for bias and stereotypes

Use AI image generation with controlled prompt variations (e.g., swapping pronouns) to reveal biased “default” representations, then adjust how (or whether) generated visuals are used in design reviews and published design communications.

Why the human is still essential here

A designer must notice and interpret biased patterns, decide what is acceptable to publish, and take responsibility for inclusive representation; the model only generates outputs and can inadvertently encode stereotypes.

How people use this

Pronoun-swap prompt grid

Generate the same “design review” scene multiple times, swapping only the pronouns (she/he/they), and compare how authority, age, clothing, and roles shift to surface stereotyping before the image goes into a deck.

Midjourney
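A minimal sketch of the grid idea, assuming you script prompt generation before pasting into Midjourney: hold the scene text and seeds fixed, vary only the pronoun (with its matching verb so "they" stays grammatical). The base template, pronoun list, and seed values are illustrative, not part of any Midjourney API.

```python
from itertools import product

# Hypothetical sketch: build a controlled prompt grid that varies ONLY the
# pronoun while keeping the scene and seeds fixed, so any shift in the
# generated imagery can be attributed to that single change.
PRONOUNS = {"she": "is", "he": "is", "they": "are"}  # pronoun -> verb
SEEDS = [101, 202, 303]  # fixed seeds so variants render comparably

BASE = ("A designer presenting work at a design review; "
        "{p} {v} leading the critique at the whiteboard")

def prompt_grid(base=BASE, pronouns=PRONOUNS, seeds=SEEDS):
    """Return (prompt, seed) pairs covering every pronoun/seed combination."""
    return [(base.format(p=p, v=v), s)
            for (p, v), s in product(pronouns.items(), seeds)]

grid = prompt_grid()
# 3 pronouns x 3 seeds = 9 comparable generations to review side by side
```

Reviewing the nine outputs in a single contact sheet makes skew in age, dress, and implied authority much easier to spot than one-off generations.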

Inclusive persona stress test for marketing renders

Create controlled sets of hero images across gender, age, skin tone, and visible disability for the same product-and-user scenario, then select and iterate on prompts until the representation matches your inclusion goals.

Adobe Firefly (Photoshop)
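One way to keep the stress test systematic is to enumerate the persona matrix up front, so every attribute combination gets the same scenario. A sketch, assuming you draft prompts in a script before generating in Firefly; the dimension values and scenario text below are placeholders to adapt to your own inclusion goals.

```python
from itertools import product

# Hypothetical sketch: enumerate a controlled persona matrix so each
# attribute combination is rendered against the SAME product scenario.
DIMENSIONS = {
    "gender": ["a woman", "a man", "a nonbinary person"],
    "age": ["in their 20s", "in their 40s", "in their 70s"],
    "skin tone": ["with light skin", "with medium skin", "with deep skin"],
    "disability": ["", "using a wheelchair", "with a prosthetic arm"],
}
SCENARIO = "using the product at a kitchen table, soft daylight, hero shot"

def persona_prompts(dimensions=DIMENSIONS, scenario=SCENARIO):
    """One prompt per combination; empty attribute values are skipped."""
    combos = product(*dimensions.values())
    return [" ".join(filter(None, vals)) + ", " + scenario for vals in combos]

prompts = persona_prompts()
# 3 * 3 * 3 * 3 = 81 controlled hero-image prompts
```

The full cross product also tells you how many renders a complete pass requires, which helps decide whether to sample the matrix or cover it exhaustively.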

Batch tagging and skew summary report

Run a batch of generated people imagery through vision analysis to produce a quick summary of demographic/role patterns (e.g., who appears as “leader”) so a designer can flag biased trends for revision.

Stable Diffusion / ComfyUI / GPT-4o
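The aggregation step can be a few lines of stdlib Python once each image has been tagged. A sketch, assuming your vision pipeline (e.g., GPT-4o with a structured prompt) returns one tag dict per image; the field names and sample values here are illustrative stand-ins, not a real API response.

```python
from collections import Counter

# Hypothetical sketch: summarize who gets cast in which role across a batch
# of tagged images. The tag dicts below are placeholder model outputs.
tags = [
    {"presentation": "feminine",   "role": "presenter"},
    {"presentation": "feminine",   "role": "note-taker"},
    {"presentation": "masculine",  "role": "leader"},
    {"presentation": "masculine",  "role": "leader"},
    {"presentation": "androgynous", "role": "leader"},
]

def role_skew(tagged, role="leader"):
    """Fraction of each presentation group's appearances cast in `role`."""
    in_role = Counter(t["presentation"] for t in tagged if t["role"] == role)
    total = Counter(t["presentation"] for t in tagged)
    return {g: in_role[g] / total[g] for g in total}

skew = role_skew(tags)
# e.g., {"feminine": 0.0, "masculine": 1.0, ...} flags a trend worth revising
```

A skew table like this gives the designer a concrete artifact to bring to review; the judgment about what counts as acceptable representation stays with the human.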

Community stories (1)

LinkedIn

INDUSTRIAL DESIGNERS: AI is not neutral, and that’s a problem

———

I tried to generate an image about confidence in presenting. After getting some questionable early results, I tried one subtle shift. The only thing I changed was the pronouns.


The outputs were not subtle. The “she” version skewed younger and styled for attractiveness, dressed in a way that would feel out of place in a real design review. The “he” version skewed older, more covered, and read as the person leading the room.


That is a real problem. These models do not just generate people. They generate defaults. And a lot of those map straight onto stereotypes we keep pretending are gone.


If you use AI images in your work, this is not a side issue. You can ship bias without meaning to, then reinforce it every time you hit generate.


Design is what you put in front of other humans. If the machine keeps encoding old stereotypes into new work, and we keep publishing it, we are choosing to normalize it.

Blair Hasty, Industrial Design Director
Mar 1, 2026