INDUSTRIAL DESIGNERS: AI is not neutral, and that’s a problem
———
I tried to generate an image about confidence in presenting. After some questionable early results, I made one subtle change: the pronouns. Nothing else.
The outputs were not subtle. The “she” version skewed younger and styled for attractiveness, dressed in a way that would feel out of place in a real design review. The “he” version skewed older, more covered, and read as the person leading the room.
That is a real problem. These models do not just generate people. They generate defaults. And a lot of those map straight onto stereotypes we keep pretending are gone.
If you use AI images in your work, this is not a side issue. You can ship bias without meaning to, then reinforce it every time you hit generate.
Design is what you put in front of other humans. If the machine keeps encoding old stereotypes into new work, and we keep publishing it, we are choosing to normalize it.