Design

Auditing component and guideline inconsistencies

She used AI-supported extractors and validators to compare what existed in Figma and React, surface mismatches, and flag inconsistencies the team had stopped noticing.

Why the human is still essential here

Design system owners interpret the inconsistencies, decide the correct source of truth, and resolve architectural problems that validators alone cannot understand.

How people use this

Figma-to-code drift checks

AI compares Figma components with React or Storybook implementations to surface missing variants, renamed props, and other drift.

Figma / Storybook / Claude
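A drift check like this boils down to comparing two inventories of component variants. The sketch below is a minimal illustration under invented data: the dictionaries stand in for variant lists that a real pipeline would pull from the Figma REST API and from Storybook or TypeScript metadata, and `diff_variants` is a hypothetical helper, not part of any of the named tools.

```python
# Minimal drift-check sketch: compare component variants recorded in Figma
# with those implemented in code. The dict shapes and sample data are
# hypothetical; a real setup would populate them from the Figma REST API
# and from Storybook/TypeScript component metadata.

def diff_variants(figma: dict[str, set[str]], code: dict[str, set[str]]) -> list[str]:
    """Report variants that exist on one side but not the other."""
    findings = []
    for component in sorted(figma.keys() | code.keys()):
        in_figma = figma.get(component, set())
        in_code = code.get(component, set())
        for variant in sorted(in_figma - in_code):
            findings.append(f"{component}: '{variant}' in Figma, missing in code")
        for variant in sorted(in_code - in_figma):
            findings.append(f"{component}: '{variant}' in code, missing in Figma")
    return findings

figma_variants = {"Button": {"primary", "secondary", "ghost"}}
code_variants = {"Button": {"primary", "secondary", "outline"}}

for line in diff_variants(figma_variants, code_variants):
    print(line)
```

The set-difference approach also surfaces renamed props indirectly: a rename shows up as one missing entry on each side, which is exactly the kind of pair a reviewer can then resolve by hand.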

Documentation contradiction scan

AI reviews design-system documentation to flag conflicting guidance, outdated examples, and duplicated rules across sources.

ChatGPT / Claude

Token mismatch reports

AI checks whether code is using the same color, spacing, and typography tokens defined in the system’s source of truth.

Tokens Studio / GitHub Copilot
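The core of a token mismatch report can be sketched as a scan for hardcoded values that don't appear in the token source of truth. The token dictionary and CSS snippet below are made up for illustration; in practice the tokens would be parsed from a Tokens Studio JSON export, and the scan would cover many files and value types beyond six-digit hex colors.

```python
import re

# Minimal token-mismatch sketch: flag hardcoded hex colors in source text
# that don't match any value defined in the design tokens. Token names,
# values, and the CSS snippet are invented for illustration.

HEX = re.compile(r"#[0-9a-fA-F]{6}\b")

def find_untokenized_colors(source: str, tokens: dict[str, str]) -> list[str]:
    """Return hex colors used in `source` that no token defines."""
    known = {value.lower() for value in tokens.values()}
    return [match for match in HEX.findall(source) if match.lower() not in known]

tokens = {
    "color.brand.primary": "#1A73E8",
    "color.text.default": "#202124",
}
css = ".button { background: #1a73e8; color: #ffffff; }"

print(find_untokenized_colors(css, tokens))  # → ['#ffffff']
```

Comparing lowercased values keeps the check case-insensitive, which matters because design tools and codebases rarely agree on hex casing.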

Related Prompts (4)

Community stories (1)

Medium
6 min read

I wanted AI to do my job. It didn’t.

I felt the pull and couldn’t resist. I decided to try to actually connect our Design system to AI.

Over two weeks, I connected an AI coding assistant to our Design System and React component library via MCP. When I hit technical walls (and I hit plenty), I used a second AI tool to help me understand what went wrong, structure my requests, and check outputs. One tool to build, one to think with.


Trying to make AI work with our design system forced me to look at what was actually there. And what wasn’t. We also proved some real capabilities along the way.


AI could extract guidelines scattered across different places and reformat them into a consistent structure.


It could flag inconsistencies we’d stopped noticing.


It could build simple patterns using only existing components and tokens.

Anna Morozova, UI/UX designer at HENNGE
Apr 6, 2026