Design

Conditioning AI on design system context and tokens for consistent, on-brand output

Feed AI your design tokens, component logic, structured guidelines, component-library context, and shipped product patterns so it generates more consistent UI and reusable patterns that stay aligned with your existing design system and brand standards.

Why the human is still essential here

A human designer must create and maintain the design system foundation, decide what context and rules matter, validate the extracted patterns, and ensure brand nuance and consistency. AI can follow structure more effectively when the system is explicit, but it does not create reliable coherence on its own.

How people use this

Token-aware UI generation constraints in prompts

Teams paste or export their token names and spacing rules into prompts so AI proposes layouts that reference approved colors, typography scales, and spacing steps.

ChatGPT / Claude
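As a minimal sketch of this pattern: assemble the token context once and prepend it to every generation request so the model references approved values instead of inventing its own. All token names and values below are hypothetical placeholders, not from any real design system.

```typescript
// Hypothetical token excerpt, in the shape a team might paste into a prompt.
const tokens = {
  color: { "brand/primary": "#2455E6", "surface/default": "#FFFFFF" },
  spacing: { "space-100": "8px", "space-200": "16px", "space-300": "24px" },
  type: { "heading/md": "20px/28px SemiBold" },
};

// Build a prompt that constrains the model to the approved tokens.
function buildPrompt(task: string): string {
  return [
    "You are generating UI for our product.",
    "Use ONLY these design tokens; never hard-code raw values:",
    JSON.stringify(tokens, null, 2),
    `Task: ${task}`,
  ].join("\n\n");
}

console.log(buildPrompt("Propose a card layout for a settings page."));
```

The point is not the wrapper function but the constraint sentence plus the token data: together they give the model something concrete to reference instead of guessing.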

Single source of truth token pipeline to code

Design tokens are managed in a dedicated token tool and exported to engineering formats (CSS variables, Android/iOS tokens) so AI-assisted output and implementations stay aligned.

Tokens Studio for Figma / Style Dictionary
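To illustrate the token-to-code step that tools like Style Dictionary automate, here is a hand-rolled sketch: one token source emitted as CSS custom properties for the web. The token names are hypothetical, and a real pipeline would use the tool's own config and transforms rather than this code.

```typescript
type TokenGroup = Record<string, string>;

// Hypothetical single source of truth for tokens.
const tokens: Record<string, TokenGroup> = {
  color: { "brand-primary": "#2455E6", "text-default": "#1A1A1A" },
  spacing: { sm: "8px", md: "16px" },
};

// Emit CSS custom properties; other targets (Android/iOS) would
// get their own emitters from the same source.
function toCssVariables(groups: Record<string, TokenGroup>): string {
  const lines: string[] = [":root {"];
  for (const [group, values] of Object.entries(groups)) {
    for (const [name, value] of Object.entries(values)) {
      lines.push(`  --${group}-${name}: ${value};`);
    }
  }
  lines.push("}");
  return lines.join("\n");
}

console.log(toCssVariables(tokens));
```

Because every platform output derives from the same source, AI-assisted code that references `--color-brand-primary` stays aligned with what designers see in Figma.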

Design system documentation and guardrails for AI-assisted work

Design system guidelines (component usage, do/don’t, anatomy) are centralized so designers can quickly check AI-generated components against documented standards before shipping.

zeroheight / Specify

Related Prompts (4)

Community stories (5)

Medium
6 min read

I wanted AI to do my job. It didn’t.

I felt it and couldn’t resist. I decided to try and actually connect our Design system to AI.

Over two weeks, I connected an AI coding assistant to our Design System and React component library via MCP. When I hit technical walls (and I hit plenty), I used a second AI tool to help me understand what went wrong, structure my requests, and check outputs. One tool to build, one to think with.


Trying to make AI work with our design system forced me to look at what was actually there. And what wasn’t. We also proved some real capabilities along the way.


AI could extract guidelines scattered across different places and reformat them into a consistent structure.


It could flag inconsistencies we’d stopped noticing.


It could build simple patterns using only existing components and tokens.

Anna Morozova, UI/UX designer at HENNGE
Apr 6, 2026
LinkedIn

How I actually use AI in product design.


Most teams are using AI to move faster.

That’s not the hard part.

The hard part is not making the product worse in the process.


Here’s how I actually use AI in product design right now.


1. I use AI for structure, not decisions:


Layouts, variants, responsive states.

Anything repetitive.


That used to take hours. Now it takes minutes.

But I’m not asking it what the experience should be.

That’s still on me.


2. I treat the first output as a draft, not the answer:


AI gives you something that looks right.

That’s the danger.

Spacing might be fine. Flow might be off. Edge cases are missing.


If you don’t know what “good” looks like, you’ll ship it anyway.


3. I stay close to the system:


Design systems matter more now, not less.

Tokens, constraints, patterns.

If the system is solid, AI outputs improve.


If it’s loose, you just get cleaner-looking chaos.


4. I use it to explore, not finalise. It’s great for:


• Trying directions quickly

• Testing layout approaches

• Getting out of a blank state


But the final 20% still needs taste.


That hasn’t changed.


The shift isn’t that AI is designing for you.

It’s that it’s removing the parts of design that were never the point.


Less time pushing pixels.

More time deciding what actually matters.


Most people will use this to skip thinking.

The advantage is using it to think better, faster.


Curious how others are actually using this in real workflows — not just the flashy demos I'm seeing on socials 👍

Paul Osborn, Principal Product Designer
Apr 5, 2026
X

My AI Design Toolkit 2.0

Six weeks ago I wrote about the AI tools I use as a designer. That article hit 470K views. People bookmarked it, shared it, and sent me DMs asking for more details.

Since then, almost everything about how I work has changed.


Not the tools themselves. Most of those are still in my toolkit. What changed is how I use them. And more importantly, where I use them.


My first article was about side projects. Building apps, fixing styling, restyling Framer sites. Fun stuff. Personal stuff. The kind of work where "close enough" is fine and you can iterate until it feels right.


But I have a day job. I'm a product designer at a large ecommerce company. We have a design system. We have a pipeline. We have four platforms: iOS, Android, desktop web, mobile web. Every pixel has to match. Every component has to follow the system. Every change goes through review.


That's a very different environment for AI.


And for a while, I couldn't make it work there.


The problem with prototyping at work


At work, I can't just open Claude Code and say "build me something cool." There's an approved design. There's a design system with tokens, components, and naming conventions that took years to build. There are developers who need to implement whatever I hand them.


When I tried to bring my side project workflow into this environment, it fell apart.


I wrote about this a few weeks ago. I had an approved design in Figma. I set up an Xcode project, connected it to Cursor, pointed it at my Figma file, and told it to recreate the design in SwiftUI.


The MCP wouldn't connect. That cost me 15 minutes.


When it finally worked, the output looked nothing like my design. The structure was sort of there, but the spacing was wrong, the colors were wrong, and the whole thing felt like Cursor had seen a blurry photo of my design and made its best guess.


I spent 45 minutes trying to wrangle it before giving up entirely.


The issue wasn't the tools. The issue was that AI doesn't know your design system. It doesn't know your spacing tokens, your color primitives, your component hierarchy. It just guesses. And when you're working within tight production constraints, guessing isn't good enough.


So I had to solve that problem first.


Teaching AI your design system


This is the thing that changed everything for me.


I had Claude Code extract the essence of our design system. It used the Figma Console MCP to read through our component library, tokens, and styles.


Then I had it browse our live websites in both desktop and mobile views. Not just look at them. Actually document what it saw. The layout patterns. The spacing. How things respond on different screen sizes. How the navigation works. What the cards look like. Everything.


The output was a single markdown file. A design.md that captures our visual language: the spacing scale, the typography, the color system, how components are structured, how things behave across breakpoints.


It's not a full design system spec. It's more like a cheat sheet. Enough context for an AI tool to understand "this is what their product looks like" without having to parse the entire Figma file every time.
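As a rough illustration of what such a cheat sheet could contain (the contents below are entirely hypothetical, not the author's actual file), a design.md might read:

```markdown
# design.md — visual language cheat sheet

## Spacing scale
4 / 8 / 16 / 24 / 32 px. Cards use 16px internal padding, 24px between sections.

## Typography
Headings: 28 / 24 / 20px semibold. Body: 16px regular, 1.5 line height.

## Color
Primary actions use `brand/primary`; surfaces are `surface/default`
with 1px `border/subtle` dividers.

## Breakpoints
Single column below 768px; navigation collapses to a drawer on mobile.
```

The value is in compactness: a few hundred lines any tool can ingest, rather than a full Figma file it has to parse.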


Here's why this matters.


I work at a company with a strict design pipeline. I can't burn all my Claude Code tokens trying to get AI to work inside Figma directly. But if I can extract the design system once and carry that context with me, every other AI tool I use becomes smarter.


Now when I plug that file into Variant AI or Google Stitch, the output is close to our actual product right from the start. Not perfect. But close enough that I'm tweaking details instead of rebuilding from scratch.


Before this, every AI-generated design looked generic. Like it came from a template. Now it looks like it belongs to our product. That single markdown file was the unlock.


From design to production prototype


After my success building native SwiftUI animations for our mobile app a few weeks ago, I wanted to try the same approach for our web product.


The process was faster than I expected.


I gave Claude Code the link to our frontpage and the Figma design files for the feature I was working on. I told it to build a React prototype that matched our current design.


Since our design is relatively simple and clean, it didn't take long to have the foundation up and running. The design.md file helped here too. Claude already understood our visual system, so it wasn't guessing at spacing or typography.


Once the foundation was solid, I started layering on the stuff that actually matters for testing.


Stagger appear animations for content loading. A transition animation using barba.js between pages. That page transition does two things: it hides our loading time, and it adds a bit of polish that makes the whole experience feel more considered.
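A stagger like the one described reduces to computing per-item delays. A minimal sketch (timing values are illustrative, not from the author's prototype):

```typescript
// Compute appear-animation delays for a list of items: the first item
// starts after baseMs, each subsequent item stepMs later.
function staggerDelays(count: number, baseMs = 80, stepMs = 60): number[] {
  return Array.from({ length: count }, (_, i) => baseMs + i * stepMs);
}

// For five cards: delays of 80, 140, 200, 260, 320 ms.
console.log(staggerDelays(5));
```

In a real page these delays would feed `animation-delay` or a Web Animations API call per element; the page transition itself would be wired up through barba.js hooks.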


I have a background in motion design. Broadcast work, cinemagraphs, HBO. So when it comes to animation, I know what I want. The hard part was always getting it into production. Now I can build it myself and hand developers code they can actually use.


This prototype now serves as a foundation. Not just for the feature I built it for, but for testing new ideas on top of it.


Want to try skeleton loading states? Drop them in. Want to experiment with a different navigation pattern? Build it on top of what's already there. No more sketching it out in Figma, building a clickable prototype, presenting it, getting approval, then watching developers rebuild it from scratch.


The prototype is the starting point for the real thing.


A quick note on Paper


I also gave Paper a proper run during this period. It's fast. Really fast. The iteration speed is impressive and the output quality is genuinely good.


For side projects, I'd use it without hesitation. You can go from idea to high-fidelity design in minutes, and the iteration loop is tighter than anything else I've tried.


But at work, our entire pipeline runs through Figma. Developers pull specs from there. Design reviews happen there. Tokens sync from there. Adding another tool to that chain creates friction that nobody wants. So Paper stays in my side project toolkit for now. It's one to watch though.


What's actually different now


Looking back at my first article, I was excited about the "WTF moments." Claude Code fixing my app styling. Remotion making a client video in minutes. The Framer MCP restyling an entire site.


Those were real. But they were also isolated wins. Cool tricks. Party pieces.


What changed in the last six weeks is that AI went from "fun for side projects" to "useful for my actual job."


And the thing that made that possible wasn't a new tool or a better model. It was solving a boring problem: how do you give AI enough context about your product that it stops guessing?


That design.md file sounds incredibly boring. It's a markdown document. There's nothing flashy about it. Nobody is going to make a viral demo video about extracting design tokens into a text file.


But it's the single most impactful thing I've done with AI at work. Because it means every AI tool I use now starts from our reality instead of from a blank slate. Every prototype starts closer to production. Every design exploration looks like it belongs to our product instead of a random template.


The React prototype approach is the same idea. Instead of asking AI to imagine what our product should look like, I'm giving it our actual product as the starting point and building on top of it.


Context is the unlock. Not better prompts. Not fancier tools. Just giving AI the information it needs to do useful work within the constraints you actually operate in.


The honest take


I'm still not using AI for final production design at work. Our design system, our developer handoff, our review process: that all still runs through Figma. That's not changing anytime soon.


But the prototyping and exploration phase? That's completely different now.


I can test an idea in a working React prototype before it ever enters a sprint. I can show stakeholders something real instead of a Figma mockup with hotspot arrows.


And when the developers pick it up, they're starting from code that already works. Not a static spec they need to interpret. Not a prototype video they need to reverse-engineer. Actual working code.


That changes the conversation. It changes the speed. It changes how much you can try before committing to a direction.


Six weeks ago I was saying "WTF" at party tricks. Now I'm shipping prototypes that feed directly into production.


The toolkit didn't change that much. How I think about it did.

frederik, Product designer
Mar 24, 2026
Medium
9 min read

Building a Scalable Product System as a Solo Designer — With AI Agents

What two months of working inside Claude Code taught me about design systems, technical negotiation, and the future of IC work

As we entered 2026, my weekly feed became saturated with forecasts from top-tier practitioners about the future of work. These predictions trigger a unique blend of anxiety and excitement — anxiety from the fear of falling behind, excitement because the barriers to building products that once required an entire team are being systematically lowered.


Based on the insights I’ve gathered, I believe the ‘New IC’ (Individual Contributor) is on the rise. These are not passive freelancers forced into independence by downsizing; they are professionals who actively choose to oversee the entire lifecycle of planning, research, design, and delivery. Their scarcity stems from their T-shaped knowledge structure:


A deep, specialized core, a broad, cross-disciplinary vision, and crucially, the ability to translate their expertise into a language that AI Agents can execute.


This article documents my journey over the past two months — a transition from ‘using AI tools to speed up tasks’ to ‘co-architecting systems with AI Agents.’ It all boils down to one fundamental question:


Can a designer, empowered by AI Agents, independently build a maintainable and scalable product system?

Kermit Yen, Product Designer
Mar 5, 2026
LinkedIn

Everyone is talking about AI replacing designers.

Everyone is talking about AI replacing designers. I think we need to have an honest conversation.

I am not anti-AI. In fact, I think tools like Claude and MCP services have genuinely changed how designers work, automating repetitive tasks, speeding up component generation, and helping early-stage teams move faster. That's real value, and I won't dismiss it.


But let's stop pretending there are no trade-offs.



𝗧𝗵𝗲 𝗰𝗼𝘀𝘁 𝗿𝗲𝗮𝗹𝗶𝘁𝘆 nobody talks about

Every serious AI workflow runs on credits. Token costs add up fast, especially when you're iterating, generating variants, or running MCP pipelines at scale. Anyone telling you AI is free either hasn't used it extensively or doesn't understand how the credit system works.



𝗔𝗜 𝗻𝗲𝗲𝗱𝘀 𝘆𝗼𝘂𝗿 𝗱𝗲𝘀𝗶𝗴𝗻 𝘀𝘆𝘀𝘁𝗲𝗺 𝘁𝗼 𝗯𝗲 𝘂𝘀𝗲𝗳𝘂𝗹

AI doesn't generate consistency on its own. To get reliable, on-brand output you need to feed it your design tokens, component logic, and spacing rules first. Ironically, you need a designer to build the foundation that makes AI useful for design. For early-stage products, this workflow can be powerful. For large, distributed teams with an established brand identity, the gaps become harder to ignore.



𝗧𝗵𝗲 𝗰𝗿𝗲𝗮𝘁𝗶𝘃𝗶𝘁𝘆 𝗰𝗲𝗶𝗹𝗶𝗻𝗴 𝗶𝘀 𝗿𝗲𝗮𝗹

AI learns from what exists on the web. It recombines, it optimises, it executes, but it doesn't invent. If you want your product to look like every other modern SaaS tool in the market, AI will serve you well. But if differentiated visual identity and brand depth matter to you, AI output will feel generic without significant human direction.


Image and video generation? Impressive. Building a product that feels genuinely personalised and distinct? We are still far from that.



𝗧𝗵𝗲 𝗯𝗼𝘁𝘁𝗼𝗺 𝗹𝗶𝗻𝗲

AI is a powerful tool in the right hands. The designer who understands how to direct it will always outperform the one who blindly follows the hype — or the one who refuses to engage with it at all.


The future isn't AI replacing designers. It's designers who understand AI replacing those who don't.


What's your experience been working with AI in your design workflow?

Paresh R Khatri, Product Designer
Mar 2, 2026