How I Work with AI in UX
Over the last couple of years, AI has become a steady part of how I design and collaborate: not as a replacement for craft, but as a way to increase clarity, speed up iteration, and strengthen decision-making.
I use AI throughout the UX lifecycle. Early on, it helps me shape research questions, identify gaps in flows, and pressure-test assumptions before concepts reach cross-functional review. During synthesis, it supports clustering feedback, structuring insights, and reframing findings for different stakeholders. It allows me to spend more time interpreting and prioritizing, and less time formatting.
In Figma, I use AI to accelerate exploration. That includes generating structured content for layouts, refining microcopy, thinking through complex states, and defining component logic before building full systems of variants. It is especially helpful when designing conditional behaviors or edge cases, where clarity in logic matters as much as visual polish.
In collaboration with engineers, AI helps me articulate interaction logic more clearly and explore implementation scenarios before handoff. I have also used code-generation tools to prototype lightweight functional concepts, which creates more grounded, productive conversations with development partners. Instead of debating abstractions, we can react to something tangible.
I approach AI as a collaborator I actively guide. I frame the problem carefully, review every output, and shape it to align with product goals and user needs. The value is not in automation alone, but in leverage: I can explore more options, surface risks earlier, and move from insight to execution with greater efficiency.
Ultimately, judgment, taste, and accountability remain with me. AI simply expands the range and depth of what I can examine within a given timeframe, which strengthens both the UX process and the outcomes it supports.