
How AI turned the codebase into my new design tool

Written by
Elise Livingston
February 19, 2026

Like most people working in tech, I’ve spent the past year or two experimenting with various AI tools and trying to make them fit into my work. As a product designer, I’ve used AI to sketch layouts, explore color palettes, generate content and assets, and build prototypes. I’ve found AI to be a helpful way to overcome the ‘blank page problem’ when starting a new project, but other than that, it hasn’t really transformed my process. And it definitely hasn’t made me a ‘faster’ designer, as promised.

At Pomelo, we're encouraged to explore how AI can fit into our workflows. So I stopped trying to use AI as a creative tool and started discovering where it actually fits: the ways it can help me do things I couldn't do before and make me a stronger, more informed designer. In this post, I'm sharing how I've been using AI as a product designer to better understand our codebase.

Educating myself about the codebase

A huge part of design work is understanding the system you’re designing within. At Pomelo, our products are technically complex. It’s very common for designers to have only a surface-level understanding of the products they work on. If you want to know something about how the code works, you either have to interrupt an engineer or keep guessing. “Is this question worth taking someone out of their flow?” is a pretty high bar to clear if you’re just curious about something.

The biggest aha moment in my AI journey came when I finally set up Claude Code and connected it to my team's codebase. Suddenly I had a search engine and a translator all at once.

I started asking Claude Code my questions. "How many variations of our app onboarding exist?" "Do we already have a banner for alerts? How does it work?" "What are our current type styles?" Not only was Claude incredible at answering these questions in a clear and understandable way, but I found myself asking more questions. I'm learning about our products in more depth, and more quickly, than ever before, and that helps me be a better design partner for my team.

Here’s how I connected Claude Code to my team’s codebase:

  • I cloned our mobile code repository from GitHub to my local machine
  • I installed Claude Code, which lets me chat with AI inside the terminal
  • I set up our mobile development environment, so that I can run our mobile app locally using a simulator
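In the terminal, those steps might look something like this. This is a sketch, not Pomelo's actual setup: the repository URL and the dependency-install commands are placeholders that will differ for your team and stack; only the Claude Code install command is the standard one from Anthropic's docs.

```shell
# 1. Clone the mobile repository locally (URL is a placeholder)
git clone git@github.com:your-org/mobile-app.git
cd mobile-app

# 2. Install Claude Code, Anthropic's terminal-based coding assistant
npm install -g @anthropic-ai/claude-code

# 3. Set up the mobile dev environment so the app runs in a simulator
#    (entirely stack-dependent; an iOS/CocoaPods flow shown as an example)
bundle install && bundle exec pod install

# 4. Start a Claude Code session inside the repo and ask away
claude
```

From there, questions like "How many variations of our app onboarding exist?" go straight into the Claude Code prompt, and it answers with the repository as context.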

In action, this setup looks something like this: I was working on updating the type styles in our mobile app and trying to understand how our current styles were used. I asked Claude to list every place each type style appears, and then to add a text label next to every string in the app showing the style name. This gave me a clear map of what was being used where. It revealed where we had systemic inconsistency and where we just needed to swap out a style.

That kind of visibility used to be very challenging for me to get as a designer. I’d have to make a lot of assumptions, connect a lot of dots, and then bring it to an engineering partner to have them correct me. Now, I can learn on my own and show up with a clearer understanding of the problems to be solved and how to solve them. And I’ve been able to share my learnings with the rest of the product design team so they can benefit, too. 

The end of guesswork: visualizing systemic changes

Figma is my favorite design tool. I spend most of my waking hours staring at it. However, Figma is not the source of truth for a product. The code is.

When something looks different in production, it’s usually not because someone “didn’t follow the design.” It’s because there are dozens of variables, defaults, and edge cases that only exist in code. Those relationships are invisible in a static design file or even an interactive prototype.

AI has made it possible for me to explore those relationships directly. If I want to understand how a style or layout variable affects the app, I can ask Claude to trace where it’s used across the codebase. I can see every place that variable appears and get a sense of how a single change might ripple across multiple screens.

I can also test changes in the simulator to see how they behave in real contexts. That feedback loop has completely changed the way I evaluate my own work. I don't have to wait until something is built to discover that a small tweak causes unexpected issues elsewhere. If that isn't a superpower for a designer, I'm not sure what is.

Bridging the gap between design and engineering

Immersing yourself in your product's codebase, and raising the quality and feasibility of your designs in the process, transforms your relationships with engineering partners.

When I propose a design change now, I can explain it using the same language that our codebase uses, not just pixels and intention. I can identify where something has drifted from the system and suggest a fix that works within existing patterns. The conversations I have with engineers are faster and more focused because we’re talking about the same system, not comparing screenshots.

This kind of shared understanding builds trust. Engineers see that I understand their constraints and that I care about the integrity of the system. And I see the complexity they manage every day, which makes my designs more pragmatic and grounded.

The more I do this, the more convinced I am that designers should own more of this middle layer—the place where visual intent meets system logic. The devil is always in the details, and someone has to make sure those details are consistent, accessible, and scalable. Design is in an excellent position to take that ownership, and AI tools can help make it possible.

Shifting from output to understanding 

As a designer, my process depends on exploring a problem space visually. Writing prompts to describe problems and solutions always feels like skipping over the part where real thinking happens.

AI can generate convincing outputs, and it can be an incredible tool for those who don’t think visually to communicate their ideas. But it often misses the nuance of what makes good design. And if you have something really specific in mind, there’s always some issue with the output that takes longer to fix than if you had just designed it from scratch.

Most of the hype around AI in design has focused on that output, on creativity, and on speed: generating mockups, brainstorming ideas, and automating tasks. It hasn’t lived up to that promise for me. Instead, its value has been much more foundational.

I’ve come to see AI as a tool for literacy. It helps me read the product I’m designing. It helps me understand how components are structured, how styles are applied, and how the system behaves at scale. That literacy makes me a better designer, because my decisions are grounded in how things actually work. The result isn’t necessarily faster design, but better-informed design.
