AI-powered R&D—vibecoding, taste, and the evolution of full-stack design

Perceptron's Founding Designer offers five lessons on leveraging AI in the R&D lifecycle—from prototyping to deployment.

We've noticed a fascinating paradox across our portfolio companies: while AI is supercharging workflows and being adopted at record rates, there are vast gaps in knowledge around harnessing these tools effectively. As a result, two teams can see wildly different productivity gains despite adopting the same tools. AI-native development is more than adoption; it means rethinking entire workflows and instrumenting the codebase so AI has the context it needs.

That's why we’re breaking it all down in an AI-powered R&D guide, starting with design: to cut through the noise with tactical insights from companies deploying these tools successfully. We sat down with Cedric Ith, Founding Designer at Perceptron AI, to walk through and showcase his design process—from prototyping and vibecoding to the hand-off with full-stack engineers.

For the complete discussion, check out the full video. We also pulled five key takeaways from how Cedric's design process has transformed over the past year, along with examples to inspire your teams.

1. Taste is a moat. Design thinking is the new superpower.

In an era where AI can generate functional code in seconds, the differentiator isn't who can build—it's who knows what to build. Cedric pointed out that we're entering a world where "software is abundant and barriers to creation are low." This abundance paradoxically makes design thinking more critical than ever.

The traditional technical moats are eroding. When anyone can spin up a functional prototype with natural language prompts, the competitive advantage shifts to those who can identify the right problems and craft elegant solutions. Product sense, UI/UX excellence, and execution velocity become the primary drivers of success.

AI-driven tools let designers explore a wider range of design concepts than they might otherwise consider, in far less time than manual methods. Generative design is becoming increasingly common in modern workflows, automating the creation and evaluation of multiple design options within user-defined parameters.

But this isn't just about making things pretty—it's about deeply understanding user needs and translating them into experiences that feel inevitable. The companies that win will be those that combine AI's implementation power with exceptional taste and design judgment.

2. Natural language is a new design interface.

The most profound shift we observed in Cedric's workflow was his transition from traditional design tools to natural language as a primary design medium. "The critical skill is no longer writing code but how well you can articulate your thoughts and ideas and changes to the AI," he explains.

This represents a fundamental shift in designer skillsets. A key emerging skill is what some call "design vocabulary," or the ability to speak fluently in the language of modern frameworks, CSS properties, and technical concepts without writing the code. During the demo, we watched Cedric use precise terminology—"4 pixel corner radius," "opacity of 0.2," "hover interactions"—to communicate with v0. Complex interactions like drawing bounding boxes with coordinate tracking—previously requiring days of engineering work—could be prototyped in minutes through conversational prompts. As Cedric emphasizes, designers need to "speak the language rather than write the code yourself" by maintaining clarity, consistency, and shared terminology when prompting.
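That vocabulary maps directly onto the CSS a tool like v0 emits. As a minimal illustration, the style objects below are our own sketch of that mapping, not v0's actual output:

```typescript
// Illustrative only: how precise design vocabulary translates to concrete CSS values.
const annotationBoxStyle = {
  borderRadius: "4px", // "4 pixel corner radius"
  opacity: 0.2,        // "opacity of 0.2"
};

// "Hover interactions": a state the prompt names by its CSS concept;
// the hover value here is a hypothetical choice for emphasis.
const annotationBoxHoverStyle = {
  ...annotationBoxStyle,
  opacity: 0.6,
};
```

The point isn't the values themselves; it's that naming the CSS concept ("corner radius," "opacity," "hover") removes ambiguity about what the tool should generate.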

Clarity

Break complex requests into simple, actionable steps, and use specific wording to describe exactly what you're trying to do.

OK prompt 

“Add different labels to the boxes in white text.”

Great prompt 

“For each bounding box drawn on the image, add a unique label in the top-left corner of the box that displays its index number (e.g., Box 1, Box 2, etc.) in bold, white text.”
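A prompt that specific leaves the model little to guess. As a rough sketch, assuming boxes are tracked as pixel rectangles, the labeling logic the great prompt describes might reduce to something like this (the `Box` and `Label` types are ours, not Perceptron's actual code):

```typescript
// Hypothetical sketch of what the "great prompt" above asks for:
// each box gets an indexed label anchored at its top-left corner.
interface Box {
  x: number;
  y: number;
  width: number;
  height: number;
}

interface Label {
  text: string;
  x: number;
  y: number;
  style: { fontWeight: string; color: string };
}

function labelBoxes(boxes: Box[]): Label[] {
  return boxes.map((box, i) => ({
    text: `Box ${i + 1}`, // unique index number, e.g. "Box 1", "Box 2"
    x: box.x,             // anchored to the box's top-left corner
    y: box.y,
    style: { fontWeight: "bold", color: "white" },
  }));
}
```

Every requirement in the prompt (index numbers, top-left anchor, bold white text) appears as a concrete line of code, which is exactly why the vaguer version invites the model to improvise.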

Consistency

Use consistent language when referring to UI elements or features. If, for example, you call a feature something specific like "segment mode" in one prompt, keep using that exact term in every subsequent instruction.

As Cedric puts it, “Keep track of the names that you give for different elements that you’re referring to.”

Shared terminology

Reinforce shared terminology for more complex prototypes and workflows. Cedric intentionally introduces terms to v0 early in the process. This mirrors how teams use shared vocabularies in traditional workflows to avoid misunderstandings. Teaching AI tools your preferred terms and reusing them makes it easier to build, iterate, and ensure that everyone—human and AI alike—is "speaking the same language."

The designers thriving in this environment share two traits: porosity and the capacity to learn. They're comfortable moving between tools, quickly adopting new interfaces, and continuously expanding their technical vocabulary without necessarily becoming engineers themselves. As Cedric demonstrates by moving seamlessly between Figma, v0, and Cursor, the future belongs to designers who can fluidly navigate an expanding toolkit.

3. AI is driving the rise of the "design engineer" role.

We’re also witnessing the dissolution of traditional boundaries between design and engineering. Cedric's workflow demonstrates this perfectly: starting in Figma, prototyping in v0, and making final adjustments directly in the codebase using Cursor.

This isn't just about efficiency—it's about fundamentally changing how products are built:

  • Closed-loop ownership: The design engineer operates across the entire stack, ensuring "a closed loop system" where design intent translates seamlessly to production. As Cedric put it, "I can contribute and add PRs to our code base directly. It's a closed-loop system: a level of control I've never been able to experience as a designer."
  • Static mockups are obsolete: The traditional linear handoff model—where designers create static mockups and throw them over the wall to engineering—is rapidly becoming obsolete. Instead, we're seeing collaborative workflows where designers share both high-fidelity Figma designs and functional v0 prototypes that demonstrate real interactions. Engineers receive not just pictures, but working code they can integrate and build upon.
  • Rapid iteration: The back-and-forth dance of design reviews and implementation fixes compresses from days to hours. Cedric can make styling adjustments directly in the codebase rather than annotating screenshots for engineers.

This shift has profound implications for team structure and hiring. The most effective teams will blend design and engineering skills, allowing both designers and engineers to go up and down the stack—contributing to code or building prototypes. Companies that embrace this hybrid model will move faster and build products with taste more easily.

4. Four AI-native design principles are emerging.

As AI applications proliferate, design principles specific to AI-powered products are beginning to take shape. Cedric outlined several key considerations that didn't exist in traditional software design.

Minimize cognitive load.

The best AI interactions feel like "talking to another person" who can subconsciously pick up on context cues. Designers should avoid excessive hand-holding or complex configuration—let the AI do the heavy lifting of understanding intent. Tools like Recall AI or Granola exemplify this by seamlessly ingesting conversational data and insights without requiring users to structure their thoughts beforehand.

Embrace non-determinism and handle interruptions.

Unlike traditional software, AI systems can produce varied outputs. Designing interfaces that gracefully handle this variability and give users appropriate controls when needed will be key. We’ve seen AI companies like OpenAI use Temporal for long-running multi-step tasks that require robust retry mechanisms for non-deterministic execution paths. Users also need the ability to stop or redirect AI processes that go off track. As Cedric noted, watching AI "derail" without recourse creates a "poor experience." Both Cursor and v0 offer checkpoint and revert mechanisms to travel back through execution trees and retry different approaches.
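As a rough sketch of this pattern (retries with checkpoints and a user-controlled stop), the loop below assumes hypothetical `generate` and `isAcceptable` callbacks; it is not Temporal's API or any tool's actual implementation:

```typescript
// Minimal sketch: retrying a non-deterministic generation step with a
// user-abortable loop. Each attempt is checkpointed so the user can
// revert to an earlier output rather than losing it.
async function generateWithRetries<T>(
  generate: () => Promise<T>,           // non-deterministic AI call (hypothetical)
  isAcceptable: (result: T) => boolean, // validation gate on the output
  maxAttempts = 3,
  signal?: AbortSignal,                 // lets the user stop a derailed run
): Promise<{ result: T; checkpoints: T[] }> {
  const checkpoints: T[] = [];
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    if (signal?.aborted) throw new Error("Stopped by user");
    const result = await generate();
    checkpoints.push(result); // keep every attempt so the user can travel back
    if (isAcceptable(result)) return { result, checkpoints };
  }
  // Surface the failure rather than silently returning a bad output.
  throw new Error(`No acceptable output after ${maxAttempts} attempts`);
}
```

The design choice worth noting: the `AbortSignal` and the checkpoint list are user-facing affordances, not just error handling, which is what separates "graceful non-determinism" from a spinner that can't be interrupted.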

Use AI that shows its work.

While AI models may be black boxes, context and reasoning should be transparent. Perplexity excels at this with comprehensive source citations, and DeepSeek was among the first to display its multi-step reasoning process. Anthropic has also done great work interpreting chain-of-thought traces to build towards transparent and explainable AI systems. Companies that want to build user trust will be transparent with chain-of-thought reasoning or source attribution.

Design for supervision, not operation.

In an agentic future, users become managers of AI systems rather than direct operators. This requires new patterns for monitoring, directing, and coordinating multiple AI agents—potentially even OS-level dashboards for agent management. Early UX patterns include notifications for background task completion (e.g., Perplexity Deep Research), parallel execution and progress bars (e.g., OpenAI Codex), and interactive user prompting with generative forms (e.g., Perplexity Comet).

These principles will continue to evolve, but teams that thoughtfully consider them today will build more intuitive and trustworthy AI experiences.

5. In the AI era, velocity is everything.

The pace of change is breathtaking. As Cedric observed, tools and best practices evolve so rapidly that "maybe after this discussion, v0 won’t be the best tool anymore." This creates a new imperative: building organizational muscle for rapid experimentation and adoption.

From our conversation, we hypothesize that the companies thriving in this environment will share common traits:

  • Permitting teams to experiment with new tools
  • Prioritizing shipping and learning over perfection
  • Building modular, API-driven architectures that can quickly incorporate new capabilities
  • Cultivating cultures where learning velocity is valued as highly as current expertise

For larger organizations, Cedric's advice is practical: start with prototyping. Even if you can't touch production code, use AI tools to create compelling prototypes that demonstrate possibilities to build organizational buy-in.

The acceleration isn't just about individual tools; it's about compound effects. When designers can prototype faster, engineers can implement faster, and teams can iterate faster—the entire product development cycle compresses dramatically.

Cedric's AI design stack


Figma

  • Still the source of truth for visual design and layout 
  • Export specific frames for AI prototyping
  • Limitations: Can't handle dynamic interactions, coordinate tracking, or state management

v0, Lovable, Bolt.new  

  • Import frames exported from Figma via Figma integrations or MCPs
  • Add dynamic interactions with natural language
  • Example: Bounding box drawing with coordinate tracking
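The bounding-box example reduces to tracking pointer coordinates between press and release. A minimal sketch of that logic, with names of our own choosing rather than v0's actual output:

```typescript
// Minimal sketch of bounding-box coordinate tracking: a drag from
// (startX, startY) to (endX, endY) is normalized into a box,
// regardless of which direction the user dragged.
interface BoundingBox {
  x: number;
  y: number;
  width: number;
  height: number;
}

function boxFromDrag(
  startX: number,
  startY: number,
  endX: number,
  endY: number,
): BoundingBox {
  return {
    x: Math.min(startX, endX),
    y: Math.min(startY, endY),
    width: Math.abs(endX - startX),
    height: Math.abs(endY - startY),
  };
}
```

In a real prototype, a function like `boxFromDrag` would be called from pointer-down and pointer-up handlers on the image overlay; the point is that a conversational prompt can now stand in for wiring all of this up by hand.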

Cursor, Windsurf

  • Direct styling adjustments ("add 16px corner radius")
  • Add dynamic logic ("number boxes sequentially")
  • Submit PRs for engineering review

shadcn/ui, Tailwind, UntitledUI, HeroUI

  • Pre-built components that AI can reference by name
  • Speak their language: "Use the shadcn/ui toast component" or "Apply Tailwind opacity-20"
  • Configure in v0's project settings for consistent code generation
  • Reduces AI hallucination by providing standard component patterns


For more tactical insights on building AI-powered products, check out the full webinar recording with Cedric Ith. And if you work at a tech company and have a unique workflow you’d like to showcase, we’d love to hear from you—reach out to bnagda@bvp.com to share your story.