Intercom’s playbook for becoming an AI-native business
Intercom’s SVP of Engineering, Jordan Neill, breaks down their journey with five tactical insights.
At Bessemer, we’re witnessing a profound shift reshaping the SaaS landscape: established companies are not just layering on AI features but fundamentally reimagining their entire businesses to become AI-native. Yet, the tactical blueprint for this transformation is often elusive.
Intercom stands out as a rare example of navigating this shift with clarity and speed. Before generative AI became mainstream, the company was already experimenting with AI-driven customer support back in 2018. In under two years, the customer support leader transformed itself from a SaaS incumbent into a breakout AI company, with its AI agent, Fin, on track to hit $100 million ARR.
Understanding how to move from legacy SaaS to an AI-first enterprise is the defining challenge — and opportunity — of our time. That’s why we hosted Intercom’s SVP of Engineering, Jordan Neill, to discuss the lessons from their journey and gathered the most important takeaways for founders and executives leading their own AI transformations.
Main takeaways from Intercom’s playbook for AI founders and engineering leaders:
- Design for speed and depth — Centralize AI talent. Organize around mission-critical workstreams with clear ownership.
- Re-architect your foundations — AI isn’t an add-on. Make your codebase AI-native to scale faster and smarter.
- Build ahead of the curve and prove readiness — Prototype internally, validate ruthlessly, and open up early to shape the future with real users.
- Price for impact — Charge for outcomes, not seats. Move from vendor to true partner.
- Hire adaptable builders — Empower product-minded generalists and design engineers. Back them with deep specialists where it counts.
1. Redesign your operating model by centralizing R&D with single-threaded ownership.
Intercom’s transformation began with a major top-down commitment: just days after ChatGPT’s release, the CEO announced that Intercom would become an AI-first company. This decisive move made it clear that the company’s traditional, functionally siloed org structure was no longer fit for purpose. To keep pace with the rapidly evolving landscape, Intercom redesigned its operating model around two key principles:
1. Centralized, high-impact teams
The company centralized critical functions, most notably its AI team. This group expanded rapidly, growing from fewer than 10 to nearly 50 machine learning (ML) researchers and scientists. Rather than dispersing these experts across product teams, Intercom kept them together to foster a culture of deep experimentation and learning. Jordan Neill described the AI group’s success as measured less by “what did you ship this week” and more by “what did you learn this week.” This approach relieves the team of the pressure of constant shipping and allows them to pursue fundamental advances, avoiding the trap of local maxima.
2. Mission-driven, cross-functional workstreams
To execute at speed, Intercom introduced "workstreams": small, dedicated, startup-like teams of 10-15 people pulled from engineering, ML, sales, marketing, and more across the company. Each workstream is assigned a Directly Responsible Individual (DRI), a single owner from any discipline who is given the autonomy to drive a specific project for weeks or months. This replaces diffuse ownership and ensures focus. Today, Jordan notes, "almost everybody in R&D at Intercom is working on Fin." It’s an intense model, but one they believe is essential to keep pace with the market.
2. Re-architect your codebase for AI-native development — treat it as existential, not optional.
A frequent objection from established companies is that they’re simply too busy to re-architect their codebase for AI. Intercom’s experience makes it clear: this is a false economy. To unlock the true potential of AI, you can’t just layer it on top of legacy systems — you must fundamentally rebuild your foundation to be AI-native.
Intercom’s CTO launched a bold "2X initiative" aiming to double R&D output, measured in pull requests shipped. This wasn't just about adopting tools like GitHub Copilot or Cursor. It demanded deep, structural changes to both the tech stack and development workflows. For instance, the team is in the process of migrating its entire front end from Ember.js to React — not for the sake of trend-chasing, but because, as Jordan put it, “AI is much better at writing React code than Ember code.” The team is also adapting backend systems and workflows to maximize the leverage AI can provide.
The move to overhaul core parts of the product underscores the most critical, and perhaps most difficult, lesson for companies. Being too busy shipping to tackle technical debt is a trap — Intercom proves that rebuilding now unlocks efficiency later. There’s intrinsic value in slowing down when it buys greater speed later. Investing in foundational changes to make your tech stack AI-native isn't a distraction from the roadmap — it is the roadmap for staying competitive. When development velocity is a key determinant of success, a codebase that AI can't efficiently read, write, and refactor becomes a liability, not an asset.
Intercom is even deploying autonomous coding agents that proactively submit pull requests for tasks like removing dead feature flags. This work, Jordan notes, “is similar to the same practices you would do to adjust the code for a new hire.” The key difference is that the leverage you gain from AI is now much higher, making the ROI on this foundational work undeniable.
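To make the idea concrete, here is a minimal sketch of the kind of analysis such a cleanup agent might run before opening a pull request. Everything here is illustrative, not Intercom's actual tooling: `feature_enabled` is a hypothetical flag-check helper, and `DEAD_FLAGS` stands in for a registry of flags that are fully rolled out and safe to remove.

```python
import re

# Hypothetical registry of flags that are fully rolled out ("dead") and safe to remove.
DEAD_FLAGS = {"new_inbox_ui", "fin_beta_banner"}

# Matches calls like feature_enabled("flag_name") -- a hypothetical helper name.
FLAG_CALL = re.compile(r'feature_enabled\(\s*"([^"]+)"\s*\)')

def find_dead_flag_usages(source: str):
    """Return (line_number, flag) pairs for every reference to a dead flag."""
    hits = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for match in FLAG_CALL.finditer(line):
            if match.group(1) in DEAD_FLAGS:
                hits.append((lineno, match.group(1)))
    return hits

code = '''
if feature_enabled("new_inbox_ui"):
    render_new_inbox()
if feature_enabled("smart_replies"):
    suggest_replies()
'''

# "smart_replies" is still live, so only the dead "new_inbox_ui" check is reported.
print(find_dead_flag_usages(code))  # -> [(2, 'new_inbox_ui')]
```

A real agent would go further, rewriting the guarded code and submitting the diff for review, but the detection step above is where the "onboarding a new hire" analogy starts: tidy, well-factored flag usage is what makes the automated rewrite tractable.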
3. Develop ahead of capabilities, then hit the gas when ready.
In a rapidly evolving landscape, building for the AI of today is a losing strategy. Intercom’s success with Fin was driven by a forward-thinking approach, developing ahead of the technological curve while rigorously validating readiness before exposing customers to new features.
Build your "taste tester" first.
Intercom built a sophisticated internal evaluation framework before Fin was customer-ready. This "machine for building the machine" included backtesting against historical data, simulating user behavior, and conducting large-scale A/B tests to validate every change against key metrics like resolution rate, customer satisfaction, and hallucination frequency. A process this rigorous proved that GPT-4 was mature enough, giving Intercom the confidence to be a launch partner.
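The shape of such a "taste tester" can be sketched in a few lines. This is a toy illustration under stated assumptions, not Intercom's framework: the metric names, the `EvalResult` schema, and the go/no-go thresholds are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class EvalResult:
    resolved: bool       # did the agent resolve the conversation autonomously?
    hallucinated: bool   # did it assert something unsupported by its sources?

def backtest(agent, conversations):
    """Replay historical conversations through a candidate agent and aggregate metrics."""
    results = [agent(c) for c in conversations]
    n = len(results)
    return {
        "resolution_rate": sum(r.resolved for r in results) / n,
        "hallucination_rate": sum(r.hallucinated for r in results) / n,
    }

def ship_gate(metrics, min_resolution=0.5, max_hallucination=0.02):
    """Go/no-go check before exposing a model change to customers (thresholds illustrative)."""
    return (metrics["resolution_rate"] >= min_resolution
            and metrics["hallucination_rate"] <= max_hallucination)

# Toy agent: handles password questions, never hallucinates.
toy_agent = lambda conv: EvalResult(resolved="password" in conv.lower(), hallucinated=False)
history = ["How do I reset my password?", "Refund my March invoice"]

metrics = backtest(toy_agent, history)
print(metrics, ship_gate(metrics))
```

The value of a harness like this is that every candidate change, from a prompt tweak to a new base model, gets scored against the same historical data before any customer sees it.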
De-risk with internal prototypes.
Internal prototyping has kept Intercom ahead of the curve without exposing its customers to unstable technology. To assess cutting-edge models and ideas, Jordan emphasized, "We build and experiment with it ourselves. It then becomes very obvious when the technology isn’t ready." Their own support team is often their first alpha customer, offering a powerful litmus test: if the team wouldn’t trust the AI in their own workflows, it simply shouldn’t go live.
"Build in public" to engage early adopters and shape the product.
Once a prototype shows promise internally, Intercom then shifts to a "build in public" strategy for rapid market feedback. For example, they’ve adapted their internal Friday show-and-tell sessions into public-facing video content for the fin.ai blog. This isn't about releasing a half-baked product, but about signaling direction, generating excitement, and attracting a motivated group of early adopters and design partners to help shape the future of the product, even when it’s still a work in progress.
4. Align pricing and value to transform the commercial relationship.
AI doesn't just reshape your product; it redefines how you create, deliver, and capture value for your customers. Intercom’s transformation into an AI-native business was matched with a radical overhaul of its commercial model by ensuring that pricing, incentives, and customer success were tightly aligned. The company did this in two key ways:
1. Pricing the outcome, not the tool.
Intercom broke from the traditional SaaS playbook by pioneering an outcome-based model for Fin. Instead of charging per user or seat, Intercom bills customers only when Fin autonomously resolves a conversation: 99 cents per successful resolution. As Jordan pointed out, “If Fin has to escalate to a human, you don't pay.” This was more than a pricing tweak — it was a strategic lever that changed internal behaviors and customer relationships. With outcome-based pricing, every team member from product to sales now has a North Star: resolved conversations. This pricing-for-outcomes approach has influenced the broader industry, with competitors like Salesforce and Zendesk announcing similar models after Fin’s.
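The arithmetic makes the incentive shift obvious. This sketch uses the $0.99-per-resolution figure from the text; the seat price and conversation volumes are illustrative assumptions, not Intercom's or anyone else's real numbers.

```python
# Outcome-based vs seat-based billing. PER_RESOLUTION is Fin's published
# $0.99 figure; the seat price and volumes below are illustrative only.
PER_RESOLUTION = 0.99

def outcome_bill(resolutions: int) -> float:
    """Customer pays only for conversations the AI resolved autonomously."""
    return round(resolutions * PER_RESOLUTION, 2)

def seat_bill(seats: int, per_seat: float = 39.0) -> float:
    """Traditional SaaS billing: the cost is fixed regardless of outcomes."""
    return round(seats * per_seat, 2)

# Suppose 1,000 conversations come in and the AI resolves 650; the rest
# escalate to humans and cost the customer nothing.
print(outcome_bill(650))  # 643.5
print(seat_bill(10))      # 390.0 -- same bill whether the tool helped or not
```

Under the seat model the vendor is paid either way; under the outcome model, revenue only grows when resolutions grow, which is exactly why "resolved conversations" becomes the shared North Star.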
2. Shifting from selling to partnering.
The shift to outcome-based pricing required Intercom to rethink its GTM strategy. Jordan likened traditional SaaS sales strategy to "selling a car" — a transactional exchange that leaves more risk in the customer’s hands — versus selling AI, which is more like "teaching somebody to drive" and requires a more hands-on, consultative partnership. Because of this, Intercom introduced forward-deployed engineers who work on-site with customers to ensure successful setup and adoption. This approach recognizes that deploying AI isn’t like flipping a switch; it’s about guiding customers through change management.
5. Hire for versatility, empowering the generalist and design engineer.
The evolution towards AI-native work has changed what it means to build great teams. It’s actively breaking down the rigid silos between disciplines, placing a premium on versatile, product-minded individuals who can own problems end-to-end. Intercom’s hiring philosophy aims to find and empower this talent through three main principles:
1. Prioritize the product-minded generalist — Intercom has long centered on hiring "product engineers" who aren’t just technically astute, but also deeply motivated by customer and business impact. In this new paradigm, these generalists are thriving. They’re empowered to work directly with customers, identify problems, and deliver solutions without waiting for direction.
2. Embrace the "design engineer" — As we discovered from unpacking the evolution of full-stack design, the emergence of the "design engineer" is leading a tactical shift. At Intercom, designers are no longer just creating mockups — they’re writing and committing production code. This hybrid role eliminates the bottlenecks of handoffs and ticketing, empowering designers to fix issues themselves as soon as they spot them.
3. Balance generalists with deep specialists — While Intercom’s product teams are built around versatile generalists, the core AI group is intentionally specialized. The company seeks out a specific profile: ML PhD researchers and scientists who also have experience shipping products at scale. This dual approach — empowering a broad base of generalists while fueling them with a central brain trust of world-class specialists — ensures that the foundational technology keeps advancing.
The pattern across all of Intercom’s changes is clear: successful AI transformation requires short-term disruption for long-term advancement. Their experience overall demonstrates that the SaaS leaders who will thrive are those willing to fundamentally rethink their business for AI.
For more insights on building AI-powered products, subscribe to Atlas. And if you're going through a similar transformation, we'd love to hear from you — reach out to bnagda@bvp.com to share your story.