New Relic’s Head of Growth Marketing had a simple observation: HubSpot could monitor website performance for marketing teams. Why couldn’t New Relic’s Synthetics do the same job?
His hypothesis was that marketing managers controlled larger tooling budgets than engineering teams but were completely locked out of New Relic’s products by complexity they had no reason to learn. The existing onboarding flows converted at 15-20%, largely because they were built for engineers and assumed a baseline of technical fluency that most marketers didn’t have.
He engaged me to validate the hypothesis and build something.
We partnered with Growth Marketing’s user research team and interviewed marketing managers. What came back was clear: they cared deeply about website performance — it affects SEO, conversion, and brand perception — but existing monitoring tools made them feel like they were in the wrong room. They didn’t want dashboards. They wanted answers.
The product problem wasn’t capability. New Relic’s Synthetics was already mature and powerful. The barrier was presentation. The ceiling on growth wasn’t functionality — it was UX.
Rather than build from scratch, I wrapped Synthetics in a dramatically simplified UI. Three clicks from signup to monitoring. Working with a designer, we built an experience that contextualized page speed data in terms marketers actually cared about — drop-off rates, attention metrics, specific guidance on what to fix and why. I worked with Product Language to ensure the entire customer journey used accessible, outcome-focused terminology.
The Senior Director of Engineering had worked with me on the Instant Observability teams. His engineers had given positive feedback on that collaboration, so when this project came up, he volunteered his tiger team in Hyderabad. We made it work across a 12-hour time-zone gap — some late nights on both sides — and shipped v1.
70% signup conversion, against 15-20% for every other New Relic entry point. A 3.5-4.7x improvement with zero new underlying engineering — Synthetics was already there.
The product proved that non-technical users would adopt observability tools given the right framing. Marketing managers were a real market, and the data said so clearly.
One week before v2 shipped, the CPO canceled the project. He looked at the conversion data, the customer research, the uptake numbers, and said “I don’t buy it.” The product sat on the shelf.
This case study is included not just for the 70% conversion rate, but for what it demonstrates about the gap between validated customer evidence and executive decision-making. Knowing that gap exists — and knowing how to qualify whether a buyer will actually act on evidence — is part of how I screen engagements now.