Reducing time-to-value from 14 days to under 10 minutes

2024 · Layering Product-Led Growth onto a Sales-Led Machine

PLG · Self-serve onboarding · Activation design

Key Impact

Time to Value

New customers moved from 3–14 days to discovering product value in under 10 minutes — without a sales call.

Activation Rate

Customers who exported a campaign within their first 7 days grew from under 1% to 27% of self-serve signups.

Product-Qualified Leads

Self-serve signups fed a new PQL pipeline back to sales — doubling the volume of product-qualified leads.

Business Context

LeadGenius was entirely sales-led. Every new customer had to book a demo, sit through a sales cycle, and get manually onboarded before seeing any product value. That process took 3 to 14 days.

The number that made the case: less than 2% of website traffic converted to demo signups. The SMB segment had real buying intent but couldn't access the product at all.

My question: could we layer a self-serve motion on top of the existing sales model without cannibalising it? Let users try the product immediately, while feeding qualified behavioural signals back to sales.

The Problem

The website had one path: book a demo. No alternatives.

I audited the full acquisition funnel and mapped where intent died:

  • Site visit to demo page: steep drop-off
  • Demo page to demo booked: under 2% conversion. Scheduling a call was killing intent.
  • Demo to activated customer: 3 to 14 days. By the time users saw value, many had already evaluated competitors.

The product wasn't losing to better features. It was losing to faster access.

Key Insights

  • Access was the barrier, not awareness. Traffic was healthy. Interest existed. But every path to the product required a human intermediary and a multi-day cycle.
  • Sales weren't the enemy — they were the blueprint. Sales and CS were spending hours rescuing accounts that should have been self-serve. The questions they answered and steps they walked users through became the design spec for self-serve onboarding.
  • Time-to-value needed a precise definition. Before designing anything, I worked with the PM and data team to define the activation signal: a user creates a campaign within 7 days of signup and exports data. Without that definition, we'd have been optimising for signups, not value.
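That activation definition is concrete enough to compute directly from an event log. A minimal sketch, assuming a hypothetical per-user event stream (the event names and schema here are illustrative, not LeadGenius's actual instrumentation):

```python
from datetime import datetime, timedelta

# Hypothetical event log: (user_id, event_name, timestamp).
events = [
    ("u1", "signup",           datetime(2024, 3, 1)),
    ("u1", "campaign_created", datetime(2024, 3, 3)),
    ("u1", "data_exported",    datetime(2024, 3, 4)),
    ("u2", "signup",           datetime(2024, 3, 1)),
    ("u2", "campaign_created", datetime(2024, 3, 12)),  # outside the 7-day window
]

def is_activated(user_id, events, window_days=7):
    """Activated = created a campaign AND exported data within
    `window_days` of signup."""
    by_name = {}
    for uid, name, ts in events:
        if uid == user_id:
            by_name.setdefault(name, []).append(ts)
    if "signup" not in by_name:
        return False
    signup = min(by_name["signup"])
    deadline = signup + timedelta(days=window_days)

    def did(name):
        return any(signup <= ts <= deadline for ts in by_name.get(name, []))

    return did("campaign_created") and did("data_exported")

users = {uid for uid, _, _ in events}
activation_rate = sum(is_activated(u, events) for u in users) / len(users)
```

Pinning the metric to code like this keeps the team honest: a user who signs up but never exports counts as zero, no matter how many sessions they log.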

Approach

Strategy

One principle guided every decision: earn the right to ask, before you ask.

The sales model worked because a human guided users through decisions they weren't ready to make alone. Self-serve had to replicate that guidance without the human. The product needed to show it understood the user's goal before requesting any configuration.

That one principle shaped the information architecture (intent-first, not settings-first), the interaction model (progressive disclosure by role), the content approach (inline guidance replacing assumed expertise), and how we sequenced experiments.

Constraints

  • Leadership treated PLG as an experiment, not a strategy: worth exploring, not worth resourcing. I was working with shared engineering, no dedicated growth headcount, and a mandate to prove the model before earning more investment.
  • The sales-led model couldn't break. Enterprise accounts still needed the high-touch path. Every change had to coexist with existing account manager workflows.
  • No dedicated research team. I ran all research myself (funnel audits, Hotjar reviews, CS ticket analysis, live sessions with sales) while designing and shipping in parallel.

Solutions

First, we redesigned the website to introduce a freemium signup path alongside the existing demo CTA, removing the hard gate that had blocked self-serve access entirely.

[Image: Website redesign comparison, before and after]

Experiment 1

With the new entry point in place, I moved to onboarding.

Hypothesis: If we reduce signup friction and get users into the product fast, activation will follow. A minimal flow with fewer steps and a lighter configuration will reduce time-to-product and increase activation.

I pushed for this approach. We built a stripped-back onboarding flow: light on questions, light on setup, designed to get users to the dashboard in under two minutes.

Experiment Design
Control (Weeks 1–4)

  • Website has "Book a Demo" CTA only
  • No self-serve signup option
  • All activation is sales-led
  • Baseline activation rate: ~19% of demo bookings

Treatment (Weeks 5–8)

  • Added "Get Started for Free" CTA to the website
  • One-click signup: email and password only
  • Users land directly in the product with zero configuration
  • No sales involvement until the user takes action

What we measured: Total signups via the new self-serve path, self-serve activation rate, impact on demo booking rate (cannibalization check), and total activated users across both paths.
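Those four measurements reduce to a simple comparison over per-period funnel counts. A sketch with illustrative numbers (not the real data), showing the activation-rate and cannibalization calculations:

```python
# Illustrative funnel counts for the control and treatment periods.
control = {"demo_bookings": 100, "demo_activated": 19,
           "selfserve_signups": 0, "selfserve_activated": 0}
treatment = {"demo_bookings": 98, "demo_activated": 19,
             "selfserve_signups": 400, "selfserve_activated": 4}

def summarize(period):
    """Self-serve activation rate plus total activated across both paths."""
    signups = period["selfserve_signups"]
    return {
        "selfserve_activation_rate":
            period["selfserve_activated"] / signups if signups else 0.0,
        "total_activated":
            period["demo_activated"] + period["selfserve_activated"],
    }

summary = summarize(treatment)

# Cannibalization check: did demo bookings drop once the free path existed?
cannibalization = 1 - treatment["demo_bookings"] / control["demo_bookings"]
```

With these made-up numbers the self-serve activation rate comes out at 1% while demo bookings barely move, which is the shape of result the experiment actually produced.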

Result: Signups increased, but less than 1% of users did anything meaningful after landing in the product. They got in and got lost.

Learning: A frictionless door into a complex product just moved the confusion from signup to the dashboard. Speed to product wasn't the problem. Users had no idea what to do once they arrived. Reducing friction is a tactic, not a strategy; it only works when users already understand where they're going.

Experiment 2

I went back to the sales team and reviewed how successful customers actually used the product. We needed a personalised onboarding that asked upfront questions to understand intent, then delivered a dashboard experience tailored to each user's job-to-be-done.

Hypothesis: If we personalise onboarding based on role and intent, surfacing only the steps relevant to each user's job-to-be-done, activation will increase because users will reach their specific aha moment faster.

Intent-first, not configuration-first. The old flow opened with settings. The new flow opened with a question: What are you trying to do? The answer determined which steps appeared, what defaults were pre-filled, and what guidance surfaced.

Role-specific paths. A BDR landed on a dashboard pointing directly toward their first campaign. A sales executive was guided toward CRM connection and team-level features.

Personalised checklists over blank dashboards. Each user got a contextual checklist tailored to their job-to-be-done, replacing the empty state that killed Experiment 1. Users always knew what to do next and why it mattered.
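The intent-to-checklist mapping can be sketched as a small lookup. The intent keys, checklist steps, and landing targets below are hypothetical stand-ins for the real flow:

```python
# Hypothetical mapping from the "What are you trying to do?" answer
# to a role-specific checklist and pre-filled defaults.
ONBOARDING_PATHS = {
    "find_leads": {   # e.g. a BDR
        "checklist": ["Pick a target segment",
                      "Create your first campaign",
                      "Export your leads"],
        "defaults": {"landing": "campaign_builder"},
    },
    "manage_team": {  # e.g. a sales executive
        "checklist": ["Connect your CRM",
                      "Invite teammates",
                      "Review the team dashboard"],
        "defaults": {"landing": "crm_connect"},
    },
}

def onboarding_for(intent):
    # Fall back to the lead-finding path when intent is unknown,
    # so no user ever sees a blank dashboard.
    return ONBOARDING_PATHS.get(intent, ONBOARDING_PATHS["find_leads"])
```

The design point is the fallback: even an unclassified user gets a contextual next step rather than the empty state that killed Experiment 1.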

Results

Sales and CS reported a drop in rescue calls. Users who previously needed account manager walkthroughs were completing onboarding and creating campaigns on their own.

| Metric | Sales-led / Failed (Exp 1) | Personalised (Exp 2) | Change |
| --- | --- | --- | --- |
| Sales-led activation rate | 19.1% | 19.1% | Maintained |
| Self-serve activation rate | 0.9% | 26.9% | +2,889% |
| Time to first campaign | 3.2 days | 8 mins | -99.8% |
| Onboarding completion | 100% / 2% | 100% / 70% | +3,400% |
| Total activated users (vs baseline) | 200 (+8%) | 1,401 (+653%) | +600% |
| Product-qualified leads per week | 6 / 6.5 | 30 | +362% |

The PLG motion started as an experiment. It proved itself with activation and pipeline data, and built the internal case for deeper investment.

Business Impact

653% increase in total activated users

186 → 1,401/month. Self-serve achieved 27% activation rate, adding 1,029 net new users without cannibalising sales.

New customers reached first campaign in 8 minutes

Down from 3+ days. Onboarding completion jumped from ~2% to 70%.

8x increase in product-qualified leads

6 → 48/week. Sales-led activation doubled as the team scaled (186 → 372/month).

Learnings

Build a growth model upfront to forecast expected impact. We sequenced experiments by instinct and urgency. Building a growth model first would have let us prioritise by compound effect, not gut feeling.
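Such a model doesn't need to be elaborate. Even a multiplicative funnel with estimated "achievable" rates per experiment would have ranked the options; the rates and experiment names below are invented for illustration:

```python
# Hypothetical baseline funnel rates and monthly traffic.
funnel = {"visit_to_signup": 0.02, "signup_to_activate": 0.01,
          "activate_to_paid": 0.15}
visitors = 50_000

# Estimated achievable rate per candidate experiment (made-up numbers).
candidates = {
    "freemium_cta":            {"visit_to_signup": 0.06},
    "personalised_onboarding": {"signup_to_activate": 0.27},
    "in_product_upgrade":      {"activate_to_paid": 0.20},
}

def monthly_paid(rates):
    """Push monthly visitors through the multiplicative funnel."""
    n = visitors
    for r in rates.values():
        n *= r
    return n

baseline = monthly_paid(funnel)

# Rank experiments by projected output if their rate estimate holds.
ranked = sorted(
    candidates,
    key=lambda name: monthly_paid({**funnel, **candidates[name]}),
    reverse=True,
)
```

Even this toy version surfaces the insight the team found the hard way: activation was the step with the most headroom, so it should have been ranked first deliberately rather than reached by instinct.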

Run activation and monetisation experiments in parallel. We treated the funnel as stages to fix in order. Testing a conversion moment alongside activation would have shown us where the real leverage was, faster.

Earn the right to ask, before you ask. The product was demanding configuration before users had any reason to trust those decisions mattered. Reversing the order was the difference between under 1% and 27% activation.

Ship the wrong experiment anyway. The frictionless V1 failed fast and made the JTBD case stronger than any upfront analysis could have. The failure wasn't wasted time. It was the fastest path to the right answer.