How a design team ships a new onboarding with 12 participant interviews
A 6-person design team, one month to ship a new onboarding. They interviewed 12 participants with AI moderation and landed a roughly 30% relative lift in Day-1 activation.
This is an example workflow, not a real customer story. Honne is new, and real customer outcomes are just starting to land. This piece illustrates how the product fits into a real team's working week.
The setup
The design lead at a consumer productivity app runs a small org: six designers, one researcher who also runs ops, one PM who owns growth. The number that matters: Day-1 activation, the share of new signups who complete at least one meaningful action in their first twenty-four hours, sits at thirty-eight percent and has been flat for two quarters. Leadership has set a one-month deadline to redesign and ship a new onboarding. The research budget is five hundred dollars.
Five hundred dollars doesn't buy an agency study. It doesn't buy a proper moderated-interview panel. It does buy a month of Honne and enough gift cards for forty-ish participants, which is what the team decides to work with.
What they did
Week 1: twelve AI-moderated interviews. The design lead segments the recent signup list: eight users who activated successfully in their first day, four who signed up and never came back. The lapsed users are the interesting cohort, but the activated users are the control; without them there is no contrast. Honne's moderator runs fifteen-minute interviews. Each participant answers a small set of open prompts: "Tell me about the moment you decided to sign up." "Walk me through the first screen you saw." "What made you keep going, or stop?" The moderator follows up on specifics. Interviews are async, completed on each participant's own schedule, which is what makes twelve interviews in one week possible for a six-person team.
Week 2: synthesis afternoon. The whole design team, plus the PM, block a Thursday afternoon. Honne's synthesis view has auto-clustered the twelve interviews into six themes. They read the themes together, pull individual quotes to back each one, and vote — dot-vote, five dots each, on which three themes to act on. The three that win: (1) the "invite your team" screen feels premature, (2) the tutorial is universally skipped but tooltips later are well-received, (3) users don't realize they need to configure anything to get value.
Week 3: design sprint. Three days on the new onboarding mock, built around the three themes. No invite-team screen until after the first activation event. Kill the tutorial, double down on in-context tooltips. Add a one-screen configuration step the user can't skip, phrased as "set up your first project" instead of "configure your account."
Week 4: validate before shipping. A five-second test on the new flow's three key screens (forty-two participants, same-day recruitment from their beta list) and a preference test against the current flow. Seventy-six percent prefer the new flow. First-impression answers consistently identify "setting up a project" as the first thing they'd do — which is exactly what the new flow wants them to do. Ship on day twenty-eight.
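The story doesn't mention any significance testing, and n = 42 shouldn't be over-read, but it's worth noting that a 76% preference is hard to explain as noise. A back-of-envelope exact binomial check (my own sketch, not something the team ran) asks: if preferences were really a coin flip, how likely is a result this lopsided?

```python
from math import comb

# Hypothetical sanity check on the week-4 preference test: 42 participants,
# 76% preferring the new flow (~32 of 42). One-sided exact binomial tail
# under the null hypothesis that each participant flips a fair coin.
n = 42
successes = round(0.76 * n)  # 32

p_value = sum(comb(n, k) for k in range(successes, n + 1)) / 2 ** n
print(successes, p_value)  # p is well under 0.001
```

Under those assumptions the tail probability is tiny, which is consistent with the team treating the preference test as a green light rather than a coin flip.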
What they learned
The interview signal is where the decisions lived.
"The screen where it asked me to invite teammates before I'd used the product — I closed the tab there. I had no reason yet to believe anyone else should be here."
Six of twelve participants described abandoning the flow at the invite screen. The team had built it as an early step because "teams that invite teammates activate better" — which was true in the data, but the causal direction was wrong. Teams that activate invite teammates; showing the invite screen early doesn't create activation, it just filters out the not-yet-convinced.
Implication: move invite-teammates to after the first activation event. Don't ask for commitment before delivering value.
"I skipped the tutorial. I always do. But the in-app tooltips later — those I noticed."
Eleven of twelve participants said a version of this. The team had spent engineering time on a sophisticated guided tutorial. Nobody watched it. Meanwhile, the simple contextual tooltips, built from a backlog item that had never been a priority, were where people actually learned the product.
Implication: delete the tutorial. Invest the freed engineering time into more contextual tooltips. The honest answer the interviews gave was: users teach themselves if you let them, and refuse to be taught if you insist.
"I didn't realize I needed to set up anything. I thought I could just start using it."
Four of twelve participants, all from the lapsed cohort, described bouncing off because the empty state was literally empty. Nothing was broken — the product was waiting for them to create content — but there was no cue that the next move was theirs.
Implication: make the configuration step explicit and unavoidable, framed as a productive first action rather than as setup.
What they shipped
The new onboarding ships on day twenty-eight — the last Friday of the month-long deadline. The team doesn't call it a finished design; they call it the first version. They watch Day-1 activation closely over the next two weeks.
Two weeks after ship, Day-1 activation moves from thirty-eight percent to forty-nine percent: an eleven-point absolute lift, which is roughly thirty percent relative. The PM writes it up in the quarterly report as the single most impactful design shift of the quarter. Nobody on the team pretends twelve interviews make a rigorous sample; they're also, visibly, enough. The team keeps doing a twelve-interview round at the start of every major design initiative from that month forward.
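The headline "roughly thirty percent" is the relative lift, not the absolute one; the arithmetic, using the numbers from the story:

```python
# Checking the reported lift with the figures above.
before = 0.38  # Day-1 activation before the redesign
after = 0.49   # Day-1 activation two weeks after ship

absolute_lift = after - before          # eleven percentage points
relative_lift = absolute_lift / before  # ~0.289, "roughly thirty percent"

print(f"absolute: {absolute_lift * 100:.0f} points, relative: {relative_lift:.1%}")
# → absolute: 11 points, relative: 28.9%
```

Keeping the two figures distinct matters: an eleven-point lift on a thirty-eight-percent baseline is a thirty-percent story; the same eleven points on an eighty-percent baseline would be a much smaller one.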
The researcher — who had been running ops — gets her researcher title back full-time, funded by the Day-1 activation lift.