Time-to-Value Isn’t a Product Metric. It’s a Job Metric.
Time-to-value is having a moment. Everyone's measuring it. Everyone's "driving it down." Everyone's got a chart that proves it's getting better. And a weird thing keeps happening: the chart improves while the business doesn't.
Adoption doesn't meaningfully change. Retention doesn't lift. Support tickets keep coming. Sales keeps hearing some version of "we couldn't get it to click" from accounts that looked perfect on paper. The mismatch isn't because time-to-value is a bad metric. It's because most teams are timing the product, not the job.
Setup Completion Is Not the Same as Making Progress
When teams say "time-to-value," they usually mean time from signup to setup completion. Time until the checklist hits 100%. Time until the user has clicked the right buttons in the right order and the product declares them "activated." But setup completion and making tangible progress toward the job are two different events—and in the best products, they happen at the same time.
The whole point of good onboarding is to give the user a taste of value as early as possible. Canva doesn't make you configure a brand kit or set up a team workspace before you design. You open the app, pick a template, and change the text. You have a professional-grade graphic in sixty seconds. Setup and progress happen simultaneously—and that's why it works.
The problem is when teams measure setup completion and treat it as proof that value was delivered. "Created a workspace" doesn't necessarily mean the user made progress toward the job. "Connected a data source" doesn't mean they got the insight they came for. "Invited a teammate" doesn't mean they felt the coordination improve.
Those are steps. They might be necessary steps. But measuring them and calling it "time-to-value" only works if those steps actually produce a moment where the user feels closer to the outcome they hired the product for. If your metric stops the clock at setup completion, and setup completion doesn't coincide with a moment of job progress, you've built time-to-product—not time-to-value.
The question to ask is: does our onboarding deliver a taste of progress on the job? And is the event we're timing the moment that happens?
What This Looks Like in Practice
Monday.com's onboarding can get a user to a configured board in under five minutes. Create a workspace. Name a project. Add columns. Invite teammates. The setup checklist completes. Time-to-value: five minutes. The chart looks great. But nobody hired Monday.com to create a board. They hired it because deadlines were slipping and nobody could see what was blocked. The job is something like: "know what's at risk before it surprises me."
If the onboarding is well-designed, the user should get a taste of that during setup, not after. Maybe the board auto-populates with sample data so they can see what "visibility into what's stuck" actually looks like before they've entered a single real task. Maybe importing from their existing tool happens as step one so there's real data to work with immediately. Maybe the first thing they see after creating the board is a view that shows status at a glance—not an empty grid.
The metric question is: are you timing "board created" (setup completion) or "user can see what's at risk without asking anyone" (progress toward the job)? The first is easy to instrument. The second is the one that predicts whether anyone comes back. And the gap between them tells you whether your onboarding is delivering a taste of progress or just running users through configuration.
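To make the instrumentation difference concrete, here is a minimal sketch. The event names (`board_created`, `status_viewed`) and the log shape are hypothetical illustrations, not real Monday.com telemetry; the point is that the two metrics share a log and differ only in which event stops the clock.

```python
from datetime import datetime

# Hypothetical event log: (user_id, event_name, timestamp).
events = [
    ("u1", "signup",        datetime(2024, 5, 1, 9, 0)),
    ("u1", "board_created", datetime(2024, 5, 1, 9, 5)),   # setup completion
    ("u1", "status_viewed", datetime(2024, 5, 3, 14, 0)),  # first job progress
]

def first_ts(events, user_id, name):
    """Timestamp of the user's first occurrence of an event."""
    return min(ts for uid, n, ts in events if uid == user_id and n == name)

def minutes_between(events, user_id, start, stop):
    """Minutes from the user's first `start` event to their first `stop` event."""
    delta = first_ts(events, user_id, stop) - first_ts(events, user_id, start)
    return delta.total_seconds() / 60

# Same log, two very different numbers -- the only change is the stop event.
time_to_product = minutes_between(events, "u1", "signup", "board_created")
time_to_value   = minutes_between(events, "u1", "signup", "status_viewed")
print(time_to_product)  # 5.0 (minutes)
print(time_to_value)    # 3180.0 (minutes -- more than two days later)
```

The five-minute chart and the two-day reality come from the same user; a dashboard built on the first event will look healthy while the second tells the truth.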
The Product Clock Is Seductive
Timing onboarding completion is easy. Onboarding has events. Events have timestamps. Timestamps have dashboards. You can make the line go down by shaving steps, tightening copy, auto-filling fields, removing friction, making the checklist feel smoother. You can "improve time-to-value" without ever touching the messy part: whether the user can actually get to the outcome they need.
And it changes what the organization optimizes for. When you time onboarding completion, you reward the team for getting people through a maze. You incentivize speed through a corridor, not arrival at a destination. You design for completion behavior instead of progress behavior, and then you're surprised when people complete the steps and still don't stick.
When onboarding is nothing but setup—configure this, connect that, invite them—without weaving in moments of progress toward the job, every step drains the user's motivation. Effort is a form of trust. When a user connects their data, invites their team, and configures their workspace, they're making a bet: I'll invest now because I believe you're going to pay me back with progress.
If they get through all that setup and still haven't felt the product helping them make progress toward the job, you've converted their effort into disappointment. You taught them, early, that investing in your product doesn't pay off. That's a trust problem. And the metric should be catching it—by timing the moment of progress, not the moment of setup completion.
Time This (Not That)
The clock should stop at the first moment the user can credibly say: "This is working. I'm less stuck than I was before." Think of it as the first moment worthy of celebration. Not "I'm set up." Not "I understand the UI." Not "I completed the checklist."
The first real perceived taste of progress on the job is the only value event worth timing. And finding it requires you to work backward from the job, not forward from the product. Start with the situation that triggered the hire. A team doesn't adopt a project tool because they're "collaborative professionals." They adopt because deadlines are slipping and nobody can see what's blocked. A company doesn't buy analytics because they "value data-driven decisions." They buy because they're making expensive decisions in a fog.
Name the situation. Name the job as progress under constraints—what they're trying to accomplish and what they need to avoid while doing it. Then find the earliest credible marker: the first sign that the user is less stuck.
- For Monday.com, it's not "board created." It's "the user can see real status without asking anyone."
- For Canva, it's not "account created." It's "the user sees their idea rendered in a professional layout."
- For an analytics product, it's not "data source connected." It's "the user answered the question that's been driving bad decisions."
- For an automation tool, it's not "workflow built." It's "the annoying manual thing didn't happen today."
Those are job-native value events. They map to the user's life, not your feature set.
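One way to keep a team honest about that mapping is to write it down as data, so dashboards can't quietly drift back to setup events. A minimal sketch of the pairs above; every event name here is a hypothetical label, not a real product's instrumentation:

```python
# Setup-completion event vs. job-native value event, per product type.
# All event names are illustrative placeholders.
VALUE_EVENTS = {
    "project_tool": {"setup": "board_created",    "value": "real_status_visible"},
    "design_tool":  {"setup": "account_created",  "value": "design_rendered"},
    "analytics":    {"setup": "source_connected", "value": "question_answered"},
    "automation":   {"setup": "workflow_built",   "value": "manual_step_avoided"},
}

def clock_stopper(product_type):
    """The event that should stop the time-to-value clock."""
    return VALUE_EVENTS[product_type]["value"]

print(clock_stopper("analytics"))  # question_answered
```

The left column is what's easy to instrument; the right column is what predicts retention. Reviewing new metrics against a table like this catches "time-to-product" dashboards before they ship.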
When the Gap Is Too Big, the Onboarding Is Broken
If the gap between setup completion and first taste of value is days or weeks, that's not simply a number to report. It's a problem to solve. A user who completes setup on Monday and doesn't make progress toward the job until the following week is a user whose motivation is bleeding out. They showed up with urgency—something was broken in their world. Every day between "I'm set up" and "this is helping" is a day that urgency fades and the old way reasserts itself.
The fix isn't to accept a longer number. It's to redesign the onboarding so progress happens during setup, not after it. What's the smallest slice of the job you can deliver before the user has finished configuring? Can you show them what the outcome looks like with sample data? Can you import their existing information first so there's something real to work with immediately? Can you get them to one moment of "oh, this is going to help" before you ask them to invest further?
That's the design challenge. And the metric should be measuring whether you're winning it—not tracking how long users wait for value to finally arrive.
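The gap itself is measurable. A sketch that flags users whose setup-to-value gap has blown past a budget; the one-day threshold, the event names, and the log shape are assumptions to tune per product:

```python
from datetime import datetime, timedelta

# Hypothetical per-user timestamps for the two events.
users = {
    "u1": {"setup": datetime(2024, 5, 1, 9, 5),  "value": datetime(2024, 5, 1, 9, 12)},
    "u2": {"setup": datetime(2024, 5, 1, 10, 0), "value": datetime(2024, 5, 9, 16, 0)},
    "u3": {"setup": datetime(2024, 5, 2, 8, 0),  "value": None},  # never got there
}

GAP_BUDGET = timedelta(days=1)  # assumed threshold; tune per product

def onboarding_gaps(users, budget=GAP_BUDGET):
    """Users whose setup-to-value gap exceeds the budget.
    A user who never reached the value event counts as broken too."""
    broken = []
    for uid, ts in users.items():
        if ts["value"] is None or ts["value"] - ts["setup"] > budget:
            broken.append(uid)
    return broken

# u2's gap is over a week; u3 never felt value at all.
print(onboarding_gaps(users))  # ['u2', 'u3']
```

Treating the `None` case as broken matters: the users who complete setup and never reach the value event are exactly the ones a setup-completion metric declares successful.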