JTBD’s Bad Reputation

Jobs To Be Done has been pronounced dead more times than I can count. VCs dismiss it. Product managers roll their eyes at it. Sriram Krishnan of a16z once said "no successful company was ever built on JTBD."

I’m a huge proponent of JTBD. But even I have to admit, the naysayers aren’t entirely wrong.

The core insight—that people "hire" products to make progress in their lives, not to own features—remains one of the most clarifying ideas in product thinking.

But guru and author Clayton Christensen handed the world a telescope, and consultants immediately got to work turning it into a microscope. They added steps. They created certifications. They invented proprietary terminology. They turned "understand what your customers actually want" into a six-month engagement with deliverables.

The telescope lets you see the big picture. The microscope slices and dices and renders the components of JTBD unrecognizable and unusable.

What we have now isn't JTBD. It's the corpse of JTBD, reanimated as billable hours.

And yet. As AI reshapes how humans interact with technology, from commanding interfaces to delegating intent, the original insight becomes more important than it's ever been. Yes, the bloated version of JTBD deserves skepticism. But in the AI gold rush, the original lens is essential. When everyone can add a copilot, generate a workflow, or automate a task, the advantage shifts to products that understand the job behind the prompt.

The question isn't whether JTBD is valuable; it's whether we can separate the gold from the garbage. That's the question this article tries to answer. Spoiler alert: you most definitely can.

Each article in this category unpacks a foundational JTBD concept — the forces that run in every hiring decision, the hire/fire metaphor, how switching actually works, why the core insight matters more now with AI than it ever did before, and more.

But none of it is going to be actionable or useful unless you can tell the difference between the lens and the consulting circus that grew up around it. That's what this piece is for.

The Objections Are Valid. The Conclusions Aren't.

"JTBD Is Too Abstract"

The standard complaint: teams emerge from workshops with statements like "Users want to feel confident in their decisions." True, but useless.

But here's the thing. The abstraction isn't a flaw in the lens. It's a flaw in how practitioners focus it.

JTBD was never meant to give you all the answers. It was meant to reframe the questions you ask.

When you look through a JTBD lens, you stop asking "What features should we build?" and start asking "What's the struggling moment that triggers someone to look for a solution?" You stop asking "Who is our target user?" and start asking "Who is desperate enough to switch?"

In AI products, that reframing gets sharper. You stop asking "Where can we add AI?" and start asking "What intent is the user expressing, what progress are they trying to make, and where would automation, augmentation, explanation, or human judgment actually help?"

The abstraction is the point. It forces you up a level, out of feature-think, into motivation-think. The problem is when teams treat the abstraction as the output.

Most practitioners teach teams to end where they should begin.

"JTBD Kills Bold Innovation"

The argument: Steve Jobs didn't run interviews before inventing the iPhone. Truly disruptive products create markets that customers can't imagine. JTBD constrains you to incremental improvements on existing jobs.

This criticism relies on a myth: the visionary innovator who peers into the future and builds what nobody knew they wanted.

It's a compelling story. It's also wrong.

Jobs wasn't a prophet. He was unusually skilled at discerning unmet needs in the present. People already wanted to stay connected, be entertained, capture moments, and signal taste. They were cobbling together solutions with separate devices—phones, iPods, cameras. The iPhone bet was that technology had reached a point where one device could do all of those jobs better than the fragmented alternatives.

That's not vision in the mystical sense. That's understanding present needs and seeing where current solutions fall short.

VisiCalc’s creators watched a professor laboriously erasing and recalculating figures on a chalkboard. They asked: what if there were an easier way to do this job? That observation—not a vision of the future—led directly to the first spreadsheet, released in 1979 for the Apple II.

JTBD doesn't constrain you to incremental improvements. It grounds you in what people actually need. The form of the solution can be radical. But the need is always present-tense.

The dangerous part of the "visionary" myth is that it gives product teams permission to ignore what's in front of them. "Steve Jobs didn't do user research" becomes an excuse to skip the work of understanding real needs. But Jobs was obsessively attentive to how people used things—he just didn't call it JTBD.

The innovators who succeed aren't the ones who see the future. They're the ones who see the present clearly.

"Nobody Agrees on What JTBD Actually Is"

Ask five experts what JTBD is and you'll get five different answers.

Christensen focused on understanding why customers make the choices they make. Consultant Tony Ulwick has built an entire methodology around quantifying desired outcomes. Author and product strategist Alan Klement argues that jobs are primarily emotional. Consultant Bob Moesta emphasizes the forces that push and pull people toward change. Each has proprietary terminology. Each has certified practitioners. Each claims the others are doing it wrong.

The standard response is that this fragmentation is unfortunate but the core insight survives. That's too generous.

For consultants who profit from it, the fragmentation isn't a bug—it's a feature.

Without a single definition, every consultant can claim expertise in the "real" JTBD. With proprietary terminology, organizations must hire interpreters. With complex methodology, only trained facilitators can run the workshops.

This isn't accidental. It's how consulting economics work. Simplicity is a threat. If JTBD could be explained in an afternoon, there'd be no six-figure engagements.

This fragmentation also lets practitioners dodge accountability. When JTBD doesn't produce results, the methodology is never at fault. You just used the wrong version. You didn't go deep enough. You needed more interviews. You should have hired a certified expert.

The insight is simple: understand the progress people are trying to make. The industry built around it is deliberately complex—because complexity is what gets purchased.

"JTBD Research Is Too Expensive and Slow"

The framework version of JTBD is expensive: ninety-minute interviews, dozens of participants, months of synthesis, multi-phase research programs.

But expensive research often produces worse insights than cheap research.

When you invest six figures and six months in a JTBD initiative, you need it to produce something proportionally impressive. This pressure biases you toward complexity. Simple insights feel like inadequate return on investment, so teams keep digging until they've produced elaborate job maps and forces diagrams that justify the expense.

Meanwhile, a founder who talks to ten customers in two weeks—really talks to them, about what they were doing before, what triggered them to look for something new, what almost stopped them from switching—often understands jobs better than the team that spent six months on methodology.

The insight comes from curiosity and the right questions, not procedural rigor. JTBD-as-framework substitutes process for curiosity. It creates the appearance of thoroughness while distancing teams from customers.

The best JTBD thinking happens in ongoing conversations, not research phases. It's a continuous orientation, not a project with deliverables.

"JTBD Ignores Business Reality"

The criticism: JTBD might reveal customer jobs, but real decisions involve strategy, technical constraints, competitive dynamics, and business model requirements.

This is true, and it points to something JTBD advocates rarely admit: JTBD has almost nothing to say about most product decisions.

Should we build for this job or that job? JTBD doesn't answer. Can we profitably serve this job? JTBD doesn't answer. Is this job strategically important? JTBD doesn't answer. What's the right technical approach? JTBD doesn't answer.

JTBD illuminates demand. It says nothing about supply.

When organizations treat JTBD as a comprehensive product methodology, they inevitably discover it leaves most questions unanswered. They either abandon it entirely or relegate it to theater—research performed to appear customer-centric while real decisions happen elsewhere.

The honest positioning is that JTBD is one input among many. It tells you what people are trying to accomplish. Whether and how you should help them accomplish it involves a completely different set of considerations.

The Framework Lifecycle (And Why It's Inevitable)

Every useful business insight follows the same trajectory.

Someone discovers a genuinely clarifying idea. Early adopters find it valuable because they understand what it's for and what it's not for. Then consultancies see opportunity. They add structure, create certification, build elaborate processes. Organizations adopt the framework without understanding the principles. Leaders mandate it. Teams go through the motions. Critics point out it doesn't work. The methodology gets abandoned. The original insight disappears with it.

Design Thinking went from "understand humans before designing for them" to a parody of post-it notes and forced ideation. Agile went from "respond to change over following a plan" to ritualistic standups that make teams slower. Six Sigma went from "reduce defects through measurement" to bureaucratic compliance theater.

JTBD followed the same path. "Understand what progress people are trying to make" became a consulting industry that obscures rather than clarifies.

The pattern persists because organizations want certainty. Frameworks promise certainty. "Follow these steps and you'll understand your customers." The promise is false, but it's what gets purchased.

Real understanding comes from thinking clearly about problems. You can't proceduralize that. You can only practice it.

Why UX Escaped (Mostly)

UX provides an interesting counterexample.

A decade ago, UX faced similar skepticism. Fluffy. Hard to quantify. A luxury for rich companies. Today it's universally valued.

The difference: UX became something you can point to.

"This app has great UX" is a statement anyone can make. You don't need to understand heuristics or information architecture. You feel it when something works well.

UX transcended methodology by becoming a quality attribute. It stopped being something you "do" and became something products "have." The discipline still exists—people still study and practice UX—but the value is understood through experience, not explanation.

JTBD never made this transition. You can't point at a product and say "this has great JTBD." The concept remained trapped in methodology, accessible only to those who've learned the terminology.

JTBDUX is an attempt to close that gap. Not by turning JTBD into another methodology, but by tying the lens to the experience people can actually feel: whether the product understands their situation, reduces the struggle, creates confidence, and helps them make progress.

Why JTBD Matters More Now Than Ever

For fifty years, humans have translated their goals into computer commands. Click this. Type that. Navigate here. Fill out this form. Select from these options.

AI inverts this. Instead of humans adapting to systems, systems adapt to humans. Instead of translating intent into commands, you express intent and the system figures out how to fulfill it.

This is JTBD thinking made literal.

When you tell an AI assistant "prepare me for tomorrow's board meeting," the system must understand the job. Not "open calendar" or "find documents"—those are tasks. The job is "walk in prepared and confident." The AI has to infer what that means: review the agenda, surface relevant materials, identify potential questions, summarize recent developments.

Intent recognition is job recognition. The entire field of AI interaction design is, whether it knows it or not, applied JTBD. This creates an interesting situation. The methodology may be dying of consultant bloat, but the underlying insight is becoming foundational to how software works.

Every AI agent is designed around a job. Customer service bots that work are designed around jobs like "get my refund" or "track my package." They fail when designed around vague goals like "assist customers." Productivity agents succeed when focused on discrete jobs: "schedule this meeting," "summarize this thread," "draft a response."
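To make that concrete, here is a minimal sketch of the difference between designing an agent around discrete jobs and designing it around a vague "assist customers" goal. All names, trigger phrases, and responses are hypothetical illustrations, not any real product's implementation:

```python
# Hypothetical sketch: an agent organized around discrete jobs.
# A real system would use an LLM or classifier for intent matching;
# keyword triggers stand in for that here to keep the idea visible.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Job:
    name: str                          # the progress the user is hiring for
    triggers: list[str] = field(default_factory=list)  # signals of this job
    handler: Callable[[str], str] = lambda msg: ""

JOBS = [
    Job("get_refund", ["refund", "money back"],
        lambda msg: "Starting a refund for your last order."),
    Job("track_package", ["track", "where is my order"],
        lambda msg: "Your package is out for delivery."),
]

def route(message: str) -> str:
    """Match the message to a discrete job. The key design choice:
    when no job matches, say so explicitly instead of pretending a
    vague 'assist customers' job exists."""
    text = message.lower()
    for job in JOBS:
        if any(trigger in text for trigger in job.triggers):
            return job.handler(message)
    return "I can help with refunds or package tracking. Which do you need?"
```

The point of the sketch is the shape, not the matching logic: each handler owns one job with a recognizable completion state, and the fallback is honest about scope. Bots "designed around vague goals" are the ones missing that structure.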

The job is the unit of delegation between human and machine.

As AI capabilities expand, the question becomes: Which jobs should humans do, which should AI do, and which require collaboration? You can't answer that question without understanding jobs.

The Lens, Not the Framework

This brings us to the critical distinction.

JTBD isn't something you implement. It's something you see through.

The difference matters. Frameworks are procedures. They have steps, outputs, and completion criteria. You can do them wrong. You need training. They create experts and novices.

A lens is a way of looking. You either see through it or you don't. There's no "doing it wrong" because there's nothing to "do."

When you look through JTBD, you ask different questions. Not "what features should we build?" but "what progress is someone trying to make?" Not "who is our user?" but "what situation triggers someone to seek out a new way?" Not "what do customers want?" but "what would make customers fire the status quo?"

You don't need a consultant to look through a lens. You don't need certification. You need the habit of asking: what's the job here?

Good designers already think this way. They work backward from outcomes. They ask "what is this person trying to accomplish?" before designing screens. They think about progress, motivation, and success criteria. They just don't call it JTBD.

What Changes When You Stop Implementing and Start Seeing

When JTBD becomes a lens rather than a framework, several things shift.

You stop researching and start noticing. The best insights come from ongoing attention, not periodic research. Every customer conversation, every support ticket, every sales call, every user session—each is an opportunity to notice what job someone is trying to do.

You stop documenting and start deciding. The output of JTBD thinking is clarity about what to build and why. If the thinking doesn't change decisions, it's not JTBD thinking. It's theater.

You stop debating methodology and start iterating. "Are we doing JTBD right?" is the wrong question. "Do we understand what progress our customers are trying to make?" is the right question. The first invites process arguments. The second invites customer contact.

And most importantly: you stop confusing the tasks with the job.

The real power is recognizing that while technology changes, the job remains stable.

"Get the image I have in my head onto the screen" used to require understanding layers, masks, and selection tools. It was a skilled activity that took years to master. Now, with AI, you can just describe what you want.

The job didn't change. The job is exactly the same. What changed is that the drudgery required to do it collapsed.

"Understand my sales performance" used to require SQL queries and dashboard building. Now it can be "tell me what's happening with sales."

JTBD-as-framework tends to get stuck mapping the old tasks ("Step 1: Open Photoshop"). JTBD-as-lens sees the enduring job ("Create the image") and immediately recognizes how AI can shortcut the path to the outcome.

That's the clarity that matters now.

The Path Forward

The objections to JTBD are valid. The methodology has become a consulting grift. The terminology obscures more than it clarifies. The frameworks produce theater instead of insight.

But the insight underneath—that people hire products to make progress, that understanding that progress is the key to building things people want—remains essential.

JTBDUX provides the shared language you need to get buy-in.

No, it’s not a methodology.

And it’s not yet another made-up framework.

JTBDUX gives you a reference that codifies how successful teams already blend Jobs-to-be-Done with UX. It turns what works into clear language the whole team can finally understand and use.

Think of it as a Rosetta Stone for your business, one that makes it possible to align product, design, and marketing around what truly matters: helping users succeed at the job they hired you for.

What JTBDUX Is

  • A lens for seeing products and experiences through the question: what progress is someone trying to make?
  • An integration of Jobs To Be Done thinking with UX practice. Not as separate disciplines, but as one way of working. You understand the job (JTBD) and you design the experience of getting it done (UX). They're inseparable.
  • An orientation and a habit of asking: What's the job here? Who's hiring this product, and for what? What would make them fire it? What does progress actually look like for them?
  • A diagnostic tool. It helps you see what people are struggling with, why they switch to something new, why they stay, and why they leave. Understanding present needs is how good products get built in the first place, not just how they get validated after.
  • A way to speak a common language. It names how things already work. Humans have jobs. They hire products to help. The quality of the experience depends on how well the product understands and fulfills the job. This isn't a theory. It's just what's happening. And everyone can understand it.

What JTBDUX Is Not

  • A framework. There are no steps to follow, no phases to complete, no certification to obtain, no "doing it right." You either see through the lens or you don't.
  • A crystal ball. It doesn't predict the future or prescribe exactly what form your solution should take. It grounds you in present needs—what people are struggling with now, what current solutions fail to do. The form of your solution still requires conviction, timing, and taste. But the need it addresses should be real and observable.
  • A research methodology. You don't need 90-minute interviews or six-month initiatives. You need the habit of noticing what job someone is trying to do. Every customer conversation, every support ticket, every user session is an opportunity.
  • A consulting engagement. If you need expensive experts to explain it, we've failed. The lens should be intuitive. Look at any interaction and ask: What job is being done? By whom? How well?
  • New terminology for its own sake. Good designers already think this way. They work backward from outcomes. They ask what someone is trying to accomplish before designing screens. JTBDUX just names that way of thinking so it can be discussed and refined.
  • A silver bullet. It's one input among many. It illuminates demand but says nothing about supply—strategy, constraints, competitive dynamics, business model. Those require different thinking.

So yes, in many ways, JTBD deserves its bad reputation. The framework does, anyway. But the lens? The lens is how we see clearly enough to build things that matter.