Don’t Wait Until It’s Too Late To Design for the Emotional Job
On October 1, 2013, the U.S. government launched Healthcare.gov. The job was clear: help Americans enroll in health insurance under the Affordable Care Act. Millions of people needed coverage. The site existed to get them there.
What happened is now a case study in government technology failure. The site crashed almost immediately. By the end of the first day, only six people had successfully enrolled. The budget ballooned from $93 million to over $1.7 billion. Congressional hearings followed. The Secretary of Health and Human Services resigned.
Most of the post-mortems focused on the technical failures. Overloaded servers. Bad architecture. Poor testing.
But there was another failure running underneath all of it.
The people who needed that site most were already scared. They were uninsured, possibly sick, navigating a complex system they didn't understand, trying to do something consequential they'd never done before. The job wasn't "enroll in a plan."
It was "feel like the system is actually going to help me."
Every error message, every crash, every unexplained delay told them the opposite.
The functional failure made the emotional failure catastrophic.
What the emotional job layer actually is
When someone hires a product, they're always hiring it to do more than a mere task. They're hiring it to make them feel something. Or to stop feeling something.
The emotional job is the internal experience the user is seeking—or trying to avoid—while the functional job gets done. It runs alongside the functional job, not beneath it. And it often matters more.
"Enroll in a health plan" is the functional job. "Feel like I'm going to be okay" is the emotional job.
You can solve the functional job completely and still fail the emotional one. The form loads correctly. The user still sits there wondering if they picked the right plan, whether this will actually cover what they need, whether they can trust any of it.
That's a failed product, regardless of what the completion rates say.
The emotional job hiding in a $2,500 exercise bike
Peloton launched in 2012 and spent years being written off as overpriced hardware. A $2,500 bike. A $44 monthly subscription. For exercise you could do anywhere.
The —"get exercise"—could be served by a $300 stationary bike and a YouTube video.
Peloton wasn't selling exercise. It was selling the feeling of being an athlete and belonging to a group.
The leaderboard, the instructor calling out your name, the output metrics, the milestone badges, the community of riders—none of that makes the workout more efficient. All of it makes you feel like someone who trains, not someone who exercises because they have to.
That distinction is everything.
"Someone who trains" is an identity. "Someone who exercises because they have to" is an obligation.
People pay a significant premium to feel like the first one. Peloton's marketing understood this before most competitors did. Their campaigns weren't about features—they were about transformation. The emotional job was made explicit: this is who you become.
At its peak in 2020, Peloton had a waitlist of hundreds of thousands of people. The functional job hadn't changed. The emotional job was pulling hard.
When you design for the emotional job without serving the functional one
Robinhood is a cautionary tale. It was founded in 2013 with a genuine emotional job at its core: help ordinary people feel like they have access to investing. Not just the wealthy. Not just people with brokers. Anyone.
The emotional job was "feel like a smart investor. Feel like you're in control of your financial future." That job pulled in millions of users. The zero-commission trading was real, the interface was clean, and the emotional promise of democratized access landed.
But the design went further than that.
Robinhood added confetti animations when trades were completed. Digital trophies for activity. The visual and emotional language of a game.
Those design choices served the emotional job—"feel like a smart investor"—while actively undermining the functional one: "make good investment decisions."
In 2024, Robinhood paid $7.5 million to settle an enforcement action from Massachusetts regulators, who found that the gamified features encouraged young, inexperienced investors to take on more risk than they understood.
The emotional job had been served. The functional job hadn't. That gap created real harm.
Emotional jobs are often about avoiding a feeling, not gaining one
A lot of thinking about emotional jobs focuses on the positive: what users want to feel. But emotional jobs are frequently defined by what people are trying to stop feeling.
Credit Karma is a good example. The functional job is straightforward: check your credit score. But for most people — especially anyone who's missed payments, carried debt, or been denied credit — the emotional job is closer to "find out where I stand without the dread of what I'll see."
That's a fear job, not an aspiration job.
The features that matter most in Credit Karma aren't the ones that make score-checking faster. They're the ones that address the specific anxiety people bring into the experience: showing the score immediately instead of making you wait, explaining in plain language what's helping and hurting, displaying a trend line so you can see whether things are getting better.
The "factors affecting your score" breakdown is doing emotional work — it turns a single intimidating number into something you can understand and act on.
Credit Karma could have designed a product that simply displayed a number and stopped there. Instead, they designed around the emotional job: make someone who's nervous about their finances feel like they have a handle on it.
That's why 100 million people check their score through a free app instead of paying for one of the older monitoring services. The functional job is identical. The emotional experience is completely different.
The emotional job changes by context
The same product can be hired for different emotional jobs depending on who's using it and when.
Duolingo is a useful example.
For some users, the emotional job is "feel like I'm making real progress on something meaningful." The streak mechanic, the XP, the league tables serve that job—they create a sense of forward motion and discipline.
For others, the emotional job is closer to "not feel guilty about the thing I keep saying I'll do." The owl notifications ("You haven't practiced today!") are famous—and divisive—because they're targeting exactly that: converting low-grade guilt into action.
Same app. Different emotional jobs. Different users responding to different design elements.
When Duolingo went public in 2021, they disclosed that daily active users and streak metrics were central to how they measured engagement. That's because the emotional job of "feel like I'm the kind of person who follows through" is what keeps people coming back day to day.
The functional job—learn a language—takes months or years to deliver. The emotional job has to be served daily or people stop showing up.
B2B has an emotional job as well
B2B teams often treat the emotional job as solely a consumer phenomenon. It isn't.
The person choosing an enterprise analytics platform is still a person. They have an emotional job running alongside the functional one.
Usually it's something like: "Don't be the one who made the wrong call."
The decision-maker who selects a tool that disappoints is exposed. They championed something. They made everyone learn a new system. If it fails, that's on them.
The is "make a decision I can stand behind if it goes sideways."
That's why case studies matter so much in B2B—not because they prove the product works, but because they give the buyer cover. "Other teams like mine did this and it went fine" is work. The buyer needs to feel safe making the bet, not just convinced of the value.
How to find the emotional job
The emotional job surfaces when you ask the right questions.
- "How did you feel when you realized you needed to solve this?"
- "What would it mean for you personally if this went wrong?"
- "When it's working well, how does it feel to use it?"
- "What's the version of this experience you'd dread most?"
That last question is particularly useful. The emotional jobs people most urgently hire for are often about avoiding something: embarrassment, regret, anxiety, loss of control, looking incompetent.
Products that name those feelings precisely—not generically—create pull that goes well beyond features.
Healthcare.gov didn't just need faster servers. It needed a confirmation screen that said: "You're enrolled. Here's your plan. Here's what it covers. Here's who to call if you have questions."
Same database. Twenty more words. The emotional job, served.