The Investment That Keeps You Stuck
In 2011, Lidl, one of Europe's largest supermarket chains, began one of the most ambitious technology projects in its history.
The German retailer set out to replace its legacy inventory management system with SAP, the enterprise software platform used by thousands of large organizations worldwide. The project was expected to modernize how Lidl tracked stock, managed pricing, and coordinated its supply chain across hundreds of stores.
Seven years and approximately €500 million later, Lidl cancelled the project and went back to the system it had tried to replace.
The technical failure was real: SAP's system valued inventory at purchase price while Lidl's operations ran on retail price, and adapting the software to fit proved far more complicated than anticipated. But the more revealing detail isn't the failure: it's the seven years of attempting to make it work.
The warning signs came early. The incompatibility between how SAP worked and how Lidl operated was identifiable well before the full cost had been committed. But each year of investment made the next year harder to abandon. Walking away meant admitting that the previous year's spending had been for nothing.
So they kept going — until €500 million made stopping unavoidable. That's the sunk cost fallacy in action.
Past spending shouldn't drive future decisions (but it often does)
In economics, a sunk cost is money already spent that cannot be recovered regardless of what happens next. The rational position is that sunk costs should play no role in future decisions. What matters is what comes next, not what's already gone.
In practice, people reliably violate this principle. The psychologists Hal Arkes and Catherine Blumer demonstrated this in their 1985 paper "The Psychology of Sunk Cost" — people consistently continue courses of action they would otherwise abandon simply because they've already invested in them. The more they've invested, the harder they find it to stop.
The mechanism is emotional, not logical. Stopping means accepting a loss as final. Continuing preserves the possibility, however faint, that the investment will eventually be redeemed.
The sunk cost fallacy is sometimes also called the Concorde Fallacy, after the supersonic airliner that both the British and French governments continued funding for decades despite clear evidence it would never be commercially viable, largely because neither could face writing off what they'd already spent. Two names for the same thing: the inability to stop spending because you've already spent too much.
The thirteen years of notes nobody could leave behind
Evernote launched in 2008 and spent several years as the dominant note-taking application for professionals who needed to organize large amounts of information across devices.
Then, gradually, it got worse.
Performance declined. The free tier became increasingly restricted. Pricing rose. Competitors like Notion and Obsidian arrived with capabilities Evernote hadn't matched. By the early 2020s, the technology press had largely concluded that Evernote had lost its way.
Users kept paying for it anyway.
Not all of them — Evernote's decline in market position was real and documented. But a significant cohort of users who understood the product's problems stayed far longer than the product's quality warranted. Their reasoning was consistent: they had years of notes in there. Thousands of them. Tagged, organized, accumulated over a decade of daily work.
Moving to Notion or Obsidian meant exporting, converting, and rebuilding an organizational structure they'd spent years developing. The notes themselves were portable. The system wasn't — the tagging logic, the notebook hierarchy, the search habits, the muscle memory of how everything was arranged.
One long-time user, writing about his eventual switch in 2022 after using Evernote since around 2009, described his account as "a junk drawer I couldn't quite face cleaning out." He had known for years that the platform had deteriorated. The accumulated investment was what kept him there.
Sunk costs compound
What makes the fallacy particularly durable in tool decisions is that the investment stacks in layers, each one making the next harder to abandon.
The first layer is money — licenses, implementation, consulting fees. That's the visible cost and the one teams cite most often.
The second layer is configuration — the automations, integrations, custom workflows, and naming conventions built on top of the tool over months or years. This represents time and ingenuity that can't be exported.
The third layer is expertise — the fluency people develop, the shortcuts they've memorized, the reputation they've built as the person who knows how the system works. That expertise is genuinely valuable and entirely non-transferable.
The fourth layer is organizational memory — the distributed, undocumented knowledge of "how things work around here" that lives in the heads of dozens of people and can't be migrated with a CSV file.
When these stack together, the stated reason for not switching almost always sounds like this: "We've put too much into it to walk away now." The premise is true: the investment is real. The conclusion — that prior investment is a reason to continue — does not follow.
Kodak invented the future and chose to ignore it
In 1975, a 24-year-old electrical engineer at Eastman Kodak named Steve Sasson built the first handheld digital camera. The device weighed eight pounds and took 23 seconds to capture and display an image on a television screen.
Kodak patented the technology in 1978 but shelved the product. Sasson was reportedly told not to speak publicly about it beyond the patent process. There was no public disclosure of the prototype until 2001.
The company's business was built entirely around chemical film and processing. Its manufacturing infrastructure, retail relationships, brand, and revenue were all organized around the assumption that photography meant film. Kodak understood what digital imaging could eventually do. It also understood what digital imaging would do to the business it had spent decades building.
The sunk cost wasn't a number on a spreadsheet. It was an entire industry — factories, supply chains, distribution networks, a global brand built on a single technology. Walking away from film to pursue digital photography meant writing off everything Kodak had become.
Competitors who hadn't made the same prior investment built the digital camera market that Kodak had invented and abandoned. Kodak filed for bankruptcy in January 2012. Sasson received the National Medal of Technology and Innovation in 2009 for the invention his employer had shelved for a quarter century.
The question that cuts through the cognitive bias
The fix is simple to state and genuinely difficult to execute in practice.
Stop asking: "Given everything we've invested, should we continue?"
Ask instead: "If we were starting fresh today — no prior investment, no existing system, no history — would we choose this?"
That question removes the accumulated weight of prior spending from the calculation and forces a direct comparison between the current option and available alternatives on their actual merits.
For a team evaluating whether to stay with a tool they've used for five years, it becomes: if you were a new team starting tomorrow, with the current landscape of available tools and no legacy to protect, would you choose this one?
Often the answer is clear. The sunk cost is what keeps teams from asking the question in the first place.