A tech company's revenue jumped 40% last quarter. Was it the increased ad spend? The new sales comp? Or the product launch? (Spoiler: asking this question is already a mistake.) In business there’s a mound of data trying to tell you something. But like any good mystery, the first explanation is rarely the right one.
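To see why, here's a toy simulation in Python – every number invented for illustration – in which a hidden factor (say, seasonal demand) pushes both ad spend and revenue, so the two metrics correlate strongly even though neither drives the other:

```python
import random

random.seed(42)

# Hypothetical illustration: a hidden factor (seasonal demand) drives BOTH
# ad spend and revenue. Neither causes the other, yet they move together.
n = 500
season = [random.gauss(0, 1) for _ in range(n)]        # hidden driver
ad_spend = [s + random.gauss(0, 0.5) for s in season]  # teams spend more in peak season
revenue = [s + random.gauss(0, 0.5) for s in season]   # demand lifts revenue regardless

def corr(x, y):
    """Pearson correlation, from first principles."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

print(f"corr(ad_spend, revenue) = {corr(ad_spend, revenue):.2f}")  # ~0.8
# A strong correlation with zero direct causation: look at the pair alone
# and you'd happily conclude that ad spend 'caused' the revenue jump.
```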
Our brains are wired to spot patterns and make quick connections. It's how we survived on the savannah. See a rustle in the grass? Better assume it's a lion and run. The cost of being wrong was low, but the cost of missing a real threat was death.

But in business, this quick pattern-matching leads us astray. We see two things happen together and assume one caused the other. It's like seeing clouds and rain, and deciding clouds cause rain... without understanding the whole water cycle.

Let's look at three real-world causality traps that caught smart teams off guard:
Decision: RevTech, a growing SaaS company, raised enterprise prices by 40% after seeing competitors charge more.
Outcome: Total revenue immediately jumped 15%, masking that new customer acquisition fell 30%. They were winning fewer deals but at higher prices.
Mistake: The team celebrated the revenue growth until discovering most of it came from existing customers upgrading due to new product features – not the price increase. Meanwhile, customers had started leaving faster than new ones could be brought in – they'd entered a ‘death spiral’.
Lesson: Short-term revenue metrics can hide longer-term market position issues. Always examine both growth and share metrics to understand true causality.
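A simple revenue bridge makes this visible. Below is a minimal sketch in Python – the figures are invented for illustration, not RevTech's actual numbers – showing how a healthy-looking headline can sit on top of a shrinking customer base:

```python
# Hypothetical quarterly revenue bridge. All figures are illustrative.
starting_arr = 10_000_000

expansion = 2_200_000    # existing customers upgrading (driven by new features)
new_business = 350_000   # new customers won at the higher price point
churn = -1_050_000       # revenue lost to departing customers

ending_arr = starting_arr + expansion + new_business + churn
headline_growth = (ending_arr - starting_arr) / starting_arr

print(f"Headline growth: {headline_growth:+.0%}")                    # +15%
print(f"Net new business (new + churn): {new_business + churn:+,}")  # -700,000
# The headline celebrates +15%, but new business no longer covers churn:
# the base the company sells into is shrinking - the start of a death spiral.
```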
Decision: An e-commerce brand tripled Google Shopping spend, believing their product-level targeting would drive higher conversion rates.
Outcome: Click volume grew 85% but conversion rate dropped 40%. Overall customer acquisition cost (CAC) increased despite efficient costs per click (CPCs).
Mistake: Shopping ads landed users directly on product pages, bypassing category browsing. Analytics revealed most buyers typically compared 4-5 products before purchase. Without easy navigation between alternatives, users bounced rather than hunting for comparison options.
Lesson: Channel performance isn't just about traffic quality – it's about matching user journey stages. What looks like a targeting problem might actually be a UX issue.
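The arithmetic behind this is worth internalising: CAC is just cost per click divided by conversion rate, so a conversion drop inflates CAC even when clicks stay cheap. A quick sketch, with illustrative numbers only:

```python
# CAC = cost per click / conversion rate. Illustrative numbers, not actuals.

def cac(cpc: float, conversion_rate: float) -> float:
    """Cost to acquire one customer from paid clicks."""
    return cpc / conversion_rate

cpc = 1.50                                      # assume CPC stays flat ('efficient')
before = cac(cpc, conversion_rate=0.020)        # 2.0% of clicks convert
after = cac(cpc, conversion_rate=0.020 * 0.60)  # conversion rate drops 40%

print(f"CAC before: ${before:.0f}")  # $75
print(f"CAC after:  ${after:.0f}")   # $125 - up ~67% with no change in CPC
```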
Decision: A manufacturer reorganised sales teams by industry vertical to "focus on customer needs."
Outcome: Some verticals saw 25% higher win rates. Others dropped 15%.
Mistake: They blamed the poor performers for "not adapting" – until analysis showed the successful verticals had product features that perfectly matched industry needs.
Lesson: Organisational changes amplify underlying realities; they don't create them.
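One way to catch this before blaming people is to put each vertical's win-rate change next to a measure of product fit. A minimal sketch – vertical names, fit scores, and deltas are all hypothetical:

```python
# Hypothetical post-reorg review: compare each vertical's win-rate change
# against a product-fit score instead of assuming a people problem.
verticals = [
    # (vertical, win-rate change after reorg, product-fit score 0-1)
    ("Healthcare",   +0.25, 0.90),
    ("Logistics",    +0.18, 0.80),
    ("Retail",       -0.10, 0.35),
    ("Construction", -0.15, 0.25),
]

for name, delta, fit in sorted(verticals, key=lambda v: v[2], reverse=True):
    print(f"{name:<13} fit={fit:.2f}  win-rate change={delta:+.0%}")
# Performance tracks product fit, not team effort: the reorg amplified an
# underlying reality (feature-industry match) rather than creating one.
```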
So how do we get better at understanding true causality? It starts with accepting an uncomfortable truth: it's hard – almost impossible – to retrofit causality after the fact. Instead of asking "what caused this?", start by asking what else was changing at the same time, and how those factors might have interacted.
The teams that win aren't the ones with the most data or the fanciest tools. They're the ones who've learned to see the whole system, not just its parts.
They've learned that understanding true causality isn't about being right – it's about being less wrong over time.
Ready to start seeing the whole picture?
Think about a recent business decision that had unexpected results. What other factors might have influenced the outcome that you didn't consider at the time? How might those factors have interacted with each other?
The answer might surprise you. But that's the thing about causality – it's always more interesting than it first appears.
Ready to realise the true commercial potential of digital for your business?
Get in touch