
I had the good fortune (for a programmer!) to spend years working alongside professional athletes – British association footballers in particular. If there’s one idea that has stayed with me, it’s that measuring performance is a competitive advantage.
Not vanity dashboards, but real measurement: what are we trying to improve, how will we know it’s improving, and what do we do when it isn’t?
The metrics used by elite athletes look very different from those used by engineers. But once you get serious about studying peak performance, the overlaps are surprisingly practical.
My introduction to football analytics
Around 2006, I visited a startup headed by a charming elderly gentleman I’ll call “Barry.” His company sold analytics software to football clubs of all sizes. Barry walked me through his pitch deck: fitness tests, shot accuracy, performance trends. The data was entered manually, and I was there to explore automation. I had an abundance of ideas, and things were going well.
Then Barry reached a slide that gave me pause:
“Next, you can see the player’s bowel movement statistics of the last 30 days. Unfortunately we couldn’t find a real footballer willing to share their data, so I’ve instead populated the display with real data from my wife.”
I became quiet as I resisted the impulse to suggest ways this data collection could also be automated.
A couple of years later, Barry’s company went out of business. I later met his former UI designer and asked: “Remember how Barry would present his wife’s… uh, digestive stats to investors, sometimes with her standing beside him? Didn’t you ever feel it was in questionable taste?”
The designer calmly replied: “Watching how Barry’s guests reacted to that slide was the best part of our job!”
Hard to argue with that.
The real lesson behind the Barry story
The experience taught me several things. First: some processes are best left unautomated. Second: there are business risks when leadership isn’t tuned into feedback – both the obvious kind (“maybe skip that slide”) and the subtle kind (“what does the room do when we show them who we are?”).
But the most useful takeaway was this: even relatively small clubs were often more serious about performance measurement than regular companies a hundred times their size. Not because they loved spreadsheets, but because they understood something fundamental: you can’t improve what you refuse to look at. In a sport where marginal gains decide careers, refusing to look is a competitive disadvantage.
Plan for the season, not for the next headline

A football club might have an objective to win a championship in two years. The bad strategy is to double training intensity. Tired athletes get injured. Performance collapses. You might get a short spike, but you pay for it later.
A startup might have an objective to become a unicorn in two years. The bad plan is the same: double the workload, live in permanent crunch, and hope exhausted people will produce the most difficult solutions indefinitely. That isn’t ambition. It’s burning the furniture to heat the house.
The better plan is boring and effective: decide what winning means in concrete terms, map the capabilities needed to get there, increase load gradually, and protect recovery time – because recovery is where adaptation happens.
And when you miss a week of good work, you don’t make up for it by working twice as hard before the deadline. You accept reality: the past week is gone. All you can do is execute today well. Every workday is equally important, because outcomes are the result of accumulated days of working on the right things.
The intensity paradox
In football, training load is carefully managed because everyone accepts a basic truth: bodies have limits. Brains do too – they just fail less dramatically. There’s no slow-motion replay of burnout. It arrives quietly: decisions that don’t get made, bugs that slip through, patience that wears thin, and a team that gradually stops caring whether the work is good.
Crunch, when it becomes habit, is rarely about grit. It’s an attempt to purchase time with human energy – and the interest rate is high. You pay in rework, in turnover, in the steady erosion of judgment at precisely the moments you need it most.
This hits hardest in deep-tech, where a small group is solving problems that may not have known solutions yet. A 10-person team at sustainable pace gets roughly 10,000 hours of high-quality thought in six months. Double the hours through crunch and you get 20,000 on paper – but exhaustion doesn’t scale. You just get tired people making tired decisions.
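The arithmetic above can be put in a toy model. The fatigue discount here is an illustrative assumption, not a measured figure – the point is only that nominal hours and quality-adjusted hours diverge under crunch:

```python
# Toy model: nominal hours vs. quality-adjusted hours under crunch.
# The 0.4 fatigue factor is a deliberately stark illustration, not data.

def quality_hours(team_size, hours_per_week, weeks, quality=1.0):
    """Nominal person-hours scaled by a quality factor for cognitive output."""
    return team_size * hours_per_week * weeks * quality

# Sustainable pace: 10 people, ~38.5 h/week, 26 weeks ≈ 10,000 hours.
sustainable = quality_hours(10, 38.5, 26)

# Crunch: double the hours on paper, but discount for exhausted judgment.
crunch = quality_hours(10, 77, 26, quality=0.4)

print(round(sustainable))  # ~10,000 high-quality hours
print(round(crunch))       # ~8,000 quality-adjusted hours: more time, less thought
```

Any quality factor below 0.5 makes the doubled schedule a net loss – which is the uncomfortable bet crunch quietly places.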
I’ve seen what even 1,000 hours of high-quality thought can do: solve problems the world considered unsolvable. Chase 2,000 through burnout and you won’t get there twice as fast. You might not get there at all.
There’s a subtler loss too, one that never shows up in any burndown chart: the thinking that never happened. The insight that would have arrived in the shower, on the walk home, in the quiet moment between hard pushes. Crunch doesn’t just degrade the work you’re doing – it erases the Eureka moments you’ll never know you missed. It’s borrowing clarity from tomorrow to feel productive today.
But here’s the flip side that gets lost in conversations about sustainability: none of this is an argument for taking it easy. Quite the opposite. If you’re protecting your energy, you’d better be spending it. A sprint drill only works if you actually sprint. Go through the motions at 70% and you’re just getting tired without getting faster.
The same is true for knowledge work. Protecting recovery time isn’t permission to coast through the day. It’s what allows you to push genuinely hard when you’re on. A sustainable pace with low intensity produces nothing worth recovering from. When you sit down to solve a hard problem, you should feel the cognitive load – the slightly uncomfortable sense of operating near your edge. That’s not suffering; that’s stimulus. It’s what triggers growth. Intensity without recovery is burnout. Recovery without intensity is stagnation. You need the reps and the rest.
Set goals like a coach, not a cheerleader

I’ve seen projects where leadership had a weak grasp of what was technically feasible. The result isn’t high standards – it’s misalignment, wasted effort, and eventually a team that stops believing the plan.
A good football coach doesn’t just shout “let’s win” and walk away. They translate ambition into training blocks, tactics, drills, video review, nutrition, recovery, and selection. They adjust based on what the data says – not what they wish it said. Crucially, they also stop doing things: drop drills that aren’t working, rest players, change formation mid-match. Coaching is as much about what you cut as what you add.
In engineering, goals often fail because nothing ever gets cut – everything stays “priority #1.” The equivalent of good coaching is: define outcomes clearly, ensure goals are realistic and testable, set stretch objectives that challenge the team without breaking trust, track leading indicators not just lagging ones – and be willing to drop work that isn’t creating enough value.
Above all, beware the quarterly miracle. When targets slip, there’s a temptation to hunt for some dramatic intervention – a reorganisation, a pivot, a heroic push – that will magically recover 30% overnight. Athletes know better. You don’t become 30% stronger in a week. You become 30% better through small daily improvements that compound over months. The question isn’t whether you can pull off a sudden transformation. It’s whether you’re compounding daily.
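The compounding claim is plain arithmetic. A sketch, with the daily rate and day count chosen purely for illustration:

```python
# Small daily gains compound: ~0.15% per working day over roughly six months.
# Both numbers are illustrative, not prescriptive.

daily_gain = 0.0015          # 0.15% improvement per day
working_days = 180           # roughly six months of days

total = (1 + daily_gain) ** working_days - 1
print(f"{total:.0%}")        # prints "31%" cumulative improvement
```

No single day moves the needle by more than a sliver; the 30% shows up only in the accumulation – exactly the opposite shape to the quarterly miracle.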
Scoring is a team sport
One of the best things about IT is that almost anyone can score goals. Not in the shallow sense of “everyone writes code,” but in the deeper sense that impact isn’t confined to a single position. A great QA person can save a launch. A thoughtful product manager can prevent months of waste. A platform engineer can unlock the whole organisation’s speed.
But goal scoring is a team effort. Someone creates space. Someone makes the run. Someone times the pass. Someone finishes. The whole thing collapses if the defence isn’t doing its job. In IT, “defence” is the platform and reliability work: CI, test automation, observability, security. It rarely gets applause, but it’s what lets your best people take risks without the whole team getting punished for it.
If you want a team to keep scoring, don’t just celebrate the person who tapped the ball in. Celebrate the chain that made it possible.
Closing thought
Football analytics can go to unexpected places (sorry again, Barry). But the underlying principle is solid: the winners aren’t the ones with the most ambition – they’re the ones who measure, learn, adapt, and compound.
There’s a certain humility in that. Ambition says “I will do it.” Measurement says “Let’s see.” The best performers I’ve met aren’t the ones who believe the hardest – they’re the ones who look the hardest. And then they do something about what they find.
The scoreboard doesn’t care how hard you tried. It only knows what happened. The good news is that what happens next is still up for grabs – and unlike Barry’s wife, you get to choose what you measure!