What you see is not all there is: Fixing the wrong holes in organisations.

Austin Mackesy-Buckley
4 min read · Feb 4, 2025


Photo by Travis DeLong on Unsplash

During World War II, engineers studying returning bombers recommended reinforcing the areas damaged by bullet holes and shrapnel, typically in the wings. However, Abraham Wald, a mathematician, pointed out that this approach ignored the unseen: the planes that didn’t return. The damage on returning bombers wasn’t the problem; those planes had survived their hits. It was the areas that came back undamaged, such as the engines, that marked the true vulnerability, because planes hit there never made it home.

It’s quite possible you have already heard this story; it is widely used to illustrate survivorship bias.

Organisations have a tendency to fall into the same trap. Survivorship bias teaches us a valuable lesson about the danger of focusing on what we see rather than what we don’t. We often concentrate on the problems we can easily observe and jump to solutions that “patch the holes we see” rather than addressing the deeper vulnerabilities. For example:

  1. No Focus? We implement OKRs or goal-setting frameworks.
  2. No Transparency? We create more reporting and update meetings.
  3. No Control? We buy expensive tools, standardise processes, or impose rigid hierarchies.

Just as no one initially asked about the bombers that didn’t return, it’s common for underlying issues to be overlooked in organisations too.

  1. Lack of focus might stem from misaligned incentives or conflicting priorities, not just the absence of a goal-setting framework.
  2. Lack of transparency might reflect a culture of fear or poor communication, not merely the absence of reports or updates.
  3. Lack of control could result from trust issues or unclear decision-making authority, not a lack of tools or processes.

A few things make this tendency worse.

Bias for action

Western culture in particular values “go-getters”, the doers of society. It’s no different in organisations, where people who provide immediate solutions and are visibly active are typically viewed as more productive and “useful” than those who tend to inquire and observe.

These doers are typically good at “making it work” within existing constraints, and are therefore less likely to rock the boat and uncover uncomfortable truths about leadership or complex systemic inefficiencies that can threaten or overwhelm people.

Obsession with measuring

There is a saying of debated origin: “If you can’t measure it, you can’t manage it.” Whoever did or didn’t say it, organisations understandably want to, and should, track metrics that indicate the health and impact of the various aspects and solutions of their business so that they can react accordingly. However, not everything worth measuring can be measured. This has a few consequences:

  1. Inadequate metrics are used to measure something that is not easily measurable, leading to incorrect assumptions and actions.
  2. Whatever the organisation does measure typically becomes more “visible” than what it doesn’t, potentially drawing attention away from equally important but immeasurable indicators.

Tunnel-vision

All too often, thought-provoking questions and alternative ideas (and the people who provide them) are seen as an annoying hindrance to those wishing to move more quickly or who believe the solution is obvious.

Abraham Wald, however, wasn’t just floating around in the background making random observations and suggestions. He was part of a team of statisticians known as the Statistical Research Group (SRG), which was specifically tasked with using statistical skills to solve various military problems. The leadership of the United States recognised the potential of scientific research and development to aid the war effort. This gave the SRG legitimacy and meant its members were able to work in a system that valued their skills.

To be able to find the right holes to fix, organisations need to be willing to pause, listen to (and seek out) diverse opinions, and keep all options on the table.

Shooting down your own plane

Ironically, this bias for action and misdiagnosis of the problem can also lead to undervaluing the very people and skills that can help identify the real problem and find more robust solutions.

In the story of the bombers, the visible “fixers” were the engineers who patched damage on returning bombers. Their skills were tangible, actionable, and aligned with the visible problems, much like those who implement OKRs, create reports and roll out and enforce new tools.

Abraham Wald, by contrast, brought a less noticeable but invaluable skill: the ability to step back, analyse the data, and question assumptions. His role was similar to that of analysts, facilitators and coaches, whose contributions, while less tangible, are critical to uncovering systemic issues and designing solutions that work.

Yet unfortunately, when there is pressure to reduce operational costs, become leaner and more efficient, and deliver faster, it is not uncommon for the “Walds” to be seen as overhead and to be the first let go. Months later, you may find yourself wondering why nothing seems to improve: why challenges persist and costs remain unchanged (or even grow) as the enthusiasm and talent of the “fixers” are misdirected toward fixing the wrong holes.

Don’t let survivorship bias cloud your judgment — especially as a leader. In complex systems, what you see is rarely all there is. Whether it’s the problems or employees that are most visible, the real answers often lie in what remains unnoticed.


Written by Austin Mackesy-Buckley

An Agile Coach and dad who writes from time to time about his thoughts and experiences of work and life.
