The Box-Checking Myth
The fluorescent hum of the conference room felt like a physical weight, pressing against the temples of the thirty-four managers gathered there. At the front of the room, a young consultant was clicking through slides of a Boeing 747 cockpit, explaining how pre-flight checks had cut catastrophic failure rates by nearly ninety-four percent since the late 1970s. It seemed like a logical bridge to build: aviation safety protocols applied to surgical theaters. But then came the sound that always signals the death of progress: a sharp, derisive snort from the back row.
‘We are not airline pilots,’ said Dr. Aris, a neurosurgeon with twenty-four years of experience. ‘A pilot follows a predictable path. My patients are not machines. Our work is infinitely more complex. You cannot reduce a human life to a checkbox.’
He wasn’t just defending his profession; he was defending his ego. This scene, which played out in a sterile hospital in 2014, is mirrored every single day in the construction industry, the chemical sector, and heavy manufacturing. We are remarkably, almost pathologically bad at learning from people who don’t wear the same color hard hat as we do.
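Strip away the tribal framing and a pre-flight check is nothing mystical: it is forced verification of known failure points, where a single skipped item blocks the whole procedure. A minimal sketch in Python (the item names are invented, not drawn from any real aviation or surgical checklist):

```python
# Minimal sketch of a pre-flight-style checklist: every item must be
# explicitly verified, and one unchecked item halts the entire run.
# The items below are hypothetical examples, not a real checklist.

CHECKLIST = [
    "patient identity confirmed",
    "surgical site marked",
    "instrument count recorded",
]

def run_checklist(items, verified):
    """Return (ok, missing): ok is True only if every item was verified."""
    missing = [item for item in items if item not in verified]
    return (len(missing) == 0, missing)

ok, missing = run_checklist(
    CHECKLIST,
    {"patient identity confirmed", "surgical site marked"},
)
print(ok, missing)  # one skipped item is enough to block the procedure
```

The point the consultant was making is exactly this shape: the check does not care how senior you are, only whether the item was verified.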
1. Pathological Tribalism
I spent 84 steps walking to my mailbox this morning, thinking about that specific brand of stubbornness. Each step felt like a reminder of the linear, predictable nature of physics, the very physics we try to outrun with our professional myths. The refusal to adopt ‘foreign’ innovations is often framed as a technical critique, but it is almost always a cultural defense mechanism.
When Gravity Doesn’t Care About Titles
In the construction world, we see the same pattern. A safety officer suggests a protocol for site communication that mirrors the ‘closed-loop’ communication used by air traffic controllers. The site lead, a man who has survived 44 winters on high-rise projects, scoffs just like the surgeon.
[Figure: professional ego (surgeon / construction worker) weighed against gravity and physics]
‘A building doesn’t fly,’ he says. No, it doesn’t fly, but gravity works exactly the same way on a fallen wrench as it does on a failing engine. If we admit that a pilot has figured out something we haven’t, we admit that we aren’t the sole masters of our domain. We admit we are vulnerable to the same systemic flaws as everyone else.
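The closed-loop protocol the safety officer proposed is simple enough to sketch: an instruction only counts as delivered after the receiver reads it back and the sender confirms the readback matches. A toy version in Python (the messages are invented examples):

```python
# Sketch of closed-loop communication, as used by air traffic control:
# sender issues an instruction, receiver reads it back, and the sender
# confirms only if the readback matches. Otherwise the loop stays open.

def closed_loop(instruction, readback):
    """Sender confirms only when the readback matches the instruction."""
    confirmed = readback.strip().lower() == instruction.strip().lower()
    return "confirmed" if confirmed else "say again"

print(closed_loop("stop the crane", "stop the crane"))  # confirmed
print(closed_loop("stop the crane", "slow the crane"))  # say again
```

Nothing about this depends on wings or runways; it only assumes that a misheard instruction is cheaper to catch in the readback than in the wreckage.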
The Rebellion of Data
Elena W.J., a thread tension calibrator I worked with briefly during a massive industrial overhaul, once told me that the hardest part of her job wasn’t the machinery. It was the people who thought they knew the machines better than the data did. She’d spend 14 hours a day adjusting the microscopic pull of industrial looms, only to have a floor manager override her settings because ‘that’s how we’ve always done it.’
[Figure: Elena’s insight, stability needed for complexity; 88% consensus]
Elena understood something that people like Dr. Aris refuse to see: complexity is not an excuse for chaos. In fact, the more complex a system is, the more it requires the ‘boring’ stability of standardized protocols. Elena’s precision was a quiet rebellion against ‘not invented here’ syndrome.
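Elena’s rebellion translates almost directly into code: let the calibrated data define a tolerance band, and refuse manual overrides that fall outside it. A hypothetical sketch in Python (the field name, values, and tolerance are illustrative, not real loom settings):

```python
# Sketch of "the data over the gut": a manual override is accepted only
# when it stays inside the calibrated tolerance band. Values invented.

CALIBRATED = {"tension_n": 4.20}  # calibrated thread tension, newtons
TOLERANCE = 0.05                  # allowed deviation from calibration

def apply_override(setting, requested):
    """Reject 'how we've always done it' values outside the data's band."""
    if abs(requested - CALIBRATED[setting]) > TOLERANCE:
        return CALIBRATED[setting]  # keep the calibrated value instead
    return requested

print(apply_override("tension_n", 4.22))  # within band: accepted
print(apply_override("tension_n", 5.00))  # outside band: rejected
```

The design choice is the same one aviation made: the system, not the loudest person on the floor, holds the last word on a known-safe setting.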
When ‘Captain as God’ Kills Progress
When we talk about ‘Not Invented Here’ syndrome, we’re talking about a psychological wall that prevents the horizontal transfer of knowledge. In aviation, after a string of crashes in the late 1970s, the industry realized the primary cause wasn’t mechanical; it was the ‘Captain as God’ culture. Co-pilots were too afraid to point out mistakes to senior captains. The industry developed Crew Resource Management (CRM) to flatten the hierarchy during critical moments.
[Figure: before CRM, junior voice silenced; after CRM, critical intervention enabled]
Fast forward to today, and construction sites are still struggling with this exact same dynamic. A junior laborer sees a frayed cable on a crane but hesitates to speak up to the veteran operator. We treat every sector as a unique snowflake, ignoring the fact that the underlying causes of disaster (fatigue, ego, miscommunication) are universal human constants.
The Connective Tissue of Safety
We need a common language for safety, one that transcends the specific tools we hold. This is where standardized frameworks become vital. They act as the connective tissue between the siloed brain of the surgeon and the siloed brain of the site manager. Organizations like Sneljevca recognize that safety isn’t a niche skill; it’s a foundational discipline that requires a shared vocabulary.
The VCA (Veiligheids Checklist Aannemers) is, at its heart, an admission that there are universal truths to risk management that apply whether you are pouring concrete or maintaining a chemical reactor. But even with these frameworks, the resistance remains. We see it in the way technical jargon is used to exclude outsiders. ‘Near-miss’ in aviation is a ‘sentinel event’ in healthcare. Why? To maintain the illusion of uniqueness.
[Figure: old mentality (“tough it out”) versus smart systems (requires transparency)]
The irony is that the ‘complexity’ cited is the very reason we need to steal ideas from others. Aviation became safe not because it was simple, but because it was so complex that it had to become transparent.
A Shared Vocabulary is the Foundation
Paving the Road Ahead
I think back to Elena W.J. and her looms. She knew that if the tension was wrong, the whole fabric would eventually tear, regardless of the material’s ‘prestige.’ Our industrial silos are high-tension environments where the threads are starting to snap. We need to stop looking at our own feet and start looking at how the person in the next field over is walking.
The hardest thing to learn is that you are not special.
If we want to stop repeating the mistakes of the past, we have to kill the ‘Not Invented Here’ monster. We have to embrace the discomfort of being a student in a room where we thought we were the masters. It’s not about losing our professional identity; it’s about ensuring that our identity doesn’t become a tombstone. The next time you see a solution from another industry, don’t look for the reasons it won’t work. Look for the reasons you’re afraid to let it in.