Description

About the Guest:

Paul Gruhn developed the world’s first commercial safety system modeling software in 1994, served as ISA president in 2019, and co-chaired the ISA 84 Standard Committee for years.

Paul’s philosophy:

Doing the modeling doesn’t make your plant any safer. People don’t have accidents because you got your model wrong.

Episode Highlights:

🎯 From Forensics to Functional Safety

Gruhn shares how a chance encounter at an ISA show launched his 40-year career in safety engineering. He worked alongside a 70-year-old English reliability consultant and, when the consultant passed away, unexpectedly inherited his role. Paul then dove deep into ISA standards work, presenting at DuPont, where he caught the attention of the 84 Committee chairman. His journey reflects how expertise develops through seizing opportunities and committing to lifelong learning.

📋 The Battle for ISA 84

Get an inside look at the heated debates behind safety standards development. When the 84 Committee started, 50 people would show up to three-day meetings where raised voices could be heard through hotel partition walls. Markov modeling advocates clashed with algebraic equation supporters and fault tree proponents, yet none could agree on the “best” method. The solution: publish all three approaches with examples and let practitioners choose what works for them.

🏗️ The Jenga Tower of Management of Change

Discover why management of change failures—not calculation errors—cause real-world disasters. Gruhn uses a powerful Jenga tower analogy: at design, all pieces are in place, but poor change management slowly removes blocks without replacement. The tower still stands, so management assumes safety, but one small push causes total collapse. This pattern explains Bhopal (five years old when disaster struck), Texas City (distillation column overflowed eight times before the fatal accident), and countless other tragedies.

🤖 The AI Dilemma

Learn why Paul is “terrified” of AI’s role in process safety. Junior engineers will trust AI outputs without understanding the gaps, while downsizing eliminates the senior experts who can spot missing elements. Unlike hand calculations, which can be used to verify a model, AI gives different answers to the same question asked twice—dangerous when lives depend on consistency. Today, AI may help populate HAZOPs or safety reviews, but relying on it completely risks creating engineers who don’t know what they don’t know.

💰 Bean Counters vs. Engineers

Explore how financial managers replaced technical leaders in plant operations over the past 50 years. These “bean counters” don’t understand plant operations and slash budgets for things they can’t comprehend on spreadsheets. The result: increased reliance on contractors who lack process knowledge, maintenance outsourced to service companies, and owner-operators reduced to managing contractors rather than doing the work themselves. Paul describes facilities where, after 30 minutes of questioning, he wanted to turn around and leave.

🔒 Cyber Security Reality Check

Discover how cyber security failures mirror functional safety failures—they’re not product problems, they’re people problems. Gruhn shares a real case where a DCS vendor’s virus-infected laptop shut down an entire oil processing facility because the DCS, the safety system, and the office network all shared the same infrastructure. The safety system did exactly what it was designed to do when communication failed, yet the end user blamed the vendor rather than their own poor network segregation.



This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit parakeetinc.substack.com