
Hunt the platypus to expose process safety risks

Oct. 30, 2024
Is it a harmless duck or a venomous male platypus lurking in the bushes? Trish Kerin helps identify the biases that cloud our safety judgment

When she found herself standing in a puddle of jet fuel years ago, Trish Kerin started looking for weak or invisible signals that could indicate imminent process safety incidents.

“We were installing a jet fuel line with a new ball valve and a spring-loaded coupling, but found the valve was sticky and wouldn’t close. I asked the technician to disconnect the coupling, but it still wouldn’t close, and he was soon covered in jet fuel, and I was standing in that puddle,” said Kerin, who is now director of the Institution of Chemical Engineers (IChemE) Safety Centre. “We later discovered that the ball valve had been installed upside down, and we’d bent the coupling spindle and caused the fuel spill. The hard-to-turn valve was the weak signal that we’d missed.”

Kerin reported that users in the process industries often talk about black swan events that can’t be foreseen or prevented. However, she believes there are no black swans, only weak signals that go unseen, missed or ignored. She presented “Seeing the invisible” at the YNOW2024 conference in Houston.

Kerin’s and her technician’s experiences, and those of countless others, are why a greater understanding is needed of what weak signals actually are. “They can be disconnected datapoints or they can be just a gut feeling,” said Kerin. “We’re usually told to ignore gut feelings, but they’re being reexamined because they’re often related to the brain recognizing patterns.”

Identifying ducks and biases

Kerin’s metaphor for seeking weak signals is the duck-billed platypus, which has a bill like a duck, a tail like a beaver and webbed claws like an otter, and lays eggs despite being a mammal, while its males can deliver a venomous sting.

“If we only see a bill, we may think it’s a duck that can’t hurt us, but it may actually be a platypus that can hurt us,” explained Kerin. “This is the lesson of weak signals that may be waiting to sting us. We often don’t see and recognize weak signals, and this is due to all the biases in how we think. Biases usually help us deal with lots of data, so we can make decisions, but those decisions and their results can also be wrong.”

All the basic biases

Kerin reported these biases include:

  • The ostrich effect, which involves sticking our heads in the sand because we don’t want to see what’s going on. It’s resolved by encouraging people to be curious and think about what’s actually happening.
  • Confirmation bias, which is looking for what we want to find. The solution is to ask others what they’re seeing in a dataset and compare it to what we see.
  • Anchoring bias, which shows up when everyone is asked what they think, but no new ideas emerge because each person anchors on, agrees with and confirms what’s already been said. This can be corrected by asking everyone to write down their ideas before they reveal them.
  • Framing bias, which is noticing a thing or situation more often because one’s brain has been triggered ahead of time to look for it. This bias can be turned into a positive by talking about projects or hazards, so that users will be ready to recognize problems when they appear.
  • Illusory truth effect, which is looking at data in the usual way, and thinking it indicates one situation when the opposite is actually true. This is similar to the cautionary watermelon model for process safety, which looks like a green situation on the outside but is actually red on the inside. The solution is digging deeper into data and metrics to confirm what’s actually true.
  • The curse of knowledge is assuming that what we know is already known and understood by everyone else. The solution is, when work assignments are made, to ask colleagues to repeat them back, so they can show they understand what’s been requested.
  • Pareidolia is erroneously assembling pieces of unconnected data into images that appear meaningful but aren’t. The fix is seeking different viewpoints or data to build a true picture.
  • Anecdotal fallacy is believing events we’ve never experienced can’t happen and can’t be real. The cure is likely more experience.
  • Collective probability bias applies to situations that have a very small chance of occurring, so they’re ignored due to thinking they won’t happen.
  • Information bias is continuing to gather data until it’s no longer useful, waiting too long to make decisions, and just getting more conflicted. The solution is bringing in a second set of eyes to check what we’re doing.
  • Illusion of control is performing routine tasks and growing complacent enough to ignore weak signals.

Hunting pesky platypuses

To find weak signals, determine what they’re saying and manage them, Kerin advised following her “platypus” action plan for feeling, investigating, naming and determining what to do about them. Its eight steps are listed below, followed by a short sketch of how they might be tracked as a checklist:

  • P is for “partial sight.” This is like glimpsing a duckbill and not being sure what it belongs to. It can show up as unexpected alarms and near misses, or in plant observations and audit findings.
  • L is for “link the data.” It asks: what else is happening, where is the weak signal appearing, what other information sources do you have, and are there other weak signals occurring at the same time?
  • A is for “assess the data.” It asks: what are the specific conditions and process parameters? It recommends looking at the data from a different perspective, challenging assumptions and asking what can be seen when the data is viewed differently.
  • T is for “task and timing.” It asks: did the weak signal happen during a specific task, at a particular point in a task or at a specific time of day, and were there any common situations at that time?
  • Y is for “yesterday and yonder.” It asks: has this signal or event happened here before, did it happen somewhere else, what conditions existed when it happened before, are there similarities, and were previous incidents accompanied by these conditions?
  • P is for “perceive the scenarios.” It asks: is this situation a potential incident or just a plant upset, are there other potential incidents nearby, what are the hazards and possible consequences, and can the risk be assessed?
  • U is for “understand the controls.” It asks: what controls can you put in place for each scenario, how will you know if the controls are working, and what are their performance criteria?
  • S is for “secure the platypus.” Once weak signals are found, it advises implementing controls, communicating with coworkers to consult on risk assessment, documenting the risk assessment and controls, and reviewing and auditing their effectiveness.  
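
For teams that track such reviews in software, here is one way the eight prompts might be captured as a simple checklist. It’s a minimal Python sketch only: the step wording is paraphrased from the list above, and the names (PLATYPUS_STEPS, WeakSignalReview) are illustrative, not part of IChemE’s or Kerin’s materials.

# Hypothetical sketch: the eight PLATYPUS prompts captured as a simple
# checklist so a team can record its answers for each weak signal it reviews.
# Step wording is paraphrased from the article; names are illustrative.
from dataclasses import dataclass, field

PLATYPUS_STEPS = {
    "P1": "Partial sight: what was glimpsed (unexpected alarm, near miss, observation, audit finding)?",
    "L":  "Link the data: what else is happening, and where is the signal appearing?",
    "A":  "Assess the data: what do the conditions and parameters show when viewed differently?",
    "T":  "Task and timing: did it occur during a specific task or at a particular time?",
    "Y":  "Yesterday and yonder: has this happened here or elsewhere before, and under what conditions?",
    "P2": "Perceive the scenarios: potential incident or plant upset, and what are the consequences?",
    "U":  "Understand the controls: what controls apply, and how will you know they're working?",
    "S":  "Secure the platypus: implement, communicate, document, then review and audit.",
}

@dataclass
class WeakSignalReview:
    """Record of one weak signal walked through the PLATYPUS prompts."""
    signal: str
    answers: dict = field(default_factory=dict)

    def answer(self, step: str, response: str) -> None:
        if step not in PLATYPUS_STEPS:
            raise KeyError(f"Unknown step {step!r}; expected one of {list(PLATYPUS_STEPS)}")
        self.answers[step] = response

    def open_steps(self) -> list:
        """Prompts not yet answered -- a quick measure of review completeness."""
        return [s for s in PLATYPUS_STEPS if s not in self.answers]

# Example: logging the sticky ball valve from Kerin's jet fuel story.
review = WeakSignalReview(signal="Ball valve hard to turn during jet fuel line installation")
review.answer("P1", "Valve was sticky and wouldn't close during commissioning")
print(review.open_steps())   # the remaining prompts still to be worked through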

“Most users have a lot of data coming in, so there are already many weak signals happening all the time. If users tried to check them all, they’d have no time to do anything else,” explained Kerin. “The question is whether another control system can perhaps be enlisted to deal with weak signals in the background. At the same time, we can’t just add more controls and barriers that aren’t really doing anything because they don’t make us any safer. They just make us less safe.”
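
One way to read that question is to screen incoming weak signals automatically and surface only the ones that keep recurring. The sketch below assumes signals arrive as timestamped, tagged events; the function name, window and threshold are illustrative assumptions, not something Kerin described.

# Hypothetical sketch of the "background screening" idea: instead of manually
# reviewing every weak signal, flag only signal types that recur within a time
# window, so engineers look first at the repeats. The event format, window and
# threshold are illustrative assumptions.
from collections import defaultdict
from datetime import datetime, timedelta

def recurring_signals(events, window=timedelta(days=7), min_count=3):
    """Return signal tags seen at least `min_count` times within `window`."""
    by_tag = defaultdict(list)
    for tag, when in events:                 # events: iterable of (tag, datetime)
        by_tag[tag].append(when)

    flagged = []
    for tag, times in by_tag.items():
        times.sort()
        # Slide over the sorted timestamps looking for a dense cluster.
        for i in range(len(times) - min_count + 1):
            if times[i + min_count - 1] - times[i] <= window:
                flagged.append(tag)
                break
    return flagged

events = [
    ("valve-hard-to-turn", datetime(2024, 10, 1)),
    ("valve-hard-to-turn", datetime(2024, 10, 3)),
    ("valve-hard-to-turn", datetime(2024, 10, 5)),
    ("unexpected-alarm",   datetime(2024, 10, 2)),
]
print(recurring_signals(events))  # ['valve-hard-to-turn']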

About the Author

Jim Montague | Executive Editor

Jim Montague is executive editor of Control.