Monday, June 9, 2014

Live From Safety 2014: Human Error & Safety

Post by Safety 2014 Guest Blogger Steve Minshall, CSP, CIH

Who Has My Socks?

“So just who is it that’s replacing Todd Conklin?” I remarked to my newfound colleague, Barb Semeniuk (it means “Smith” in Ukrainian, I was informed), from Edmonton, Alberta, as I settled into my seat in the Human Error and Safety pre-conference seminar at ASSE's Safety 2014. Barb pointed to the bio sitting in front of me and then to Bob Edwards, who was then standing at the front of the room. “He’s the guy,” she said with no hint of appraisal.

I wasn’t sure who to expect since I hadn’t paid close attention when I received a phone call from ASSE telling me that Todd Conklin, the originally scheduled speaker for the seminar, was not available and that someone unknown to me was taking his place. Did I want to stick with this seminar, the caller asked? “Yes, yes,” I replied, though not because of any recognition of the new speaker’s name or credentials but because the topic still interested me even if the speaker didn’t.
OK, so I came to the seminar in a bit of a blasé funk because I was disappointed not to get to see and hear the renowned safety thought-leader and speaker, Todd Conklin. I should not have worried. In my vernacular, Bob Edwards knocked my socks off not only with the information he presented but also with his engaging presentation style.

Just who is Bob Edwards? He is the human and organizational performance advocate for General Electric – now tell me, who else do you know that has a cool title like that? Bob has a military bearing, as well he should, having spent six years in the Army; he stands more than six feet tall and looks the part of a serious person. He’s 52 but looks like he’s in his 30s. But looks can be deceiving, and Bob turns out to be one of the nicest and most engaging people I’ve had the pleasure of meeting. (He also has a son in the Marines who just returned from active duty in Afghanistan – many thanks to Bob and his son for their service to our country.)

His degree is in mechanical engineering, and his resume includes work in design, mechanical and technical support teams, and as a safety leader. And, as it turns out, Bob is not just some hapless recruit brought in to fill in for Todd Conklin (who had to jet off to New Zealand). Bob has worked with Todd and is a student of the Conklin philosophy. He may have his own presentation style, but he’s clearly an advocate of Dr. Conklin’s work. With apologies to Dr. Conklin for no longer being disappointed that he could not be here to speak, Bob was terrific and he exceeded my expectations for the content and delivery of the material.

We covered many topics, but the central messages I heard were these: 1) Blame and punishment will not advance our safety efforts; 2) learning and improving, as mundane as those two words may sound, supplant blame and punishment as the means to safety improvement; 3) compliance enforcement is important, but it does not equate to more safety; 4) we can do better incident investigations by starting with the process; 5) work as we imagine it is accomplished is not the same as how it is accomplished in reality; and 6) we should be less interested in human error and more interested in learning. Let me briefly explain each of these.
  1. Blame and punishment. I have to admit I’m not big on punishment. And yet I, and perhaps you as well, have experienced situations where people have been disciplined or even let go as the result of an accident – and this happened before the investigation was fully underway or the facts fully known; those doing the firing felt justified because they had removed a “problem” from the company. Bob illustrated this phenomenon with the recent horrific sinking of a South Korean ferry, which resulted in the deaths of many school-age children. The ferry captain, who by all accounts is a good man and had actually been trying to save the ferry, is under arrest for murder. The overriding emotion, though, is that those in charge desperately want to know that the “system” is in balance, and if the captain is out of the system, everything will be okay again. What does firing the captain, charging him with murder, and arresting the crew truly do to help understand this tragedy? What learning takes place under these conditions? Those in charge may feel morally justified, and they may have satisfied an intense need to re-exert control, but these actions do nothing to advance learning, and the opportunity remains for another ferry to sink.
  2. Learning and improvement. Perhaps the concept of the value of learning and improvement over blame and punishment is not so new, especially when you think of Stephen Covey’s advice to “seek first to understand and then to be understood.” In the context of this seminar, though, Bob advocates taking an approach that allows those involved in an incident to tell you their story – and there may be multiple stories from several people – because it is the sense we make out of these stories that allows us to learn. Learning ties into understanding which allows us to design better defenses and we then have more success in preventing injuries.
  3. More compliance does not equal more safety. Compliance is necessary but not sufficient. If you work in a system that has many problems, compliance along with the rules it brings will help make the system safer. As safety improves, though, more rules and more enforcement do not bring along greater safety. “Doing” safety harder and with more intensity does not make for greater safety. If your approach is compliance and enforcement, these words seem heretical. If you work where compliance and enforcement are the norm but you realize that new gains in safety are not coming like they once did, the light may be coming on and you’re nodding in agreement. Keep nodding; you are right.
  4. Start with the process when conducting incident investigations. A common approach in incident investigations is to start with the event and work backward. Everything seems so linear: this happened, which was preceded by this happening, and that was preceded by something else, and so on. This linear construction can happen because of the availability of 20/20 hindsight – we already know what happened, so we can think of it in this straight-line fashion. Such an approach disregards how the people doing the job observed things in the moment. The now-known outcome may not have entered their thinking at the time. Therefore, Bob's recommendation is to first find out about the process and understand it from the point of view of the person who actually does the work. Taking time to listen to and understand that narrative is likely to reveal more information than you would ordinarily uncover in the typical hazard-hunt method of investigation.
  5. Work is not the straightforward thing we imagine it to be. Bob illustrated this point by showing us a straight black line that represented the way work is imagined to take place. He then showed us a blue line that represented how work takes place in practice; the blue line was the more tortuous path, full of diversions, interruptions, and other realities of work life. Both lines lead to the same outcome but the blue line represents how things really get done. The black line, if that’s your view of work, is somewhat judgmental and shows a lack of understanding of what it takes to get a job done. Without taking the time to understand the blue line, we cannot learn what we need to know to design better defenses and prevent injuries.
  6. Be less interested in human error and more interested in learning. My takeaway here was that we need to acknowledge that we humans are imperfect and we will make errors. If we can get past that, we can learn more and consequently design systems that are more error tolerant and resilient. Automotive engineers have learned this lesson; they know people will have crashes (make errors). So now they design and build automobiles with crumple zones, lane-drift indicators, crash-avoidance systems, early-braking systems, and survivability zones built around the use of air bags and seat belts. These design engineers get it, and so should we.
It is a bit of a leap of faith to accept that we are imperfect, that we should acknowledge and embrace that, and that we should design work systems to be more error tolerant and resilient. This thinking may startle some and outright displease others who are strongly invested in the way we’ve been working at safety for a long time. But I find this to be fascinating stuff, perhaps not yet perfected, but still a move beyond the classic ways of thinking about safety. It is exciting to be around people who embrace this new approach and, frankly, today’s session knocked my socks off. If you find them, please let me know.