Stephen Harden

What Healthcare Must Learn from the 737 MAX




Two fatal accidents involving the new Boeing 737 MAX reveal two important patient safety lessons for all physicians, mid-level providers, nurses, staff, and administrators.

If you fail to learn them, you will make the same two mistakes that doomed nearly 350 passengers and crew.


First, Understand the Two Methods of Designing a Safe Operating System


One method is to design an operating system that is triply redundant. A critical system has one computer to manage its operation, a second computer to monitor that operation and act as a backup if the primary fails, and a third computer to serve as the tie-breaker in the event the primary and the backup disagree.


All newly designed fly-by-wire airliners are built to be triply redundant, and the certification standard for these systems is one failure in a billion flight hours. Examples of these aircraft are the B777, B787, and A380.
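
To make the tie-breaker idea concrete, here is a minimal Python sketch of a two-out-of-three "voting" arrangement, the general technique behind triple redundancy. It is not drawn from any actual avionics code; the channel readings and tolerance value are illustrative assumptions.

from statistics import median

def triplex_vote(primary, backup, tiebreaker, tolerance=0.5):
    """Combine three redundant channel readings into one output.

    If one channel drifts or fails, the median automatically sides with
    the two channels that still agree, so a single failure is masked.
    """
    readings = [primary, backup, tiebreaker]
    voted = median(readings)
    # Flag any channel that strays too far from the voted value so the
    # failed channel can be reported, even though the system keeps
    # operating normally.
    failed = [i for i, r in enumerate(readings) if abs(r - voted) > tolerance]
    return voted, failed

# Example: the backup channel has drifted, but the voted output is still sane.
value, failed_channels = triplex_vote(5.1, 9.7, 5.0)
print(value, failed_channels)  # 5.1 [1]

The point of the sketch is that the design itself absorbs a single failure; no human has to notice anything for the output to stay correct.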


The second design standard relies on the age-old concept of the human pilot as a critical element of the system. Before fly-by-wire (FBW) came along, nearly all critical systems in airplanes of every size counted on the pilot to be a crucial part of the system's operation. Examples of aircraft designed and certificated under this standard are the B727, the original B737, and the Super 80.


The certification concept for relying on the human involves recognition of a failure and a reaction time. It works like this: the pilot must recognize the failure, take three seconds to analyze what is wrong, and then take corrective action before the airplane reaches a critical condition.
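
As a rough illustration of that time budget, the human-as-backup model only works if recognition, analysis, and corrective action together fit inside the time the airplane takes to reach a critical condition. This short Python sketch uses illustrative numbers, not certification data.

def human_backup_is_adequate(recognition_s, analysis_s, action_s, time_to_critical_s):
    """Return True if the pilot's total response fits the available window.

    recognition_s: time to notice that something has failed
    analysis_s:    time to work out what is wrong (the certification
                   assumption described above is about three seconds)
    action_s:      time to complete the corrective action
    time_to_critical_s: time until the airplane reaches a critical condition
    """
    total_response = recognition_s + analysis_s + action_s
    return total_response <= time_to_critical_s

# Illustrative numbers only: when recognition takes longer than assumed,
# the safety margin disappears.
print(human_backup_is_adequate(2.0, 3.0, 2.0, 10.0))  # True  - margin remains
print(human_backup_is_adequate(8.0, 3.0, 2.0, 10.0))  # False - window exceeded

Notice that the fixed part of the budget is the three seconds of analysis; the fragile part is how quickly the failure is recognized in the first place.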


The 737 MAX is not a new fly-by-wire design; rather, it is based on a 40-year-old airplane and is therefore a combination of the two design methods. While some of the MAX's updated, modern features may have triple redundancy, the system believed to have caused the accidents (the Maneuvering Characteristics Augmentation System, or MCAS) was designed, and certified as safe for flight, with humans as the critical backup element in the system.


The Critical Lessons for Healthcare


1. For all technology you use to treat patients, you must know who, or what, is the backup. If the EHR, medication device, pump, or other technology you use to treat patients is not designed to be triply redundant, then YOU are the backup. Are you familiar enough with the design of the equipment to know whether it was designed with you as the critical backup for its failure modes? If not, you are placing too much trust in the device always working as designed. The technology WILL fail someday, and your patient will be harmed. That is not a recipe for high-reliability healthcare.


2. If you are the backup, you must know the likely failure modes. For airplanes without triple redundancy, safety and reliability are predicated on the pilot being able to recognize the failure and take action within three seconds. While clinicians typically have more than three seconds to react, they have much less time than they think to recognize that a high-tech device is not working properly and to take action to prevent patient harm. If you are the designated backup, you must answer three questions:

1. What is most likely to fail?

2. How would I recognize that?

3. What would I do about it?


3 Things I Believe the Accident Investigations Will Reveal


The investigations into the causes of the two accidents are not complete, and there is always great danger in speculating and acting on unfounded judgments. However, based on what is already known, I believe these three points (among others) will appear in the completed investigations:

1. Pilots were not aware they were the primary backup to the flight augmentation system.

2. Pilots were not sufficiently trained to recognize the failure modes of the system.

3. Pilots were not sufficiently trained to react within three seconds to the failure mode.


Please don't repeat these mistakes with your patients.


Want to get your healthcare team trained to back each other up? The evidence base is clear: a culture of accountability in which teams cross-check each other and speak up when they perceive a problem with patient care produces more effective patient safety and quality initiatives. Schedule a call today.

