A Flight Plan for Healthcare: Diagnosing Why We Need More Efficient Regulation
Having recently sat at the controls and flown a plane for the first time, I've been thinking about the intersection of healthcare regulation and my newfound personal interest in aviation. I suppose this post can be filed in the “economist gets a hobby, but ends up thinking about economics anyway” section. If this is your thing, see Can a tiny fish and an old hot dog teach us anything about whether health insurance reduces mortality?
The Swiss Cheese Model, developed by James Reason, is widely applied in aviation. It illustrates how multiple layers of safety systems and procedures, none of which is individually foolproof, make aviation incredibly safe when stacked. In fact, with enough carefully designed layers, you prevent all accidents except for those rare occasions when the holes align perfectly, often due to a unique combination of previously unforeseen circumstances.
This model is also a good dynamic description of accident prevention in aviation. Accidents and even minor incidents are investigated by the airlines and aviation authorities across the world, leading to reports with recommendations to add more, or tweak existing layers, in order to prevent the previously unforeseen alignment.
To give you an example, the Airbus A320 that Captain Chesley "Sully" Sullenberger famously landed on the Hudson River had fly-by-wire flight controls powered by the aircraft's engines. In the rare event of an engine failure (ChatGPT says a rate well below 1 in 100,000 flight hours), the pilots would still be able to control the plane using power from the other engine. In the event of a dual-engine failure (an event so rare ChatGPT won’t give me a rate, but instead lists just two examples), the plane actually has two additional systems: the Ram Air Turbine, an almost cartoonish windmill device that automatically pops out from under the plane and generates enough electricity to power the most essential systems, and the Auxiliary Power Unit (APU), a gas-powered turbine capable of maintaining all flight-control systems, including the more advanced safety features. (ChatGPT tells me there are also emergency batteries that can power the most critical systems and instruments, but I think we all get the idea at this point.)
That’s great! It would be crazy if we didn’t try to learn from our failures, and in the Swiss Cheese model, it is these kinds of redundancies that make the whole enterprise safe. But while it’s a great starting place, much like economists will start to think about virtually anything through the lens of supply and demand, let this dynamic play out long enough, and perhaps a different model would be better.
Even though the simplest version of this model always gains safety from adding layers, you’ll notice that as you keep adding them, each subsequent layer is almost entirely redundant. This should make you wonder: what is the model not showing?
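The diminishing returns can be seen with a little arithmetic. Here is a toy sketch (the 1-in-100 hole probability and the independence assumption are mine for illustration, not part of Reason's model): if each layer independently fails with probability p, an accident requires every layer to fail at once, so the risk with n layers is p to the power n, and each added layer removes far less absolute risk than the one before it.

```python
# Toy illustration of stacked safety layers (assumed numbers, not from the post).
# Each independent layer has a "hole" with probability p; an accident needs
# all n holes to align, so the accident probability is p ** n.
p = 0.01  # assumed per-layer failure probability

for n in range(1, 6):
    accident_prob = p ** n
    marginal_gain = p ** (n - 1) - p ** n  # absolute risk removed by layer n
    print(f"{n} layers: accident prob {accident_prob:.0e}, "
          f"risk removed by this layer {marginal_gain:.1e}")
```

By the fifth layer, the additional risk reduction is vanishingly small, yet (as the next paragraph argues) the cost of operating that layer every day does not shrink with it.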
The obvious answer is that the model fails to convey the cumulative costs that adding more and more mostly redundant slices can have in the other direction. Gordon Tullock likened this kind of regulatory accumulation to throwing rocks into a river: while no single rock noticeably alters the river's course, the collective impact over time directs the stream in unintended and unpredictable ways.
Captain Sully’s experience with US Airways Flight 1549, which hit a flock of geese shortly after takeoff, actually highlights the crux of the issue. Standard procedure for a dual-engine failure has the pilots execute checklists of specific steps (not unlike the MDS forms nursing homes are required to complete on a regular basis, for example). The checklist Captain Sully and his first officer were meant to execute can be seen here, but it was unfortunately designed for dual-engine failures at cruising altitude, calling for multiple rounds of specific configurations of buttons and switches, each round followed by 30 seconds of wait time, before attempting to reignite the engines.
Keeping his wits about him, and drawing on decades of experience, Sully realized immediately that complying with the checklist, designed for a scenario that differed from the one he was facing on just one critical margin, would doom his flight of 150 passengers. Sully instead chose to deviate from protocol by activating the plane’s Auxiliary Power Unit just two seconds after hitting the birds. This “Sully” moment, an almost instantaneous decision, returned safety features to the flight control systems and was pivotal to saving the lives of all 150 passengers.
Readers of this blog will not be surprised to learn this incident led to a series of requirements and recommendations, including — you guessed it — a checklist for dual-engine failures at low altitude. Now that list may be a good idea, even if dual-engine failures at low altitude are extremely rare, and I certainly won’t take a stance on whether there are too many safety layers in aviation after one flying lesson.
But what I will point to is that while Sully is rightly portrayed as a hero both in the movie starring Tom Hanks and in the NTSB’s official report, that same report actually describes another Sully moment we can learn just as much from. As Sully was landing on the Hudson, the report notes that he experienced "task saturation," due in part to the extensive and lengthy checklists and protocols that were poorly designed for the scenario he was in. While intended to realign a previous hole in the layers of the Swiss cheese, the inclusion of procedures for scenarios both with and without fuel remaining, along with several ditching procedures, actually added to the complexity of Sully’s situation and led him to approach the landing on the Hudson at a lower than intended speed. Interestingly, the safety features Sully brought back by activating the APU were critical to the landing being relatively smooth despite the low speed, and ultimately to the survival of everyone on the plane.
While the immediacy of the consequences differs, my claim is that a regulatory approach of requiring very specific procedures for almost every aspect of healthcare delivery leaves doctors and nurses task-saturated every single day.
Take something seemingly simple like administering medications. Nurses are required to double-check each medication against the patient's chart, confirm the patient's identity, and document the administration details in a medication administration record (MAR). Now do this for 20 patients, and you see where I’m going. While each of these steps could prevent the unholy alignment of holes in the layers of Swiss cheese, there’s no safeguard in this model that ensures we prevent that alignment at the lowest cost.
Of course, following a slightly imperfect procedure won't harm a patient in a nursing home the way Sully's impossible checklist would have doomed his passengers — it will just waste some time. But that's exactly the point. The accumulation of slightly imperfect procedures, followed for every single patient, every single day, adds up to dramatically less time for patient care. And with each imperfection being minor, no doctor or nurse would ever recognize the higher imperative and deviate from their checklist the way Sully ignored his to activate the Auxiliary Power Unit.
We need a collective Sully moment in healthcare regulation, though the point I’m trying to make is not that we should ignore rules and procedures. A "Sully" moment in healthcare regulation would be for us all to recognize the cumulative costs of ever more protocols: no one set of procedures can be optimal for the vast variety of circumstances that occur across 15,000 nursing homes on any given day, and it is impossible to devise optimal procedures for all or even most circumstances without simultaneously creating an extremely large and cumbersome set of procedures that wastes crucial time and resources.
No amount of tweaking can ensure that nothing bad ever happens. What we can prevent, however, is the accumulation of layers and protocols becoming a burden that pulls well-meaning people away from serving their patients in order to remain compliant. The cumulative effect of compliance is not just delay, but ultimately reduced time for patient care. We should reevaluate our approach to ensure that the safeguards we keep enhance rather than impede providers' ability to deliver care, and start getting rid of the layers that don’t.