Flawed technology, flawed humans
Recently I borrowed my father's car and noticed that the dashboard's oil indicator light was illuminated. When I mentioned this to my dad, he said, "Don't worry. It doesn't mean anything. It's always on."
Hmm. So I asked him, "If it's always on, how do you know when something is wrong? If there really is an oil leak, don't you risk burning up the engine?"
Crickets from my dad.
I suppose an always-on oil light is the automobile version of alarm fatigue in healthcare, as when a doctor clicks through yet another drug-drug interaction warning with barely a glance, or when staff at a nurses' station ignore an alarm because 90 percent of the unit's alerts are medically insignificant.
Most of the time, the alarms and warnings don't mean much. Until a patient is administered a medication with potentially lethal contraindications, or a patient falls and is left unattended.
The opposite of alarm fatigue (and ignoring malfunctioning dashboard lights) is blind trust in existing people, processes, and technology. It's that sort of trust that almost killed Dennis Quaid's twin infants, when nurses precisely followed dosage instructions and gave the babies 1,000 times the intended dose of heparin.
Clinical staff at several Lifespan hospitals similarly trusted software that was later blamed for generating dosing instructions that specified the wrong form of a medication. Although no harm was done, more than 2,000 patients were affected.
Patient safety advocates can provide endless examples of medical mistakes due to the faulty use or design of EHRs, such as the incorrect selection of items in a dropdown menu; the continuous copying of erroneous data from old notes into new notes; or notes inadvertently entered into the wrong patient's chart.
Yes, technology has its flaws. But of course, human beings are flawed as well.
One of the biggest and most human mistakes we make is to assume. This can be particularly dangerous when incorporating technology into healthcare. There's risk in assuming an alarm is meaningless just because it's insignificant 99 percent of the time. Likewise, assuming an EHR is error-free can have deadly consequences.
Regardless of whether users are ignoring technology or blindly following it, ultimately it's up to the individual caregiver to apply his or her medical training and use common sense, which includes always being on guard for that 1 percent of the time when something out of the ordinary might occur.
The ONC seemingly agrees. Last year the agency introduced nine Safety Assurance Factors for EHR Resilience (SAFER) Guides to help make health information technology safer, including a guide on Organizational Responsibilities. That guide lays out best practices for helping organizations monitor the safety and safe use of health IT, as well as recommendations for keeping clinicians, staff, and technology developers continuously engaged and focused on safety.
Specifically, the guide notes that "safe EHR implementation is critically dependent on the people involved" and requires "attention to social as well as technical matters." All the guides combine evidence-based guidelines with practical advice, and they are designed to give organizations a framework for self-assessment and for developing customized safety improvement plans.
Technology holds great promise to improve healthcare safety, efficiency, and quality, but human oversight remains essential. Perhaps we can ignore a defective oil light on the dashboard, but in healthcare we can't afford to be so complacent.