All too often, when government, technology, and safety intersect, the Pogo line from our Vietnam-era past - "We have met the enemy and he is us" - comes back to haunt us.
No time like the present. In light of the Federal incentives to adopt electronic health records systems and the "meaningful use" criteria, I've been wondering how soon we would start seeing adverse event issues. Look no further.
The Boston Globe published a fascinating and chilling account of these on July 20 ("Hazards tied to medical records rush - Subsidies given for computerizing, but no reporting required when errors cause harm").
Several observations occur to me.
- I've heard it said that "software can't hurt or kill anyone." Maybe that's strictly true at the bottom-most level, but I'm sorry, folks: anything that couples automated processes with medical treatment inevitably has a safety component. Just think about the examples in the Globe article. Remember the old theatrical device "deus ex machina" (god from the machine): if an order or a piece of information came from the computer, then it had to be right, didn't it? I work in software quality, so I know better - but too many who use electronic health records systems don't understand that. Why can't we design such systems with hazard mitigations built in - internal checks to prevent overdosing, underdosing, or unnecessarily treating a patient?
- Every good developer knows that great software starts with understanding the end user - his/her environment, workflow, understanding level, key concerns, and such. Why, then, do we have health records software systems in place at so many hospitals and clinics where the doctors and nurses complain that they're cumbersome, confusing, and just plain difficult to use?
- We may complain about the FDA being slow, cumbersome, bureaucratic, and worse - but the FDA is to be lauded on one item. Everything they review is subject to two key questions: "Is it effective?" and "Is it safe?" Somehow, those who put in place the incentives for electronic medical records systems have forgotten about the second of those questions. Why?
- Every other industry with a recognized safety impact - automotive, nuclear power, airline transportation, chemical manufacturing, and of course pharmaceuticals and medical devices - is subject to some kind of mandatory safety reporting. These industries do just fine in the U.S., despite some of the incredible safety stories that get reported. Notice, in the Globe article, that the Food and Drug Administration and the ONC (Office of the National Coordinator for Health IT) have declared that "... they do not intend to exercise safety oversight of clinical record-keeping or of computer systems that manage prescriptions. Nor do they plan to impose a system of mandatory death and injury reporting." In addition, the electronic health records industry representatives universally oppose mandatory safety reporting. What do they have to hide?
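To make the first bullet's point concrete, here is a minimal sketch of what a built-in hazard mitigation could look like: an order-entry check that refuses to file a prescription outside a drug's configured safe range. The drug names, dose limits, and the `OrderError` type are all illustrative assumptions for this sketch, not clinical guidance and not taken from any real EHR system.

```python
# Sketch of a built-in dose-range safety check for an order-entry system.
# All names and limits below are hypothetical examples, not clinical data.

from dataclasses import dataclass


@dataclass(frozen=True)
class DoseLimits:
    min_mg: float  # below this, flag a possible underdose
    max_mg: float  # above this, flag a possible overdose


# Illustrative ranges only -- a real system would load these from a
# vetted, clinically maintained formulary.
SAFE_RANGES = {
    "warfarin": DoseLimits(min_mg=1.0, max_mg=10.0),
    "metformin": DoseLimits(min_mg=500.0, max_mg=2550.0),
}


class OrderError(ValueError):
    """Raised when an order fails a safety check."""


def check_order(drug: str, dose_mg: float) -> None:
    """Reject any order outside the configured safe range for the drug."""
    limits = SAFE_RANGES.get(drug)
    if limits is None:
        # Unknown drugs fail closed: no range on file means no order.
        raise OrderError(f"no safety range on file for {drug!r}")
    if dose_mg < limits.min_mg:
        raise OrderError(f"{dose_mg} mg of {drug} is below the safe minimum")
    if dose_mg > limits.max_mg:
        raise OrderError(f"{dose_mg} mg of {drug} exceeds the safe maximum")
```

The point is not the specific numbers but the architecture: the check fails closed (an unknown drug is rejected, not waved through), so the computer's answer is never silently presumed correct.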
Mind you, I understand that there's plenty of blame to go around. Systems badly designed with little thought for UX (user experience), implementations that have produced strange hybrids with confusing overlaps, and the inability of different healthcare departments to communicate with one another (an issue that predates the electronic medical records mess) all contribute to the tenuous safety environment.
I happen to believe that the glass is half full, not half empty. These systems CAN improve care, enhance patient safety, and lower healthcare costs. How many more deaths and near misses will it take, however, before we get there?