Hoffman and Podgurski: A relatively lawless industry and "meaningful use" of health IT, with safety as an afterthought

In their article "Meaningful Use and Certification of Health Information Technology: What About Safety?" (free PDF here), Sharona Hoffman (Professor of Law and Bioethics and Co‐Director of the Law‐Medicine Center, Case Western Reserve University School of Law) and Andy Podgurski (Professor of Electrical Engineering and Computer Science, Case Western Reserve) make an important case for what I've previously described as "putting the cart before the horse."

At my Oct. 1, 2010 post "Cart before the horse, again: IOM to study HIT patient safety for ONC" I argued that the IOM was only called in to study HIT safety after plans for a national rollout were put into law, and after a "stimulus to adoption" (with penalties for refuseniks) was financed at a cost of tens of billions of dollars.

I found this approach to HIT and the sequencing of events - the development of "meaningful use" criteria before usability and safety criteria - quite cavalier.

Hoffman and Podgurski go a step further.

They begin:

In the summer of 2010, the Department of Health and Human Services (HHS) published three sets of regulations to implement ARRA. This article briefly describes and critiques the regulations, arguing that (1) they fail to appropriately address HIT safety and (2) further steps must be taken to protect patients and serve public health needs in the new digital era.

After a brief review of the Meaningful Use and "Certification" (a.k.a. features qualification) regulations and programs, they go on to critique those regulations and programs as a "step towards comprehensive oversight," but a very deficient step considering the ambitions and timelines of the HITECH Act and the federal government.

They aptly note (along with footnotes):

... While advocates argue that computerization will reduce errors, numerous recent reports have demonstrated that the opposite can be true. Hospitals have experienced incidents in which doctors’ orders were posted to the wrong patient charts and electronic drug orders were not delivered to nurses who needed to dispense them to patients. A published 2009 review of almost 56,000 CPOE prescriptions found that approximately 1% of them contained errors. Patients who do not receive needed medication or whose treatment is otherwise mismanaged because of software or usability problems can suffer catastrophic consequences. [My own mother is sadly familiar with this latter problem - ed.]

General system safety is a property that is attainable only through rigorous processes for development and evaluation. [Evaluations of the kind the healthcare IT industry seems to have steadfastly circumvented and avoided - ed.]

However, the regulations do not address certification of EHR vendors’ software development processes or even require vendors to analyze and mitigate potential safety hazards. [In other words, they are essentially meaningless in terms of HIT safety - ed.]

Furthermore, ATCBs [ONC 'Authorized Testing and Certification Bodies'] will use testing requirements developed by the National Institute of Standards and Technology (NIST) that are apparently intended only to determine whether systems include certain features. Passing such tests is not sufficient to ensure that those features function properly in the long term and under varied operating conditions. [In other words, a preflight checklist will be conducted of aircraft that have rarely or never actually flown, to declare them flight ready for the amateur pilot - ed.]

They note the obvious:

Meticulous testing of EHR products is critical to their safety. Because of the government’s lucrative incentive payments, many new vendors may attempt to enter the market and to quickly produce EHR systems whose quality is unproven and perhaps dubious.

A key passage in this article is this:

Admittedly, clinical evaluation of new products poses challenges for vendors who would need to find facilities willing to accept the administrative burdens of assessing systems that may ultimately fail. [In other words, it will cost them to improve the safety of their products, a core competency they should have developed decades ago - ed.]

Such facilities would also experience delays in receiving incentive payments because they would use uncertified systems during the evaluation period. However, certification of HIT that has not been thoroughly evaluated is no more responsible than approval of medications or devices that have not been carefully scrutinized by the FDA.

"No more responsible than approval of medications not scrutinized by FDA" is quite on target. Personally, however, I would use a stronger term than deficient responsibility: deliberate reckless indifference to health IT safety seems more descriptive.

The authors do note that:

The delegation of EHR approval responsibilities to ATCBs will ease HHS’s regulatory burdens and likely supply an adequate pool of experts for HIT testing. HHS is authorized to monitor ATCBs through on‐site visits, reports, and review of documentation. It remains to be seen if these measures will ensure that ATCB members are qualified, competent, and free of conflict of interest. These issues will become more critical if HHS eventually requires rigorous clinical testing of EHR systems as described above.

Considering the track record of the pharma and medical device industries as presented in many case examples at this blog, "qualified, competent, and free of conflict of interest" is a tall order indeed for the health IT industry and its "certifiers."

The authors again state the obvious (albeit an "inconvenient truth"):

... it is naive to assume that any use of HIT is better than no use of HIT. [This warning echoes the "use equals success" fallacy as described by Karsh et al. in their recent JAMIA article described here - ed.]

EHR systems constitute complex technology that can introduce errors as well as prevent them. Medical errors can occur because of computer bugs, computer shut‐downs, or user mistakes that may be attributable to a flawed user interface. Through communication tools, electronic ordering, decision support features, and data management, EHR systems will guide many aspects of patient care. Treatment success will often depend on their proper functioning.

They conclude:

HHS’ new regulations constitute positive first steps and a laudable reversal of a relatively lawless approach to EHR system design and deployment. Previously, the only certification program was offered by the Certification Commission for Health Information Technology, a private industry group that was not subject to regulation.

I have used terms such as "wild west," "out of control," and "pre-Flexnerian" to describe the HIT industry. It seems quite fitting that an attorney would use the term "lawless" to describe similar observations.

Finally, they state:

Still, much more work must be done to protect public health in the digital era. We urge that future meaningful use and certification criteria and the post‐2011 permanent certification program be more attentive to safety issues.

EHR system approval should be no less rigorous than the FDA’s process for drug and device approval because HIT is as safety‐critical for patients. A prime criterion for certification should be a documented history of safe operations in a number of clinical environments.

The federal government would be wise to focus less on the speed of EHR adoption and more on product quality. Only through sufficient safeguards for EHR system safety can this technology fulfill its promise to dramatically improve individual and public health outcomes.

In my opinion, the HITECH timelines are far too ambitious (I sense a similar sentiment "between the lines" in the passage above). Perhaps it is time for those timelines to be revisited.

A new Congress should take that on, or else defund HITECH and rewrite it to prevent patient harm and debacles like AHLTA and the UK's NPfIT, saving billions of taxpayer dollars we really don't have to spare at the moment.

-- SS
