MDNG Endocrinology

March 2007
Volume 9, Issue 4

Technology Gone Bad? False Starts, Poor Design, and User Error in Health IT

New healthcare technologies promise to make the practice of medicine safer, more efficient, more transparent, and more effective. But what happens when things don’t go according to plan?

Information technology has become an increasingly important part of the healthcare industry. The latest tools and resources promise not only to decrease the workload for many healthcare professionals by making practice more efficient, but also to enable practitioners to deliver higher-quality care to their patients. Current technologies such as electronic health records (EHRs) have streamlined the process of collecting, storing, and updating patient information. However, it often seems that for every successful new application of technology, there is also one that does not meet expectations.

For some technologies, initial failure may only briefly derail their development. In other cases, the failures are beyond repair, and the project is eventually pushed aside to be replaced by a newer, better one. Technologies often fail because of poor designs that do not sufficiently take into account the needs and habits of the intended end user. This can be annoying or inconvenient when the technology in question is as trivial as a video game controller, but “design-induced errors in the use of medical devices can lead to patient injuries and deaths.” Because the stakes are so high, human factors engineering (HFE), “the study of human abilities and characteristics as they affect the design and smooth operation of equipment, systems, and jobs,” is becoming an increasingly vital design component in high-tech healthcare applications.

MDNG spoke with several experts who are actively involved in HFE and patient safety, and learned that some of the biggest contributors to technological failures are usability issues, poor implementation, the lack of evaluation before and after implementation, and inadequate application of HFE. When asked to identify an underlying reason for technology failures, Dr. Alex Kirlik, professor of human factors and acting head of the Human Factors Division at the University of Illinois at Urbana-Champaign, said, “[tech failures] highlight the fact that the performance of humans and technology are inexorably linked. But even more importantly, [failure] also reveals the very common attitude, shared by most technology designers and an increasing number of technology users alike: that if the technology fails, it’s the user’s fault, but in most cases, exactly the opposite is true. Those of us involved in human factors engineering have worked to successfully combat this attitude in areas such as nuclear power, and we are now trying to bring the same mindset to medicine and healthcare.” Kirlik believes that “behind many, if not most, human errors are faulty designs, whether they [are] bad computer interfaces or cultural workplace practices that compromise safety.”

Although many physicians have successfully incorporated a variety of high-tech tools into their practices, including handheld computers, electronic medical records, and automated billing and tracking systems, many other physicians are hesitant to embrace new technologies, skeptical of designers’ promises of a better tomorrow. This reluctance is understandable. During the last decade, there have been many technological developments that were initially touted as the first shots in the “healthcare revolution” but ultimately failed to live up to the hype. Voice recognition software systems and smart cards come to mind. In addition, computerized physician order entry (CPOE) technology, billed as the ultimate in patient safety-enhancing technology, has had some well-publicized ups and downs following implementation in several healthcare institutions.

Both Sides of the Coin

CPOE systems may play an integral role in improving healthcare safety. Unfortunately, studies have reached conflicting conclusions, demonstrating both benefits and drawbacks to this technology. In fact, some physicians believe that CPOE, not electronic medical records (EMRs), is the “potential troublemaker in health information technology.”

Dr. David Bates, Chief of the Division of General Medicine at Brigham and Women’s Hospital and Medical Director of Clinical and Quality Analysis for Partners Healthcare System in Boston, MA, told MDNG, “I think [CPOE] is one of the most important technologies for improving safety in healthcare today. That being said, it’s clear that it needs to be implemented carefully and implemented well for it to have the desired benefits. There’s evidence that if you implement it, it reduces the medical error rate by more than 80%, and it also can substantially reduce the cost of delivering care.” However, despite some initial encouraging results, there are still some documented problems associated with CPOE technology.

Discouraging findings from a study conducted at the Children’s Hospital of Pittsburgh in Pittsburgh, PA, were published in the December 2005 issue of Pediatrics. Researchers found that mortality rates increased from 2.80% to 6.57% within the first five months of CPOE implementation. They also identified usability and workflow problems, including “delays in obtaining needed medications because of how CPOE was designed to interact with other hospital systems and departments.” The Hospital of the University of Pennsylvania in Philadelphia had similar problems. Ross Koppel, PhD, a medical sociologist at the university, conducted a study evaluating CPOE and identified 22 persistent types of error associated with a CPOE system used there between 1997 and 2004.

Cedars-Sinai Medical Center in Los Angeles, CA, actually turned off its CPOE system. Providers were frustrated with the slow process of entering orders and the increased risk of orders getting lost in the system. “I’m not opposed to change…but it’s got to be new and better,” said Dudley Danoff, MD, a urologic surgeon who helped organize physician opposition. “This was new but certainly not better” than paper.

Nine Unintended Consequences of Computerized Physician Order Entry

1. More/new work for clinicians

2. Unfavorable workflow issues

3. Endless demands by the system

4. Unwillingness to give up paper

5. Changes in communication patterns and practices

6. Negative feelings toward the system and those responsible for it

7. Introduction of new errors

8. Unexpected changes in the power structure, such as the committee designing the computerized protocols making judgments about best practices

9. Overreliance on the technology

You Better Recognize!

When voice recognition technology was first introduced in 1994, there were high expectations that it would be the tool to help healthcare providers keep their patients’ records up to date. To establish the technology within the healthcare industry, the first voice recognition systems needed to be user-friendly and accurate. Instead of quickly and efficiently producing notes and charts with an easy-to-use system, however, available products frustrated physicians with their limited vocabularies and poor grasp of medical terminology. Additionally, voice recognition systems (even to this day) require doctors to learn how to “talk” to the computer rather than the computer learning to “listen” to the doctor, forcing physicians to adjust their manner of speaking in order for the system to process what’s been said. Considering the price tag of some early systems (the Voice Med system by Kurzweil cost a whopping $27,000 per workstation), it’s no wonder that physicians quickly began to question the value of a technology that ended up being more trouble than it was worth.

There are two types of voice recognition processes: discrete speech and continuous speech. Programs based on the discrete speech system require users to pause between words because they can only recognize one word at a time, increasing the amount of time a physician spends dictating. Continuous speech programs enable physicians to speak normally, without pauses, and have been hailed as the “single most important improvement in the technology as far as breaking down barriers to user acceptance in the healthcare industry.”

Even with this breakthrough, voice recognition still hasn’t lived up to the initial hype. When asked for his opinion regarding voice recognition software, Dr. Bates said, “Voice recognition is a technology that’s been around for some time, and it’s not widely used in clinical care largely because the technology has not been sufficiently robust to work in a noisy clinical environment. It’s continually getting better and it works very well if the environment is a quiet one. It’s just been a challenge to get it to work in environments that are noisy. We offer it [at Brigham and Women’s Hospital], but it’s not widely used by our clinicians.”

Voice recognition technology continues to improve, assisted by the development of noise-canceling microphones and other new hardware, but as of 2004, only about 6,000 healthcare institutions within the US had implemented this technology.

Let Your Voice Be Heard!

Do you use voice recognition software in your practice and/or hospital? Have you used one of these programs in the past but discarded it? E-mail tkunkler@mdng.com to share your experiences.

Smarter Than the Average Card

Although smart cards have been used all over Europe and parts of Asia with relative success, the US has yet to embrace this technology. With hospitals, private practices, and clinics nationwide slow to implement EHRs and personal health records (PHRs) due to cost concerns, comparatively affordable smart card technology would seem to be an attractive alternative. However, some experts believe that the development of other, more convenient technologies (eg, healthcare websites) has surpassed the capabilities smart cards would have provided. Dr. Bates told MDNG, “I think you can achieve the same benefits one might achieve with the smart card by making available the same sort of information [in a] secure location on the Web, [because it] is sufficiently ubiquitous today. It seems to me that using smart cards will not be that helpful, given where we are on the technology front.” In this issue’s cover story, “The Plot to Bring Down WebMD,” we discuss how patients can create and access personalized healthcare profiles through a secure website, using Revolution Health and WebMD as examples.

“People talked about [smart cards] at one point as being one solution to the problem of discontinuity in care. The idea was that everybody would carry their history around on [what are] basically memory cards, but that really didn’t work,” says MDNG editorial board member Dr. Steve Nace, assistant professor of clinical medicine and associate program director of the department of medicine at the University of Illinois College of Medicine, Peoria. “With the smart cards, it wasn’t that technology failed. It was a good concept, [but] other, more convenient ways of sharing information have surpassed it.”

A “Smart” Idea?

Do you think smart cards still have a role to play in healthcare? Or do you agree with the physicians we interviewed that other technologies rendered smart cards obsolete? E-mail tkunkler@mdng.com with your thoughts.

Made in Error

The healthcare industry deals with medical errors on a daily basis, whether they are human or technological in nature. Information technology errors can happen for a multitude of reasons: someone forgets to enter information into a system or does so incorrectly; leaves out a key step in a process or performs the steps in the wrong order; or just flat out uses a device incorrectly. The software developers, device manufacturers, and other tech companies that play key roles in modern healthcare are constantly working to enhance current designs and develop new technologies that can help reduce or eliminate errors. Part of that process includes designing medical devices that accommodate all healthcare professionals, no matter their ability level.

In order for technology to progress, it’s important that past mistakes are not repeated. When asked to share his opinions on why some technologies fail and how healthcare technology developers address those failures, John Gosbee, MD, MS, a human factors engineering and healthcare specialist with the University of Michigan Health System and Red Forest Consulting, told MDNG that developers tend to “turn to the same people that they used for expertise to develop the first design” of a product or system. They “look at a problem and then try to solve it the same way they tried to solve it the first time. I think that’s the limitation, and therefore the next version or multiple versions that come out are using the same kind of model and same way of thinking.”

End users, too, have a part to play in determining whether new technologies succeed or fail; physicians and other healthcare professionals who intend to incorporate these tools and resources into practice must perform a pre- and postimplementation evaluation to ensure that proper processes and support are in place.

The moral of the story? Human error is, well… human. Design flaws and inefficiencies, poor evaluation and planning, and user error can combine to turn once-promising technologies into time-wasting, frustrating failures. When this happens with a consumer gadget, you learn your lesson and move on (remember first-generation “universal” remotes?). There’s a lot more at stake when it comes to healthcare. Physicians can’t afford healthcare technology that creates more work for them; it’s important to get it right the first time. Technology is constantly advancing, with past failures providing valuable lessons that can lead to future successes. Surely, the best is yet to come.

Pharmacy Dollars Well Spent?

Pharmaceutical companies spend hundreds of millions of dollars each year on research and development in search of the one blockbuster combination that will lead to a big payoff. Of the hundreds of drugs in development, very few will receive FDA approval, no matter how promising the ideas were at the outset. Occasionally, even a medication that survives the research and review process and makes it to market will fail to have an impact. Below, you’ll find several examples of drugs that were at one time anticipated to be major players in the fields of oncology and cardiology but did not work as originally planned.

Torcetrapib

This treatment was developed by Pfizer to assist in raising HDL cholesterol when combined with Lipitor (atorvastatin)—which reduces LDL cholesterol—but was pulled from further clinical trials in December 2006 after researchers determined that “deaths in patients who received the combination were 60 percent higher when compared to those who were treated only with atorvastatin.”

NXY-059

This AstraZeneca Pharmaceuticals treatment for acute ischemic stroke was dropped after it “failed to show statistically significant reduction in stroke-related disability in a recent study.”

Telcyta (canfosfamide)

Telik, Inc. developed this drug to treat patients diagnosed with ovarian and lung cancers, only to see it fail three clinical trials. In one phase III trial, Telcyta did not improve the overall survival rate when compared to AstraZeneca’s lung cancer treatment, Iressa. Another trial tested the drug against ovarian cancer and found it “did not improve survival compared to the control therapies (liposomal doxorubicin or topotecan).”
