It’s been four years since the signing of the HITECH Act as part of the economic stimulus package, and the United States has surely fallen short of the goal set by the Institute of Medicine in 2001 of every provider using electronic health record systems (EHRs) by 2010. And while the process of digitizing health information is itself a formidable task, federal regulations mandate that providers do more than simply convert their old paper charts: They must develop an IT infrastructure that is “used in meaningful ways” in the delivery of care and in communication with patients.
Given the complex nature of medical information, as well as the vast quantities that providers must record and utilize, requiring information technology capabilities throughout the U.S. healthcare system makes a great deal of sense – in fact it’s critical to resolving some of the problems that have plagued America’s healthcare landscape.
However, the transition has been a slow one, and cost ranks first among the barriers to physician adoption of new EHR systems. In addition to the expensive, time-consuming task of implementing an EHR, hospital and physician office staff must also be trained to use the system properly, which can significantly disrupt workflow and efficiency.
Despite the arduous process of implementation, the federal government remains committed to updating U.S. healthcare’s electronic data infrastructure with $27 billion set aside to help subsidize the process. Given the inevitability of the transition, it seems relevant to ask: who benefits most from EHRs?
While research is still relatively new in this area, the initial results are encouraging: patients seem to be deriving the most benefit from EHR implementation in the form of improved quality of care and reduction of medical errors.
So if EHRs lead to better medical care, what’s the rate of adoption in the U.S.?
Adoption Increases, But Gaps Still Exist
Despite federal subsidies that offer as much as $44,000 annually per eligible provider from Medicare and $67,000 annually per eligible provider from Medicaid (based on their patient populations), as recently as February of this year, one study reported that only 1 in 6 doctors had qualified for federal incentives for the “meaningful use” of EHR systems. A more recent report released by the Office of the National Coordinator for Health Information Technology indicated that the adoption rate for EHRs at the physician level topped 50% in only a handful of states.
It’s undeniable that the U.S. trails other developed countries in EHR usage, but these numbers can be deceiving. Just because a physician or hospital hasn’t qualified for Meaningful Use does not mean they aren’t employing an electronic system to manage their patients’ health information; qualification for incentives depends on how that electronic infrastructure is used. Meaningful Use requires providers to apply technology in multi-functional ways that actually augment how care is delivered, rather than simply changing the storage method.
So you may be wondering…
What Kinds of Functions Qualify as Meaningful?
- The generation of patient information, like a list of past medical procedures
- The creation of patient registry information, for example, a list of patients who could benefit from or are due for preventive care
- The online prescribing of medications
- Clinical decision support (CDS), which entails alerts about possible adverse drug interactions and information regarding current best practices for treating a particular ailment
- Capturing specific performance related data and reporting said data to federal regulators
With this clarification in mind, it’s easier to separate the providers that rely on EHRs simply for data storage from those striving to realize the true potential of the available technology. By Kaiser Permanente’s estimates, 69% of primary care doctors were using EHRs in 2012, up from 46% in 2009. However, only 27% of those same primary care physicians met Meaningful Use criteria.
Here’s how those numbers compare to some other countries around the globe:
So where does the U.S. sit? Somewhere in the middle, it appears. While our use of multifunctional electronic health systems exceeds that of some developed countries, it is markedly behind others.
Now that we know how the U.S. is progressing, let’s examine why patients should care about the movement to electronic records.
EHRs Improve Quality of Care
The slow adoption of EHR systems in the United States can be attributed to numerous causes, most of which have to do with the financial burden placed on doctors and hospitals (implementing an EHR at a large facility can take several years and cost millions of dollars), the unwillingness of older doctors to change their habits, and the major disruption such a widespread change can have on workflow.
Basically, healthcare providers aren’t necessarily the ones benefiting the most from EHRs. Of course they realize it’s necessary to modernize the lagging U.S. health system, but it’s also a tremendous amount of trouble to pull off. That’s precisely why data on the effects of meaningful EHR implementation and usage are so interesting.
Once EHRs are implemented and used in meaningful ways, the quality of care delivered to patients significantly improves across a variety of areas. For example, EHR implementation in a hospital makes the communication of vital patient information much easier. A primary care doctor who refers a patient to a specialist within the same hospital system can easily share that patient’s medical history with the specialist, reducing the chances of treatment that would result in an adverse event.
Even this simple sharing of patient information between providers in the same hospital system can have dramatic effects.
In a study conducted by the Commonwealth Fund, the Sentara health system utilized its EHR to better coordinate its care processes, which made patient discharges more predictable. This improved “throughput,” or patient flow, resulted in a 90-minute reduction in the time it took to assign a newly admitted patient to a bed – and an 80% reduction in the bed assignment time for ER patients.
Sentara also found that EHR usage for prescribing drugs and ordering tests resulted in an 11% decrease in medical errors – errors often attributable to poor penmanship on written orders, the loss of medical information as records change hands, or unchecked human error.
While the Commonwealth Fund’s study focused on what it termed “leading hospitals,” or hospitals that had already committed to advancing their IT capabilities before the HITECH Act, the improvement in quality of care can be found in other facilities too.
Studies have found that implementing an EHR with a well-developed clinical decision support system can reduce medical errors by as much as 83%. Furthermore, the occurrence of redundant laboratory tests – which are costly to patients and can increase the likelihood of false positives – dropped by as much as 24% in a hospital setting.
EHRs help physicians stay on course with best practices as well. CDS applications utilize pop-ups that alert the physician to a deviation from standard practice guidelines. Another Kaiser Permanente study revealed that emergency room visits decreased by 5.5% and hospitalizations by 5.2% for diabetic patients annually after the implementation of Kaiser’s EHR system. Physicians analyzed the data gathered by Kaiser’s system to spot gaps in best practices and uncover new beneficial treatment patterns for their diabetic population.
The implementation and meaningful use of EHRs is no simple feat for healthcare providers. When done improperly or with a poor system, implementing an EHR can actually increase the number of medical errors. And even with federal subsidies alleviating some of the associated costs, providers still expect to pay a great deal of money when purchasing and implementing an EHR system – an expenditure that an EHR will not help them recoup in the short term.
Though imperfect, EHRs have had an overall beneficial effect on the quality of care patients receive. And while evidence suggests that they don’t necessarily lower the cost of care, if these systems improve the health of patients, then the huge push to make their usage mandatory is justified.