Again and again you see the statement that information technology will not only prevent medication errors but also stop (or even reverse) the upward trend of healthcare costs.
How, I have often wondered, will entering data in computers have these magical effects? As a headline in the April issue of Health Affairs points out, “An opportunity will be missed if health IT simply automates a broken system.”
Since Health Affairs, the peer-reviewed journal of Project Hope, is apt to be both timely and authoritative, this point of view is not to be ignored. But perhaps another article in the same issue has an answer. It reports that at Kaiser Permanente, an HMO with a large enough patient base that its data can be considered trustworthy, switching to electronic health records has cut primary care visits by 25%. Or maybe patients are staying away because their doctors spend all their time looking at computer screens and not at them?
In another article in the same issue, the authors caution that after four years of experience, the effect on patient safety, while positive, is small, and they recommend that the investment in IT should be accompanied by an investment in the evidence base to permit definitive evaluation. But that hasn't stopped politicians from latching on to IT as a panacea.
For instance, President Obama has proposed that $20 billion be spent on health IT as part of his proposed economic stimulus package, while the Institute of Medicine has called for increased use of health IT to improve patient safety and to reduce cost. Writing in a previous issue of the same publication, other authors ask rhetorically, “Can electronic medical record systems transform healthcare?” and their answer seems to be an enthusiastic “yes.” It may yield savings, they claim, “in excess of $142 billion per year.” They do admit, though, that “there is limited evidence linking health IT to specific improvements in health outcomes at a national level.”
A skeptical reader is tempted to observe that the same is true for evidence of reversing, or even slowing, the rapidly rising cost of healthcare. A personal observation, based on watching my internist, who is part of a wholly computerized medical group, hunting and pecking at his keyboard: data entry takes him longer than it would take to update my chart the old-fashioned way. And if he makes a mistake, every other member of his multi-specialty group will see it in my records and presumably accept it as gospel, because computers don't lie, do they?
But some skeptics are beginning to speak up. For instance, an article in a recent issue of the New England Journal of Medicine suggests that adopting existing forms of health record software could prove to be an expensive policy mistake. “If the government's money goes to cement the current technology in place, we will have a very hard time innovating in healthcare reform,” says the author, Kenneth Mandl, MD. And as if to show that he is not alone in questioning the faddish consensus, the same issue reports that only nine percent of hospitals have adopted digital data entry.
Another article in Health Affairs concludes, “After a decade of experimentation, the path from health IT adoption to high-quality, high-value care remains largely uncharted.” To which a skeptical observer can only say “amen.”
Warren Ross is editor at large of MM&M