New tech is usually adopted not for direct productivity gains, but to signal that one is in fashion, that one is technically capable, etc. From a Post op-ed Tuesday:
President Obama's proposed health-care reforms include investing $50 billion over five years to promote health information technology. Most notably, paper medical records would be replaced with linked electronic records to try to improve quality of care and lower medical costs. The recently enacted stimulus package included $20 billion for health IT. …
Yet while this sort of reform has popular support, there is little evidence that currently available computerized systems will improve care. … Large, randomized controlled studies — the "gold standard" of evidence — in this country and Britain have found that electronic records with computerized decision support did not result in a single improvement in any measure of quality of care for patients with chronic conditions including heart disease and asthma. …
They do little to prompt greater and more widespread use of health-care practices that are known to be effective. Health IT has not been proven to save money. Moreover, personal financial ties have been found between some researchers and the companies that produce these systems, and as far back as 2005, studies have shown that health IT developers are about three times more likely to report "success" than evaluators who had no part in system development.
What's more, evidence suggests that adoption of some computerized systems has not helped but harmed patients. After the Children's Hospital of Pittsburgh added automated prescribing recommendations to a commercial electronic records system, the institution documented a more than threefold increase in the death rate among child patients. Another leading system contributed to more than 20 different types of medical errors.
I emailed one of the authors for cites, and they check out; this is solid. Economists are familiar with this scenario:
In 1987 Robert Solow, a Nobel Prize-winning economist, famously said: “You can see the computer age everywhere but in the productivity statistics.” It was only in 2003 that The Economist felt comfortable boldly proclaiming: “The 'productivity paradox' has been solved.”
As Dr Solow observed, most countries saw productivity growth slow in the 1980s and early 1990s, just as computers were becoming widely used. Techies grumbled, economists sharpened their pencils — and businessmen ignored the argument and went on buying the kit. But the conclusion was clear: new technologies on their own do not raise productivity. Rather, companies and individuals must figure out how to make best use of them in order to reap their rewards.
There are important lessons in this for the ongoing debate about broadband. … The OECD released its latest report on May 19th. It surveys the broadband landscape to December 2007, and tells a warm tale. … But the excellent report … examines why broadband is actually useful. And here the authors face a problem: there simply is not good data to show that broadband matters. … The most innovative country in the OECD as measured by number of start-up internet companies that rise to global prominence is America, which has a mediocre standing on the OECD broadband rankings. …
Paul David, an economist at Oxford University, has shown that electric power, introduced in the 1880s, did not immediately raise productivity. Not until the late 1920s—when around half of America's industrial machinery was finally powered by electricity—did efficiency climb. … Although policymakers and the public might feel that super-fast broadband is essential, that view is based more on faith than fact.
Of course one could argue that adopting tech early for signaling reasons benefits everyone else by helping the tech to evolve. Could be, but sounds an awful lot like wishful thinking.
Unsupervised: The failure modes you discuss have an easy workaround: print out the records. Any reasonably well-designed electronic system can easily fall back on non-electronic methods. At that point you gain no advantage from the electronic system, but over time such fallbacks would be limited to special cases. As the sophistication of EMR systems grows, compatibility between different hospitals will become less of a problem, and individual EMR products will gain an increased ability to translate between different document formats. Check out a piece of modern word-processing software: the format reading and writing options are staggeringly numerous, because compatibility is an important part of a word processor's job.
There are two stages to the adoption of any new technology: the assisting phase and the integrating phase.
In the first phase, the new technology is used to improve existing designs and processes. For example, the Romans used iron instead of bronze for swords and spearheads, but the basic conduct of war remained the same. In the digital age, early improvements came from things like not having to send a clerk down to the file room to fetch records; they could be pulled up nearly instantaneously on a terminal.
In the second phase, new processes and designs are created that could not have been achieved with the old technology. For instance, iron swords became longer, narrower, and lighter for their size, making them easier to use on horseback, and thus the medieval knight was born. In the case of electricity, it wasn't until new tools were developed that took advantage of the ability to place the power source within the tool itself (replacing the system of belts and pulleys driven by a steam engine or water wheel) that the productivity gains were realized.
With computers and information technology, medicine is just now beginning to move into the second phase. To truly achieve the gains from EMRs and other new technologies, the very process of medicine is going to have to change. What form that will take, I don't know, and anybody who says they do is either a fool or about to get very rich (possibly both).