Steve Pearlstein has a good piece in today's Washington Post on the failure, so far, of the health care sector to jump on the information-technology bandwagon (resulting in much waste and worse: avoidable death and injury). His analysis of the problem seems right on the money:
So why has health care almost uniquely failed to invest in IT? First, the industry remains fragmented, with few entities big enough to make the necessary sizable upfront investment. Even in cases where hospitals or doctors' practices might be large enough, the economic incentives are pretty weak. In an industry in which service providers are still paid largely on the basis of how much they do, investing in systems that would help reduce the number of tests and procedures isn't the most obvious way to boost incomes.
The networked quality of the health care industry, with independent doctors, hospitals, labs and pharmacies all providing services to the same patient, also discourages IT investment. Any economic gains wouldn't be fully captured by the entity making the investment, but would be likely to leak out to other providers or the insurer. And because the big payoff from such investments comes only after lots of other enterprises install the same system and make it possible for information to be easily shared, there's little incentive to be first.
Finally, there are the doctors, who still pretty much control the health care system and, up to now, have resisted anything that threatens to increase their workload, change the way they practice or limit their medical discretion. It is no coincidence that some of the earliest successes have come at Veterans Affairs hospitals, where doctors are salaried employees.
All of this raises an obvious question: What can the government do, through Medicare conditions of participation and through changes in reimbursement, to encourage the transition to a safer and more efficient system?