By Kentaro Toyama
May 26, 2015

One of information technology’s great benefits, supposedly, is its ability to lower costs. Walmart, for example, is famous for its digital stock-keeping. Its databases know exactly what’s on the shelves, and they automatically inform suppliers which stores are low on stock. The system keeps inventories razor-thin and costs low. And it all seems to be about technology—databases, barcode readers, RFID-encoded pallets, and so on.

You might think, then, that some of our greatest cost-control challenges could be solved with IT. A conspicuous target in America is our health-care system. In fact, electronic medical records have firm bipartisan support even in an era of political deadlock. President Barack Obama has called for electronic medical records since before his days in the White House, citing efficiency and cost savings. And the GOP Doctors Caucus, formed by Representatives Phil Gingrey and Tim Murphy, states, “Health information technology has the potential to save more than $81 billion annually in health care costs. From drastically reducing medical errors to streamlining administration, health IT is the key to transforming our healthcare system.”

Unfortunately, whatever potential digital tools may have to lower costs, reality doesn’t always comply. And something that I call technology’s “Law of Amplification” explains why: Though Silicon Valley marketing claims that technologies have fixed effects—to make things faster, more efficient, or less expensive—their actual impact depends crucially on social forces that are already in place. Technology’s primary effect is to amplify, not necessarily to improve upon, underlying human inclinations.

In the American health-care system, very few people are really focused on reducing costs. As a result, every new technology is a white elephant—a “gift” you have to keep paying for. Many of us, sadly, are familiar with this state of affairs. A few years ago, I went to see a specialist in neuro-ophthalmology because I had lost partial vision in my right eye. After asking me some questions and peering into my pupil, the doctor said, “Well, there’s no clear problem, so it could be nerve damage. If it is, there’s not much we can do. But,” he said, smiling conspiratorially, “since you have good insurance, let’s do an MRI.” I agreed because I had no reason not to. I was lucky to have insurance with no co-pay. When I received the invoice for the visit, it showed $1,800 for the MRI. I was shocked, though grateful that my insurance covered it. The doctor’s office never called me for a follow-up, the MRI scan was never consulted, and my right-eye problems persist.

Unlike at Walmart, where digital tools amplify the company’s zealous pursuit of lower costs, in U.S. health care, technology intensifies all the ways in which spending is encouraged. Our hypochondria as patients, our foibles as doctors, our greed as suppliers, and our myopia as policymakers—all are social forces that the technology regrettably amplifies. Even the employers and governments that foot the bill cast their payments as benefits to employees and citizens. They don’t penny-pinch, for fear of appearing cavalier about people’s lives. On top of everything else, our metrics are off. As Princeton University economist Uwe Reinhardt noted, “Every dollar of health care spending is someone’s health care income.” That income flows right into our national gross domestic product, and we want the GDP to rise, don’t we?

Of course, technology also amplifies good health-care trends, and that’s terrific for those of us with good health insurance. But if lowering costs is the goal, more technology isn’t a surefire solution. In the four decades since 1970—a period during which digital technologies poured into hospitals and clinics—American health-care costs rose in real terms by a factor of five. The increase has been far greater than in other developed countries. Information technology was probably not the main cause, but it certainly didn’t turn the tide. (Nor did we get what we paid for: American life expectancy during that period increased by only eight years. That’s fewer than the nine years gained in the United Kingdom and the 11 gained in Japan, even though those countries spent a lot less.)

So lower costs aren’t a function of the technology itself. If anything, digital technologies require additional upkeep. For example, I used to work at Microsoft, which employed over 4,000 full-time people just to keep its own IT systems running. That’s nearly 5% of the company’s workforce. (Similar proportions hold for any large technology company.) If technology companies—which work hard to automate everything—have to spend 5% of their human resources managing IT, imagine how much more costly it is for other organizations. Especially in the context of U.S. health care, digital tools just amplify what is already an outrageous system of accounting. Recent exposés show that patients are routinely billed excessive prices: $24 for a niacin tablet that costs 5 cents at drugstores; $333 for a chest x-ray costing less than $30; $49,237 for a neurostimulator that wholesales at $19,000 and might cost only $4,500 to manufacture. In this climate, hospital administrators will be happy to install electronic medical records and pass on the costs to patients and taxpayers at a markup.

This lesson applies well beyond health care. Whether you’re considering a customer relationship management system to lower the cost of your sales operations, or a learning management system to streamline school administration, it’s worth thinking twice about whether your organization’s culture, incentives, and processes are already focused on cost control. If they’re not, more technology will add a line item to your budget that isn’t offset elsewhere.

Excerpted from GEEK HERESY: Rescuing Social Change from the Cult of Technology by Kentaro Toyama (on sale May 26, 2015). Reprinted with permission from PublicAffairs.
