In the information era, the best of times are the worst of times. Computer hardware keeps getting faster, cheaper, and more portable; new technologies such as mashups, blogs, wikis, and business analytics systems have captured the imagination; and corporate IT spending has bounced back from the plunge it took in 2001. In 1987, U.S. corporations’ investment in IT per employee averaged $1,500. By 2004, the latest year for which government data are available, that amount had more than tripled to $5,100 per employee. In fact, American companies spend as much on IT each year as they do on offices, warehouses, and factories put together.
However, as IT’s drumbeat grows louder, it threatens to overwhelm general managers. One of the biggest problems companies face is coping with the abundance of technologies in the marketplace. It’s hard for executives to figure out what all those systems, applications, and acronyms do, let alone decide which ones they should purchase and how to adopt them successfully. Most managers feel ill-equipped to navigate the constantly changing technology landscape and thus involve themselves less and less with IT.
Adding to executives’ diffidence, corporate IT projects have often delivered underwhelming results or been outright failures. Catastrophes—such as the one at American pharmaceutical distributor FoxMeyer Drug, which went into Chapter 11 and was sold in 1997 when a $100 million IT project failed—may be less frequent today than in the past, but frustration, delay, and disappointment are all too common. In 2005, when IT consultancy CSC and the Financial Executives Research Foundation conducted a survey of 782 American executives responsible for IT, 50% of the respondents admitted that “aligning business and IT strategy” was a major problem. The researchers found that 51% of large-scale IT efforts finished later than expected and ran over budget. Only 10% of companies believed they were getting high returns from IT investments; 47% felt that returns were low, negative, or unknown.
Not surprisingly, any fresh IT proposal sparks fiery debates in boardrooms. Some boards say, “Why should we bother? IT isn’t strategic, so it doesn’t matter in a competitive sense. We should be minimizing our technology expenditures.” Others argue, “Whether IT matters or not, we shouldn’t be doing it ourselves. Companies are becoming virtual, and software is becoming rentable, so why do IT the old-fashioned way?” Thus, executives try to delegate, outsource, rent, rationalize, minimize, and generally remove IT from their already long list of concerns.