I can feel it, too. Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think.
I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages.
I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle. I think I know what’s going on. For more than a decade now, I’ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. The Web has been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes.
A few Google searches, some quick clicks on hyperlinks, and I’ve got the telltale fact or pithy quote I was after. Even when I’m not working, I’m as likely as not to be foraging in the Web’s info-thickets: reading and writing e-mails, scanning headlines and blog posts, watching videos and listening to podcasts, or just tripping from link to link to link. (Unlike footnotes, to which they’re sometimes likened, hyperlinks don’t merely point to related works; they propel you toward them.)
For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they’ve been widely described and duly applauded. “The perfect recall of silicon memory,” Wired’s Clive Thompson has written, “can be an enormous boon to thinking.” But that boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information.
They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.
I’m not the only one. When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing. Some of the bloggers I follow have also begun mentioning the phenomenon. Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. “I was a lit major in college, and used to be a voracious book reader,” he wrote. “What happened?” He speculates on the answer: “What if I do all my reading on the web not so much because the way I read has changed, i.e. I’m just seeking convenience, but because the way I THINK has changed?” Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the Internet has altered his mental habits.
“I now have almost totally lost the ability to read and absorb a longish article on the web or in print,” he wrote earlier this year. A pathologist who has long been on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a “staccato” quality, reflecting the way he quickly scans short passages of text from many sources online.
“I can’t read anymore,” he admitted. “I’ve lost the ability to do that. Even a blog post of more than three or four paragraphs is too much to absorb.” Anecdotes alone don’t prove much. And we still await the long-term neurological and psychological experiments that will provide a definitive picture of how Internet use affects cognition. But a recently published study of online research habits, conducted by scholars from University College London, suggests that we may well be in the midst of a sea change in the way we read and think. As part of the five-year research program, the scholars examined computer logs documenting the behavior of visitors to two popular research sites, one operated by the British Library and one by a U.K.
educational consortium, that provide access to journal articles, e-books, and other sources of written information. They found that people using the sites exhibited “a form of skimming activity,” hopping from one source to another and rarely returning to any source they’d already visited. They typically read no more than one or two pages of an article or book before they would “bounce” out to another site. Sometimes they’d save a long article, but there’s no evidence that they ever went back and actually read it.
The authors of the study report: It is clear that users are not reading online in the traditional sense; indeed there are signs that new forms of “reading” are emerging as users “power browse” horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.
Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. But it’s a different kind of reading, and behind it lies a different kind of thinking—perhaps even a new sense of the self. “We are not only what we read,” says Maryanne Wolf, a developmental psychologist at Tufts University and the author of Proust and the Squid: The Story and Science of the Reading Brain.
“We are how we read.” Wolf worries that the style of reading promoted by the Net, a style that puts “efficiency” and “immediacy” above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace. When we read online, she says, we tend to become “mere decoders of information.” Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged. Reading, explains Wolf, is not an instinctive skill for human beings. It’s not etched into our genes the way speech is. We have to teach our minds how to translate the symbolic characters we see into the language we understand. And the media or other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains.
Experiments demonstrate that readers of ideograms, such as the Chinese, develop a mental circuitry for reading that is very different from the circuitry found in those of us whose written language employs an alphabet. The variations extend across many regions of the brain, including those that govern such essential cognitive functions as memory and the interpretation of visual and auditory stimuli.
We can expect as well that the circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works. Sometime in 1882, Friedrich Nietzsche bought a typewriter—a Malling-Hansen Writing Ball, to be precise. His vision was failing, and keeping his eyes focused on a page had become exhausting and painful, often bringing on crushing headaches. He had been forced to curtail his writing, and he feared that he would soon have to give it up. The typewriter rescued him, at least for a time. Once he had mastered touch-typing, he was able to write with his eyes closed, using only the tips of his fingers. Words could once again flow from his mind to the page.
But the machine had a subtler effect on his work. One of Nietzsche’s friends, a composer, noticed a change in the style of his writing. His already terse prose had become even tighter, more telegraphic. “Perhaps you will through this instrument even take to a new idiom,” the friend wrote in a letter, noting that, in his own work, his “‘thoughts’ in music and language often depend on the quality of pen and paper.” “You are right,” Nietzsche replied, “our writing equipment takes part in the forming of our thoughts.” Under the sway of the machine, writes the German media scholar Friedrich A. Kittler, Nietzsche’s prose “changed from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style.” The human brain is almost infinitely malleable.
People used to think that our mental meshwork, the dense connections formed among the 100 billion or so neurons inside our skulls, was largely fixed by the time we reached adulthood. But brain researchers have discovered that that’s not the case. James Olds, a professor of neuroscience who directs the Krasnow Institute for Advanced Study at George Mason University, says that even the adult mind “is very plastic.” Nerve cells routinely break old connections and form new ones. “The brain,” according to Olds, “has the ability to reprogram itself on the fly, altering the way it functions.” As we use what the sociologist Daniel Bell has called our “intellectual technologies”—the tools that extend our mental rather than our physical capacities—we inevitably begin to take on the qualities of those technologies. The mechanical clock, which came into common use in the 14th century, provides a compelling example.
In Technics and Civilization, the historian and cultural critic Lewis Mumford described how the clock “disassociated time from human events and helped create the belief in an independent world of mathematically measurable sequences.” The “abstract framework of divided time” became “the point of reference for both action and thought.” The clock’s methodical ticking helped bring into being the scientific mind and the scientific man. But it also took something away. As the late MIT computer scientist Joseph Weizenbaum observed in his 1976 book, Computer Power and Human Reason: From Judgment to Calculation, the conception of the world that emerged from the widespread use of timekeeping instruments “remains an impoverished version of the older one, for it rests on a rejection of those direct experiences that formed the basis for, and indeed constituted, the old reality.” In deciding when to eat, to work, to sleep, to rise, we stopped listening to our senses and started obeying the clock. The process of adapting to new intellectual technologies is reflected in the changing metaphors we use to explain ourselves to ourselves. When the mechanical clock arrived, people began thinking of their brains as operating “like clockwork.” Today, in the age of software, we have come to think of them as operating “like computers.” But the changes, neuroscience tells us, go much deeper than metaphor.
Thanks to our brain’s plasticity, the adaptation occurs also at a biological level. The Internet promises to have particularly far-reaching effects on cognition. In a paper published in 1936, the British mathematician Alan Turing proved that a digital computer, which at the time existed only as a theoretical machine, could be programmed to perform the function of any other information-processing device. And that’s what we’re seeing today.
The Internet, an immeasurably powerful computing system, is subsuming most of our other intellectual technologies. It’s becoming our map and our clock, our printing press and our typewriter, our calculator and our telephone, and our radio and TV.
When the Net absorbs a medium, that medium is re-created in the Net’s image. It injects the medium’s content with hyperlinks, blinking ads, and other digital gewgaws, and it surrounds the content with the content of all the other media it has absorbed. A new e-mail message, for instance, may announce its arrival as we’re glancing over the latest headlines at a newspaper’s site. The result is to scatter our attention and diffuse our concentration. The Net’s influence doesn’t end at the edges of a computer screen, either.
As people’s minds become attuned to the crazy quilt of Internet media, traditional media have to adapt to the audience’s new expectations. Television programs add text crawls and pop-up ads, and magazines and newspapers shorten their articles, introduce capsule summaries, and crowd their pages with easy-to-browse info-snippets. When, in March of this year, The New York Times decided to devote the second and third pages of every edition to article abstracts, its design director, Tom Bodkin, explained that the “shortcuts” would give harried readers a quick “taste” of the day’s news, sparing them the “less efficient” method of actually turning the pages and reading the articles. Old media have little choice but to play by the new-media rules. Never has a communications system played so many roles in our lives—or exerted such broad influence over our thoughts—as the Internet does today.
Yet, for all that’s been written about the Net, there’s been little consideration of how, exactly, it’s reprogramming us. The Net’s intellectual ethic remains obscure.
About the same time that Nietzsche started using his typewriter, an earnest young man named Frederick Winslow Taylor carried a stopwatch into the Midvale Steel plant in Philadelphia and began a historic series of experiments aimed at improving the efficiency of the plant’s machinists. With the approval of Midvale’s owners, he recruited a group of factory hands, set them to work on various metalworking machines, and recorded and timed their every movement as well as the operations of the machines. By breaking down every job into a sequence of small, discrete steps and then testing different ways of performing each one, Taylor created a set of precise instructions—an “algorithm,” we might say today—for how each worker should work. Midvale’s employees grumbled about the strict new regime, claiming that it turned them into little more than automatons, but the factory’s productivity soared. More than a hundred years after the invention of the steam engine, the Industrial Revolution had at last found its philosophy and its philosopher. Taylor’s tight industrial choreography—his “system,” as he liked to call it—was embraced by manufacturers throughout the country and, in time, around the world.
Seeking maximum speed, maximum efficiency, and maximum output, factory owners used time-and-motion studies to organize their work and configure the jobs of their workers. The goal, as Taylor defined it in his celebrated 1911 treatise, The Principles of Scientific Management, was to identify and adopt, for every job, the “one best method” of work and thereby to effect “the gradual substitution of science for rule of thumb throughout the mechanic arts.” Once his system was applied to all acts of manual labor, Taylor assured his followers, it would bring about a restructuring not only of industry but of society, creating a utopia of perfect efficiency. “In the past the man has been first,” he declared; “in the future the system must be first.”
Taylor’s system is still very much with us; it remains the ethic of industrial manufacturing. And now, thanks to the growing power that computer engineers and software coders wield over our intellectual lives, Taylor’s ethic is beginning to govern the realm of the mind as well. The Internet is a machine designed for the efficient and automated collection, transmission, and manipulation of information, and its legions of programmers are intent on finding the “one best method”—the perfect algorithm—to carry out every mental movement of what we’ve come to describe as “knowledge work.” Google’s headquarters, in Mountain View, California—the Googleplex—is the Internet’s high church, and the religion practiced inside its walls is Taylorism. Google, says its chief executive, Eric Schmidt, is “a company that’s founded around the science of measurement,” and it is striving to “systematize everything” it does. Drawing on the terabytes of behavioral data it collects through its search engine and other sites, it carries out thousands of experiments a day, according to the Harvard Business Review, and it uses the results to refine the algorithms that increasingly control how people find information and extract meaning from it. What Taylor did for the work of the hand, Google is doing for the work of the mind. The company has declared that its mission is “to organize the world’s information and make it universally accessible and useful.” It seeks to develop “the perfect search engine,” which it defines as something that “understands exactly what you mean and gives you back exactly what you want.” In Google’s view, information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency.
The more pieces of information we can “access” and the faster we can extract their gist, the more productive we become as thinkers. Where does it end? Sergey Brin and Larry Page, the gifted young men who founded Google while pursuing doctoral degrees in computer science at Stanford, speak frequently of their desire to turn their search engine into an artificial intelligence, a HAL-like machine that might be connected directly to our brains. “The ultimate search engine is something as smart as people—or smarter,” Page said in a speech a few years back.
“For us, working on search is a way to work on artificial intelligence.” In a 2004 interview with Newsweek, Brin said, “Certainly if you had all the world’s information directly attached to your brain, or an artificial brain that was smarter than your brain, you’d be better off.” Last year, Page told a convention of scientists that Google is “really trying to build artificial intelligence and to do it on a large scale.” Such an ambition is a natural one, even an admirable one, for a pair of math whizzes with vast quantities of cash at their disposal and a small army of computer scientists in their employ. A fundamentally scientific enterprise, Google is motivated by a desire to use technology, in Eric Schmidt’s words, “to solve problems that have never been solved before,” and artificial intelligence is the hardest problem out there. Why wouldn’t Brin and Page want to be the ones to crack it? Still, their easy assumption that we’d all “be better off” if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling. It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google’s world, the world we enter when we go online, there’s little place for the fuzziness of contemplation.
Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive. The idea that our minds should operate as high-speed data-processing machines is not only built into the workings of the Internet, it is the network’s reigning business model as well.
The faster we surf across the Web—the more links we click and pages we view—the more opportunities Google and other companies gain to collect information about us and to feed us advertisements. Most of the proprietors of the commercial Internet have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link—the more crumbs, the better. The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It’s in their economic interest to drive us to distraction. Maybe I’m just a worrywart.
Just as there’s a tendency to glorify technological progress, there’s a countertendency to expect the worst of every new tool or machine. In Plato’s Phaedrus, Socrates bemoaned the development of writing.
A year ago, Harvard Business Review published a now infamous article called “IT Doesn’t Matter.” Its author, the magazine’s then executive editor Nicholas G. Carr, argued that information technology no longer gives businesses a competitive edge. Carr called information technology managers impatient, wasteful, passive, and lured by the chorus of hype about the so-called strategic value of IT. Harvard Business Review has 243,000 extremely influential readers. So if it publishes an article saying that information technology doesn’t matter, then an awful lot of important business leaders are going to believe it. And if they do, they’ll run their companies, and our economy, into a ditch.
Since I do not subscribe to the ink-on-dead-trees version of the magazine, I bought my copy of Carr’s May 2003 paper through Amazon.com. It was delivered over the Internet in minutes as a PDF file for $7.00. Carr’s new book is also listed on Amazon.com, a triumph of IT-enabled corporate strategy.
We see that IT apparently matters to Harvard. Carr himself has a website, nicholasgcarr.com. IT apparently matters to Carr. Let’s face it: IT matters to everyone.
Two Trillion Reasons that I.T. Matters
I asked Frank Gens, senior vice president for the information technology market research giant IDC, how much IT matters. (Full disclosure: IDC is owned by IDG, on whose board I serve.) IDC reports that the global investment in information technology (including telecommunications) totaled $1.9 trillion in 2003 and, despite Carr, will climb to $2.0 trillion in 2004.
According to a 2003 IDC survey, non-IT business executives spend 20 percent of their time thinking about IT. Are they wasting their time? Again despite Carr, almost 60 percent say that the strategic importance of IT is increasing; only 2 percent say the importance is decreasing.
Carr may claim these Harvard-MBA-type executives are foolish or misguided, but 55 percent feel that their companies should use information technology more aggressively; 43 percent feel their usage is just right; and only 2 percent feel that they should be less aggressive. In Carr’s world, information technology managers are apparently fools, or even frauds, to the tune of $2 trillion per year. Presumably, these managers slavishly upgrade to whatever new thing vendors want to sell. But in the real world, millions of people already work hard to spend their IT budgets wisely. The computer-trade press has been covering this complicated process for almost 40 years. In warding off his debunkers, Carr has offered some clarifications of his argument.
He doesn’t really mean that information technology doesn’t matter; rather, he says, his point is that because IT has been commoditized, like electricity, it confers upon its business users no competitive advantage. He also clarifies that he does not mean that information itself doesn’t matter, nor does he mean that the people using the technology don’t matter. What really doesn’t matter, he says, is the no-longer-proprietary technology infrastructure for storing, processing, and transmitting information. So we can only hope that most of Harvard Business Review’s captains of industry read beyond the article titles before dropping the magazine on their coffee tables.
Carr concludes that since information technology no longer provides a competitive advantage to businesses, they should stop spending wildly on advanced information technology products and services. He admonishes managers to stop being suckers for the latest cool products from Cisco, Intel, Microsoft, Oracle, et al. IT managers should stop squandering corporate assets and begin acting in the best interests of their shareholders.
They should become boring minimizers of IT cost and risk. As evidence, Carr points out that my 30-year-old baby, Ethernet, has been standardized and commoditized. It’s true that last year more than 184 million new Ethernet ports were shipped, at a value of $12.5 billion, and that anyone can buy them. Most of those ports are the current mainstream version of Ethernet, which carries data over wires on local-area networks at 10 or 100 megabits per second. But now that the post-Internet-bubble nuclear winter is almost over, Ethernet is speeding up, to beyond 1,000 megabits (one gigabit) per second. Ethernet is going into wide-area networks. It’s going wireless.
It’s going into embedded systems: the eight billion microprocessors shipped every year that don’t go into PCs. New Ethernet standards are being created, new commoditization races are being started, and Ethernet, if ever it wasn’t, is once again a tool of corporate strategy. In the article and now again in his book, Carr wrongly equates today’s information technologies with electricity, and then he wrongly characterizes electricity as static. In short, Carr, deep into a post-bubble depression, wrongly declares the end of history. The history of electricity is not over, however.
Controlling electrical power grids is still famously problematic, and that’s to say nothing of the exciting developments in technologies such as wind, solar, fission, fusion, hydrogen, and batteries, all of which present strategic opportunities. And information technology is bigger and more recent than electricity. Both are still rapidly evolving; both are very much alive as important elements of corporate strategy. Much of the research on information technology usage that Carr cites is of dubious validity. Take, for example, the studies that, as Carr puts it, “consistently show” that expenditure on IT as a fraction of company revenue is inversely correlated with financial performance.
One study that Carr cites states that the 25 companies with the highest economic returns spent on average just 0.8 percent of revenues on IT, while the typical company spent 3.7 percent. But this hardly proves Carr’s conclusion. Rather, it indicates that companies investing wisely in IT increase revenues much faster than those that invest unwisely, too little, or not at all. Companies that invest poorly in IT don’t increase revenues as quickly, so their IT expenditures are higher as a fraction of revenue. Companies that invest unwisely in IT go out of business and are not counted in the studies.
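The objection here is essentially about which number sits in the denominator. The following is a minimal arithmetic sketch, with invented figures chosen only so the percentages land on the study’s 0.8 and 3.7 percent; it illustrates the denominator effect and is not data from the study Carr cites.

# Hypothetical illustration: two firms spend the same absolute amount on IT,
# but the one whose revenue has grown reports a far smaller IT-to-revenue ratio.
# All numbers below are invented for the example.
companies = {
    "fast-growing firm": {"it_spend_m": 40, "revenue_m": 5000},
    "stagnant firm": {"it_spend_m": 40, "revenue_m": 1080},
}

for name, c in companies.items():
    pct = 100 * c["it_spend_m"] / c["revenue_m"]
    print(f"{name}: {pct:.1f}% of revenue spent on IT")

# Prints roughly:
#   fast-growing firm: 0.8% of revenue spent on IT
#   stagnant firm: 3.7% of revenue spent on IT
# Identical IT budgets, very different ratios: a low IT-to-revenue percentage
# among top performers says nothing, by itself, about whether the spending paid off.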
IT still matters.
Raining On The I.T.-Bashers' Parade
Carr is not the first person to question the value of information technology. Paul Strassman, for example, despite being a high-profile, big-budget chief information officer for such organizations as NASA, the U.S. Department of Defense, and Xerox, has made a second career of studies that fail to find the benefits of IT.
Morgan Stanley economist Stephen Roach is another famous critic of IT. During the 1990s, he claimed that increasing investments in information technology were showing no benefits. Roach, echoing MIT economist Robert Solow, wrote that IT investments were not appearing in U.S. productivity numbers. I called Solow, a Nobel Prize winner, and he admitted that this so-called productivity paradox might easily be explained by how poorly productivity is measured. Productivity numbers are hard to come by, and Roach relied on outmoded methods. But Roach stuck by his IT-doesn’t-matter numbers, like the proverbial drunk looking for his wallet under a street lamp.
Today, information technology accounts for about half of capital expenditures by U.S. businesses. Productivity is high and increasing rapidly. What is Roach saying now? He says that the productivity numbers are highly questionable. In other words, if the data conflict with your theory, throw out the data. It makes me wonder whether Roach, like Carr, just has a bad attitude about IT. In Carr’s reply to early critics, published on the Web by the Harvard Business Review in June 2003, he wrote that his article “has at least succeeded in setting off an important and long-overdue debate about the role of information technology in business.” I don’t think so.
If anything, Carr has succeeded only in misleading his readers. Howard Smith and Peter Fingar, in their 2003 book IT Doesn’t Matter-Business Processes Do, argue that Carr is not only wrong but dangerous.
They remind us of what happened when Harvard Business Review published Michael Hammer’s 1990 article “Reengineering Work.” Too many Harvard MBAs decided to take the easy part of Hammer’s advice and downsized their companies to death. Unless Carr’s argument is debunked, the current crop of reigning MBAs will be tempted to run WordPerfect on mid-1980s PCs connected to IBM 360 mainframes. Which brings us to Carr’s central conceit.
He urges IT managers not to venture foolishly out onto technology’s cutting edge and to buy only that which has low risk and high value to their companies. Carr urges this as if it were breaking news.
In fact, IDG alone publishes 300 information technology magazines worldwide, and each has several competitors. All of these have been offering advice for decades on just how far onto the bleeding edge of technology it is wise to go to give your company an edge. Taking technology risks, when done well, can bring competitive advantage. When done poorly, it can bring disaster.
But that’s a balancing act that the information technology managers of the world were well aware of long before Carr put in his two cents. We often brag about the marvelous U.S. innovation machine. We brag about our world-leading research universities. We brag about our entrepreneurs and the venture capitalists, like me, who back them. But there is an unsung player in our marvelous innovation machine: the aggressive users of information technology. In Germany, by contrast, it’s hard to buy IT unless it’s from Siemens.
In the United States, startups readily find managers out on the cutting edge, searching for new, smarter, and more efficient ways to do things-a quest that keeps our vaunted innovation machine humming. If business executives follow Carr’s advice, who will provide innovation’s test beds? How will new technologies find their markets? This may be the most important reason to debunk Carr’s arguments once and for all: if they harden into conventional business wisdom, American ingenuity will be strangled in its bassinet. I serve on the board of a small public company in Silicon Valley called Avistar.
For 10 years, Avistar has been marketing networked desktop videoconferencing to large companies. Avistar’s hardware and software have worked increasingly well for a long time.
What’s taking time is their adoption: the search for one situation after another in which the technologies provide a value that’s worth the risk. Avistar CEO Jerry Burnett disagrees strongly with Carr and recommends a division of labor in IT management. On one hand are specialists in what Burnett calls “availability management.” These might be mistaken for the cost and risk minimizers that Carr extols. On the other hand are specialists in “adoption management.” These are the people Carr wants demotivated, demoted, or fired. Carr argues that things that are widely available, like IT, cannot be used for sustained competitive advantage.
Well, since Harvard Business Review is received by almost a quarter-million people and can be bought by anyone with $16.95, then according to Carr’s own argument, that publication itself doesn’t matter. Cancel your subscription and download any interesting articles from back issues, which any teenager will be able to find for you on the Internet for free.
Over the last decade, and even since the bursting of the technology bubble, pundits, consultants, and thought leaders have argued that information technology provides the edge necessary for business success. IT expert Nicholas G. Carr offers a radically different view in this eloquent and explosive book. As IT's power and presence have grown, he argues, its strategic relevance has actually decreased.
IT has been transformed from a source of advantage into a commoditized 'cost of doing business', with huge implications for business management. Expanding on Carr's seminal Harvard Business Review article that generated a storm of controversy, Does IT Matter? provides a truly compelling, and unsettling, account of IT's changing business role and its leveling influence on competition. Through astute analysis of historical and contemporary examples, Carr shows that the evolution of IT closely parallels that of earlier technologies such as railroads and electric power. He goes on to lay out a new agenda for IT management, stressing cost control and risk management over innovation and investment. And he examines the broader implications for business strategy and organization as well as for the technology industry. A frame-changing statement on one of the most important business phenomena of our time, Does IT Matter? marks a crucial milepost in the debate about IT's future. An acclaimed business writer and thinker, Nicholas G. Carr is a former executive editor of the Harvard Business Review.
A hundred years ago, many large companies created the 'Vice President of Electricity' to strategically deal with the opportunity that this energy source was providing to the corporations. Some years after, electricity became a commodity and it naturally fell off the management agenda.
The book from Nicholas G. Carr, 'Does IT Matter?', deals with examples like this to envision whether IT will have a similar fate in our companies. Carr's book extends the point of view that he originally presented in a well-known article from Harvard Business Review in 2003. Entitled “IT Doesn't Matter”, it started a very polemic debate among IT professionals and users of information technology that endures to date. Easy to read, it puts the right questions through a proper background. A smart starting point for a fruitful discussion in the IT departments of today's companies on the mission and value proposition of this area.
I originally was assigned this book as the 'textbook' for one of my college classes. I was happy with the end result. Carr uses this book to expand on his original article about how information technology (IT) is losing its usefulness as an advantage in business because of its (brace for Carr's favorite word) ubiquity. It's a good read, and although some of the references are slightly dated, the book's main thesis is still a valid point in today's business world. While IT is still important for businesses to embrace, its role as a source of competitive advantage has been diminished.
The main premise of the book is best summarized by the author in the preface: 'Through an analysis of its unique characteristics, evolving business role, and historical precedents, I will argue that IT's strategic importance is not growing, as many have claimed or assumed, but diminishing. As IT has become more powerful, more standardized, and more affordable, it has been transformed from a proprietary technology that companies can use to gain an edge over their rivals into an infrastructural technology that is shared by all competitors.
Information technology has increasingly become, in other words, a simple factor of production—a commodity input that is necessary for competitiveness but insufficient for advantage.' The outline of the book is as follows: 'I open with a brief introductory chapter, 'Technological Transformations,' that provides an overview of my thesis and underscores the value of examining IT from a strategic perspective.
I stress in this chapter what I see as the central—and positive—message of this book: that IT's transformation from a set of proprietary and heterogeneous systems into a shared and standardized infrastructure is a natural, necessary, and healthy process. It is only by becoming an infrastructure—a common resource—that IT can deliver its greatest economic and social benefits.'
Every IT manager and especially every senior IT manager should read this book. It blows up the myth-making that vendors (and I work for one of those vendors) do regarding how IT can give you a competitive advantage, when outside of several narrow industries (hedge funds and investment banks can get an edge on the competition, for example) IT is a commodity input to production and you can't do better than your competition; therefore, the author advocates not trying to be a first-mover on new technology.
Let another firm blaze the trail, then follow right behind. Let them absorb the higher costs.
Nicholas Carr is the author of the Pulitzer Prize finalist The Shallows, the best-selling The Big Switch, and Does IT Matter? His acclaimed new book, The Glass Cage: Automation and Us, examines the personal and social consequences of our ever growing dependence on computers and software.
Former executive editor of the Harvard Business Review, he has written for The Atlantic, New York Times, Wall Street Journal, and Wired. He lives in Colorado.
In 2003 Nicholas Carr wrote a provocative article for HBR titled “IT Doesn’t Matter,” in which he stated: “IT is best seen as the latest in a series of broadly adopted technologies that have reshaped industry over the past two centuries — from the steam engine and the railroad to the telegraph and the telephone to the electric generator and the internal combustion engine. For a brief period, as they were being built into the infrastructure of commerce, all these technologies opened opportunities for forward-looking companies to gain real advantages. But as their availability increased and their cost decreased — as they became ubiquitous — they became commodity inputs. From a strategic standpoint, they became invisible; they no longer mattered.” This argument was derided by IT supply-side executives such as Steve Ballmer, Carly Fiorina, and Scott McNealy, but CEOs quietly applauded it.
They had suspected all along that IT really doesn’t matter. Company leaders have quoted and lauded Carr whenever they’ve needed to justify their hesitation to create strong, progressive IT positions. And they hesitate to create strong, progressive IT positions all the time. In fact, CEOs avoid IT like the plague. They resist getting their hands dirty alongside the CIO, even though many of them will readily get down into the mud of a balance sheet with the CFO or strategize the details of global brand issues with the CMO.
Because they distance themselves from IT, CEOs don’t grasp its subtleties. Nor do they understand the CIO’s role or, typically, the technologies that the company deploys.
Consider the meager corporate progress over the past decades in easing two long-running headaches: enterprise-computing implementations and corporate security. Even after more than 20 years of implementations, one study shows that 53% of ERP projects still run over budget, 61% take longer to complete than anticipated, and more than 27% fail to produce the positive ROI expected. Recently published data indicate that, at a basic level, more than 27% of computers are infected with malware. Data breaches are on the rise, with a 44% increase in the number of records exposed from 2011 to 2012. There have been breaches at major companies involving, variously, 1.5 million records, 600,000 credit cards, 1.5 million passwords, 6.5 million passwords, 24 million records, and 160 million credit card numbers, and even a government agency lost 3.5 million records. Social media is now part of the security picture too: corporate use of it grew 40% between Q2 and Q4 of 2012, but the growth was accompanied by hacks such as those at The Associated Press, the FT, Human Rights Watch, France 24, the BBC, and Burger King, all of which reveal a deficit of security measures and a poor contextual understanding of the technology.
These and other IT-related problems aren’t rooted in technology but in leadership failings. The people in the C-suite don’t understand IT problems, don’t provide adequate resources to solve them, and don’t approach the issues as members of unified technology-literate teams. To address these shortcomings, companies can take action in three areas. Literacy: The senior leadership needs to become literate in technology.
IT isn’t somebody else’s job; it’s ultimately theirs. Boards should require that CEO candidates demonstrate not just knowledge of finance and marketing but also a technology aptitude. Accountability: Boards should make CEOs accountable for technology failures and data breaches. The compensation committee should push for clearer links between pay and performance for IT-related activity (which ultimately is nearly everything most firms do). These links should be described clearly in the annual report so that analysts can scrutinize them.
The senior leadership group mustn’t just pay lip service to the CIO and his or her team. The CIO’s group is at the core of the business; it runs the company’s nervous system (ERP) and immune system (security) and connects all internal and external entities. Technology updates should be provided to the senior management team with the same frequency and rigor as financial statements and signed off on by the leadership team as part of the pay-for-performance framework. It’s true, in a sense, that enterprise computing is like a utility.
Data flows through every company like water, gas, and electricity. But there’s a difference. Computing’s functionality undergoes constant, dramatic increases, and as it does so, it opens huge new opportunities and leaves the company vulnerable in unexpected ways. While technology can’t give you a permanent competitive advantage, timely deployment of new IT products, processes, and systems can enable you to build a strong competitive position. Corporations’ technology strategies will remain ineffective until leaders acknowledge that, now as always, IT does matter.
It's 2003. IT is becalmed, in the doldrums, in limbo. The flurry of activity and inexplicable spending that was the Y2K tsunami has long since blown over. Some still refer to the millennium bug as IT's finest hour. Others believe it was a totally manufactured crisis. Either way, in 2003, the business side of the house is no longer supporting any of IT's fantasies.
The Standish Group has just published the results of five years of analysis on the failure rate of IT projects: a depressing 65% of IT projects fail. The dominant project methodology is Waterfall (popular since the 1970s). Looking at the breakdown of the average IT spend, approximately 20-25% is committed to keeping everyday IT operations up and running, with the remainder going to innovation and new solutions for the business. The business is reacting to Y2K and pushing for more stability, availability and reliability from their IT systems. The business is demanding real business value and ROI from their IT spend, not pie-in-the-sky touchy-feely measurements. IT is struggling to respond.
Struggling to translate real IT performance gains into demonstrable value for the business. Struggling to understand its role in the confusing new millennium. Struggling, in some way, to justify its very existence. Around this same time, a small group of unconventional programmers gathered at a ski resort in Utah in February 2001 and wrote the Agile Manifesto. More about this milestone in the evolution of IT later. Economically, in 2003, we are in a downturn.
The collapse of Enron and WorldCom has vividly demonstrated that no company is 'too big to fail.' These scandals led to the passage of the Sarbanes-Oxley Act in 2002, which contains provisions under which key executives can now be sent to prison for falsifying the financials of public companies. IT is to play a major role in the financial reporting systems of public companies. In addition to the heavy operational focus of most IT shops, new compliance requirements are added to IT shops already heavily burdened with a plethora of audits. Politically, in 2003, we are reeling from recent events.
Our first presidential election of the new millennium took six weeks to decide because of the method used by Florida to collect simple data (hanging chads). The headlines in Europe call it a real 'Mickey Mouse' operation. Less than two years prior, 9/11 had caused us all to question everything. Even the long-vaunted FBI had been caught with its IT computing pants down. The nation's top cops, famed for their ability to gather and sift through huge volumes of information, are exposed as laggards in 2001, dependent on outdated systems that do not have a prayer of keeping up with the exponentially increasing demand. Systems that were linked, at least indirectly, to the domestic intelligence failures leading up to 9/11. We went from invincible to vulnerable in the span of one sunny Tuesday morning.
It is a difficult time for the U.S., and a difficult time to be in IT.
As if things couldn't get any worse for IT, in the May 2003 issue of the Harvard Business Review comes an article by Nicholas Carr entitled 'IT Doesn't Matter.' The premise is simple and direct: IT, like so many other technological innovations before it, has become a commodity. IT no longer represents a strategic business advantage. No longer can one organization exercise technological dominance using IT as its lever. Now, every organization has roughly the same IT: networks, routers, servers, databases, websites, email and so on.
Carr further points out that, now that IT has reached this stage of commoditization, it represents a major risk to the business enterprise: the risk of not being there. IT outages that were mere annoyances in the past now place the business at a significant disadvantage. This development in the evolution of IT was not just a shift in thinking, but a strategic shift in where you put your IT spend. In fact, this fundamental shift in thinking (and spending) was well underway.
Every organization on the planet had been caught on the horns of this dilemma three years earlier during the Y2K mess. But there was simply far too much to do at the time and no downtime to think about why we were doing it. Y2K was the embodiment of Carr's basic premise: IT represented a strategic risk for the business. The industry's response to Carr's article was swift and pointed.
Its aim was to debunk the premise that IT and its technology were little more than 21st-century plumbing. The overriding theme of the responses to Carr's article can be paraphrased as, 'Sure, we all have the same technology, but it totally depends on what you do with it! Look at Cisco and Wal-Mart and Dell: they are innovative.' (Remember, this is 2003!) So, the real question is: Was Nicholas Carr right in 2003? Is he right today?
It's time for a change!
Fast forward to 2017. What has changed? The Standish Group has now been measuring the failure rate of IT projects for 20 years. The IT project failure rate still sits at a depressing 65%, with very little fluctuation from year to year. The dominant project methodology is still Waterfall. In 2017, most IT operations groups remain focused on delivering reliable, stable, secure services with a minimum of downtime. Now, however, approximately 75-80% of the IT spend is committed to keeping the lights on, with the tiny remainder going to innovation and business solutions.
Is this progress? Today, the business is pushing for more creativity, flexibility and innovation from their IT systems. Yet many IT operations groups, stuck in post-Y2K mode, remain hunkered down and determined to create the most bulletproof environments in history. Meanwhile, the business side has been scrambling to maintain its relevance while competing in our new digital world.
Smartphones and tablets that are always connected. More savvy consumers. Drastically compressed timeframes. The business now knows what needs to be done and they know how fast they need it done.
The business has started pushing development (dev) and project management (PM) to get with the program. Pushing them hard. With 20 years of data to back it up, dev and PM have to admit that they are broken.
The Waterfall project methodology is simply not working for most development projects. There are better ways to develop software, and dev and PM have begun to embrace Agile. Crisis averted.
Business, dev and project management are all in sync, oh my! Breaking into their happy dance and all's right with the world!
Evolution of IT leaves IT ops behind
Whoa! Yes, things are finally good with biz, dev and PM, but nobody told ops. You remember IT ops, the people who actually deploy all these new wondrous Agile-developed business solutions so that actual value can actually be realized by the business. IT operations is still operating under the old rules.
The 2003 rules. Nothing has changed for them. Well, that's not quite true.
They are trying to apply the 2003 rules to a world with smartphones, Wi-Fi and ubiquitous cloud environments. They are still trying to: lock it down, resist change, make it bulletproof, defeat hackers, safeguard availability, increase reliability and maintain stability, and, indirectly, frustrate the heck out of the business and dev and PM as they try to deliver at the new pace. Ops seems to represent a roadblock to the fast lane of Agile deployment. Ops is not a little off-kilter, but rather is suffering from a complete disconnect.
A different definition of value and success. Like landing on a different planet where everything seems upside down.
A Twilight Zone episode where all the beliefs you have valued and cherished are up in smoke. Life, literally, doesn't make sense anymore. Can this chasm be crossed? Can two such diametrically opposed sets of values ever be brought into alignment?
Even more than that: Can two sides that seem to be polar opposites actually exist (no, thrive) under a common set of rules? The popular buzzwords are digital transformation. But haven't we actually been digitally transforming since before the mid-1990s? Yes, so let's call it what it is: a value transformation. (Which means it's a people transformation.) Go to part two of Spalding's essay on the evolution of IT for a discussion of DevOps' role in delivering today's definition of business value.