Gordon Moore, the Intel co-founder who foresaw the relentless advance of technology
A look at the nonstop acceleration of Internet technology
COMMENTARY | September 20, 2009
Will there ever be a pause so that the media can adapt to enormous technological change? Yes, there’s a possibility, writes Jon Palfreman. Maybe in 2019. (This is one of several articles from the Fall 2009 issue of Nieman Reports dealing with journalists adapting to change in how they work and what they produce.)
By Jon Palfreman
Seems simple enough: Click on a link to read [a Nieman Reports print] article online. But behind the scenes, matters are more complicated. Your click doesn’t connect to the Nieman Foundation’s offices in Cambridge. In fact, the computer servers with this and other Nieman Reports articles aren’t even in Massachusetts; they are in Arizona. The click sets in motion a request, and the article comes back broken up into tiny data packets that travel across the Internet by different routes—down copper wires, along fibers of optically pure glass, and through air—to reach your computer in San Francisco, New York or London, where the packets are reassembled and displayed as text.
Dozens of machines are involved in every online mouse click, executed so quickly that you’re completely unaware of them. Perhaps you think you don’t care, but I hope my words will convince you otherwise.
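The packet journey described above can be sketched as a toy simulation. This is a deliberate simplification of what TCP/IP actually does, and the function names are illustrative, but it captures the essential trick: packets carry sequence numbers, so they can arrive out of order and still be reassembled correctly.

```python
import random

def send(message, packet_size=8):
    """Split a message into numbered packets, roughly as a web page is split for transit."""
    chunks = [message[i:i + packet_size] for i in range(0, len(message), packet_size)]
    packets = list(enumerate(chunks))   # each packet: (sequence number, data)
    random.shuffle(packets)             # packets may arrive out of order via different routes
    return packets

def receive(packets):
    """Reassemble the original message by sorting packets on their sequence numbers."""
    return "".join(chunk for _, chunk in sorted(packets))

article = "Seems simple enough: Click on a link to read an article online."
assert receive(send(article)) == article
```

The sequence number is what lets a packet travel down copper in one city and through the air in another, yet land in the right place on your screen.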
Ten years ago, Nieman Reports was essentially a print-only magazine. The Internet was seen as just one storm cloud among many that threatened journalism’s future. In a special 1999 edition of Nieman Reports on the future of journalism, “The Business of News, The News About Business,” there is only one reference to the Internet. Lou Ureneck, now chairman of the journalism department at Boston University, perceptively warned that the Internet “threatens the pot of gold at the back of newspapers—the classifieds.” He was right.
The Internet has not only disrupted the business and practice of journalism, it has changed our world in fundamental ways. Today, American business and government conduct virtually all of their transactions via the Web. According to the Pew Research Center’s Internet & American Life Project, 74 percent of Americans use the Internet, and use it to accomplish a growing list of tasks. Along the way we interact with numerous dot-com enterprises: from Web mail services like Gmail, Hotmail or Yahoo!, to data storage services like Box.net, IDrive, iDisk and Mozy. We upload pictures to Flickr, SmugMug and Photobucket, edit videos with Avid, Final Cut Pro and JayCut, upload our creations to YouTube and Vimeo, buy and sell items on Craigslist and eBay, exchange multimedia messages through MySpace and Facebook, talk to each other on Twitter, compose documents with Google Docs, crunch spreadsheets with Zoho, aggregate news with Bloglines and Google Reader, and even manage projects in Basecamp. Most of these companies didn’t exist in 1999. Google, founded in 1998, has become one of the most powerful and influential corporations on the globe.
How has so much change happened so rapidly? Part of the answer can be found in the workings of the underlying digital technology that so few of us bother to understand. Its ascent is unlike any in history. The IBM PC on Nieman Reports’ Editor Melissa Ludtke’s desk today is 30 or 40 times more powerful than the Gateway computer she had in 1999. This spectacular improvement conforms to Moore’s Law, named after Intel co-founder Gordon Moore, who sagely predicted in 1965 that roughly every two years, electronic components would get half as large, twice as fast, and consume half as much electricity. He was right. If Ludtke’s car had realized the same efficiency gains, a vehicle that got 20 miles per gallon in 1999 would be getting 640 miles per gallon today.
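The arithmetic behind these figures is simple compounding, and worth seeing once. A doubling every two years means five doublings between 1999 and 2009 and ten by 2019 (the function name below is mine):

```python
def moores_law_multiplier(years, doubling_period=2):
    """Performance multiplier after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# 1999 -> 2009: five doublings, i.e. a 32x gain ("30 or 40 times more powerful")
print(moores_law_multiplier(10))       # 32.0
# The car analogy: 20 mpg in 1999, scaled by the same factor
print(20 * moores_law_multiplier(10))  # 640.0 miles per gallon
# 1999 -> 2019: ten doublings, "some 1,000 times more powerful"
print(moores_law_multiplier(20))       # 1024.0
```

Exponential curves look flat at first and then vertical, which is why a decade of Moore's Law feels less like improvement and more like upheaval.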
The relentless exponential improvement of digital technology is historically unique. Most technologies develop within physical constraints. There are limits, for example, to how fast planes can travel (without burning up) or how high buildings can be built (without falling down), and these physical boundaries act as a brake on progress, giving human beings a chance to adapt. But digital technology, which involves bits of information, not lumps of matter, is different. It just keeps getting better, faster and cheaper.
It’s exciting but very disruptive. It enables new companies like Google to emerge, grow rapidly and change the world, with little time for long-established institutions like journalism to adapt. It follows that unless digital technology runs into some kind of major obstacle, the world of 2019—by which time Ludtke’s “computer” will be some 1,000 times more powerful than her 1999 machine—will again be turned upside down, creating transformative opportunities for new digital ventures to share the stage with powerhouses like Google and Amazon—or perhaps to push them aside.
What happens with journalism? Newspaper editors, producers, journalism professors, and even new media gurus, given their record, aren’t likely to reliably anticipate journalism’s trajectory, so let’s look instead at where the technological infrastructure of the Internet is headed.
An interesting and potentially disruptive trend is “cloud computing.” A decade or so ago, companies needed to set up dedicated IT departments with their own data storage. Not any more. Today, Google, Microsoft, Yahoo!, Amazon and others offer an alternative: They’re building and operating vast Internet data centers offering data processing as a utility to anyone. In “cloud computing” the data, processing power, and software are stored in the Internet cloud rather than on the user’s computer.
The physical demands of running cloud computing are significant: They include the construction of a series of gigantic air-conditioned Internet temples—“server farms” that house racks and racks of computer servers. And powering these farms puts environmental concerns into the mix of this technological advance. Microsoft opened a server farm in Quincy, Washington, in 2007; it is larger than 10 football fields. Google’s server farm in The Dalles, Oregon, is almost as big. Apple is building one in North Carolina. According to McKinsey & Company, Internet data centers house around 44 million computers—machines that enable users to rent cars, buy books, communicate via social media with friends, and upload videos to Facebook or YouTube. On these 44 million machines everything from Wikipedia entries to YouTube videos is physically stored, and it’s where Facebook’s 250 million users keep more than 15 billion photos of themselves and their friends.
Almost everything needed to run a business can be outsourced to the cloud, where, the argument goes, professional data processing companies can usually do it better and more cheaply. If cloud computing takes off, changes in the next decade could eclipse those of the last in making it easy, in principle, for a tiny operation to use the same advanced IT as a large company. New or expanding companies won’t need to make a huge capital investment in information technology when they can buy processing by the bit, scaling up slowly or, if they need to, rapidly.
Take, for example, the New York-based new media company Animoto, a service that turns customer-supplied images and music into Web-based video presentations. In 2008, Animoto found that demand was skyrocketing; reportedly 750,000 people signed up in a three-day period. Instead of buying new servers, Animoto contracted with Amazon’s new computing service, Elastic Compute Cloud (EC2), to add capacity for about 10 cents per server per hour, absorbing the huge spike in demand. When demand dropped, Animoto was able to scale down easily.
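The economics of a burst like Animoto's can be roughed out. Only the roughly 10-cents-per-server-hour rate comes from the reporting; the server count and duration below are invented purely for illustration:

```python
def burst_cost(servers, hours, rate_per_server_hour=0.10):
    """Pay-as-you-go cost of a temporary capacity burst, EC2-style."""
    return servers * hours * rate_per_server_hour

# Hypothetical spike: 400 extra servers rented for a three-day rush (72 hours).
# These numbers are assumptions, not Animoto's actual figures.
print(burst_cost(400, 72))  # 2880.0 dollars
```

A few thousand dollars for the spike, versus the capital cost of buying hundreds of machines that would sit idle once demand fell back, which is the whole argument for renting processing by the bit.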
If cloud computing is the next big thing, then why isn’t the news media sticking its head in the cloud? So far there are only a few examples, including these:
• In late 2007, The New York Times decided to make its archive of back issues—11 million articles covering the period 1851-1922—more accessible to users. Rather than loading the articles onto its own servers, the Times outsourced the job; now this archive is stored on Amazon servers somewhere in the United States.
• Telegraph Media Group (TMG) in the United Kingdom, publisher of The Daily Telegraph and The Sunday Telegraph, has gone further and made arrangements with several cloud providers. Google provides TMG’s 1,400 employees with the Google Apps platform, a suite of communication and document applications. TMG also outsources customer management activities such as subscription services and advertising sales to a cloud provider called Salesforce.com.
Nieman Reports, like others, outsources to distant computer servers, as does the Nieman Journalism Lab, which uses a provider in Pittsburgh. With big potential cost savings, this is an option news organizations will likely consider. But it is not problem free: A range of environmental, financial, security, legal and privacy problems will need to be resolved along the way.
Energy and environmental concerns: A lot of energy is required to run the Internet cloud. According to Stanford University’s Jonathan Koomey, Internet data centers use about two percent of our nation’s electricity, and usage is increasing at about 15 percent a year. Growth in Internet use is thus overwhelming the efficiency gains of Moore’s Law and generating a significant and growing carbon footprint.
Financial: Rising energy and environmental costs also reinforce worries that some dot-coms are not properly monetized. According to a report by Credit Suisse, YouTube is losing money for its owner Google at a rapid pace—roughly $470 million in 2009 or more than a dollar for every YouTube click.
Security: Servers holding such data could experience power outages or get attacked by hackers. Or the cloud provider could go bankrupt. Already there have been a few embarrassing incidents: Google Docs users were shut out of their online word processor documents for about an hour on July 8, 2008, and Amazon customers (including The New York Times) lost access to data for a few hours on July 20, 2008 following a power outage.
Privacy: Lawyers have also raised the possibility that if an organization, such as a newspaper or university, stores its records online on a third party’s server (e-mails, for example) those documents might not have the same Fourth Amendment protections from unreasonable government search and seizure as data stored on a personal computer.
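Koomey's 15 percent annual growth figure, cited in the energy discussion above, compounds faster than intuition suggests. A back-of-the-envelope sketch (the assumption that total grid output stays flat is mine, not Koomey's):

```python
import math

def years_to_double(growth_rate):
    """Years for consumption growing at `growth_rate` per year to double."""
    return math.log(2) / math.log(1 + growth_rate)

# At 15 percent a year, data-center electricity use doubles roughly every five years.
print(round(years_to_double(0.15), 2))  # 4.96

# Naive projection of the 2 percent share a decade out, if total output stayed flat:
print(round(2 * 1.15 ** 10, 1))  # 8.1 percent
```

Even allowing for growth in the grid itself, a share that doubles twice in a decade explains why the cloud's carbon footprint has become a concern in its own right.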
Troubling as these issues are, the odds are that they can be worked out, and digital technology can be expected to continue its relentless and disruptive advance. This raises a broader question: Will there ever be a pause to give us time to adapt?
Perhaps, around 2019. That’s about when Gordon Moore thinks his law might fail. The transistors, the switches that are the basis of modern computer hardware, can’t keep on getting smaller indefinitely. When the transistor’s “gate,” which controls the flow of electrons, becomes too small—less than five nanometers (five billionths of a meter)—the transistor may no longer function as an effective switch and the game will be up. Until, of course, scientists and engineers invent something new.
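A rough way to see why 2019 is a plausible deadline: if the area of a transistor halves every two years, each linear dimension shrinks by a factor of the square root of two. The starting figure of roughly 45 nanometers in 2009 is my illustrative assumption, not a number from the article:

```python
def gate_length(start_nm, years, doubling_period=2):
    """Linear feature size if transistor area halves every `doubling_period` years,
    so each linear dimension shrinks by sqrt(2) per period."""
    return start_nm / (2 ** (years / (2 * doubling_period)))

# Projecting forward from an assumed ~45 nm feature size in 2009:
for year in (2009, 2015, 2019, 2023):
    print(year, round(gate_length(45, year - 2009), 1), "nm")
```

On this crude extrapolation the gate approaches the five-nanometer threshold around the end of the 2010s, which is roughly where Moore expects the trouble to start.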
So by 2019, or thereabouts, when Nieman Reports reflects on the state of the news media, we might have a chance to take a breath and consider the distance we’ve traveled and ponder the road ahead. Until then, expect new cloud computing dot-coms to further change our media landscape. The journalist’s hope is that among the next decade’s big winners will be some dot-coms (or dot-orgs) that have pioneered sustainable business models for reporting and communicating news.
Jon Palfreman, a 2006 Nieman Fellow, is KEZI Distinguished Professor of Broadcast Journalism at the University of Oregon. A veteran of both U.K. and U.S. television, he has made more than 40 BBC and PBS one-hour documentaries, including the Peabody Award-winning series “The Machine That Changed the World,” the Emmy Award-winning NOVA “Siamese Twins,” and the Alfred I. duPont-Columbia University Silver Baton winner “Harvest of Fear.”