Technology predictions are inherently hard. Not only is the underlying science often unpredictable, but human reactions frequently confound even experts. As Yogi Berra said, "If people don't want to come out to the ballpark, how are you going to stop them?" And very often people decide not to come to the ballpark that the technologists build, or at least not as quickly as the technologists hope and expect. One striking example is that of video. TiVo and Netflix are each about a decade old. The overwhelming consensus is that DVRs (digital video recorders), which TiVo pioneered, are the future, and consumers who adopt them rave about the service. A similarly overwhelming consensus is that physical distribution of DVDs is a doomed business, and few people hold out much hope for Netflix to find its way to a sustained presence in an online environment. Yet Netflix is nicely profitable, while TiVo has been struggling from the beginning. So one has to be cautious about relying on any predictions, no matter how expert and renowned the source. It is essential to collect real data to get a sense of how the interplay of technology, economics, and human choices plays out.
Even though precise prediction is impossible, one can make educated guesses, especially if one understands the basic underlying technology trends, and if one pays attention to what happens. Internet traffic growth patterns have changed, but usually not on short time scales. This page updates the discussion on sources of Internet traffic growth from CO2002b, Odl2003c.
| Year | U.S. Internet traffic, year-end estimate (PB/month) |
| --- | --- |
| 1997 | 2.5 - 4 |
| 1998 | 5 - 8 |
| 1999 | 10 - 16 |
| 2000 | 20 - 35 |
| 2001 | 40 - 70 |
| 2002 | 80 - 140 |
| 2003 | 130 - 210 |
| 2004 | 200 - 300 |
| 2005 | 300 - 500 |
| 2006 | 450 - 800 |
| 2007 | 750 - 1,250 |
| 2008 | 1,200 - 1,800 |
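The growth rates implied by the table can be checked directly: the ratio of consecutive midpoint estimates (which is independent of the units used) shows roughly a doubling each year through 2002, subsiding to about 1.5x per year afterwards. A minimal sketch:

```python
# Year-end midpoint estimates from the table above.
midpoints = {
    1997: (2.5 + 4) / 2,   1998: (5 + 8) / 2,
    1999: (10 + 16) / 2,   2000: (20 + 35) / 2,
    2001: (40 + 70) / 2,   2002: (80 + 140) / 2,
    2003: (130 + 210) / 2, 2004: (200 + 300) / 2,
    2005: (300 + 500) / 2, 2006: (450 + 800) / 2,
    2007: (750 + 1250) / 2, 2008: (1200 + 1800) / 2,
}

# Year-over-year growth factor implied by consecutive midpoints.
years = sorted(midpoints)
for prev, cur in zip(years, years[1:]):
    factor = midpoints[cur] / midpoints[prev]
    print(f"{prev}->{cur}: {factor:.2f}x ({(factor - 1) * 100:.0f}% growth)")
```

The 1997-2002 ratios come out at or near 2.00x, while the 2002-2008 ratios cluster around 1.5x-1.6x, consistent with the discussion below.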
There was a very regular doubling of traffic in the early 1990s, which was well and reliably documented, since most of the traffic was on the NSF backbone, where careful measurements were made and published. Then (and from 1995 on everything is an estimate, since no verified data is available CO1998, Odl2003c) there was a period of about 10x growth in each of 1995 and 1996, as graphics-rich Web traffic replaced text-oriented material, and millions of users of proprietary online services came onto the Internet. (Hence the WorldCom/UUNET claims of traffic "doubling every 100 days" ODell2000, which were ridiculous and transparently false when made in 2000, and should have been greeted with hoots of derision and indignation instead of applause, do reflect a much earlier period when such rates actually held.) But then, starting in 1997, growth subsided towards an approximate doubling each year (meaning a growth rate of 70 to 150% per year CO1998), and continued at that rate into the early years of this decade. (The 4x annual growth claimed to hold and predicted for the future by some of the other speakers at the early 2000 symposium ODell2000, as well as in late 2001 by Larry Roberts and Caspian Networks Odl2001, appears to have been unrealistic.) But since that time, as is documented in this study, growth rates of Internet traffic have been declining even further, generally to the 50% a year range. (And thus the "Moore's law for data traffic," predicting a continuing doubling each year CO1998, CO2002a, and CO2002b, has been refuted.) Still, it is possible that much higher growth rates might occur, including (at least for short periods) the 4x to 6x annual growth predicted by John Chambers of Cisco. There are potential sources that could drive such a surge, and the question is whether it will occur.
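The various growth claims above can be put on a common footing by converting doubling periods to annual growth factors. A quick sketch, using the figures quoted in the text:

```python
import math

def annual_factor(doubling_days):
    """Annual growth factor implied by a given doubling period."""
    return 2 ** (365 / doubling_days)

# "Doubling every 100 days" (the WorldCom/UUNET claim):
print(f"{annual_factor(100):.1f}x per year")  # about 12.6x, i.e. over 1,000% annual growth

# Doubling once a year (the "Moore's law for data traffic"):
print(f"{annual_factor(365):.1f}x per year")  # 2.0x, i.e. 100% annual growth

# Conversely, 50% annual growth corresponds to a doubling time of:
print(f"{math.log(2) / math.log(1.5) * 365:.0f} days")  # about 624 days
```

The gap between roughly 12.6x and 1.5x per year makes plain how far the "100 days" claim was from the observed doubling, and how far current growth is from even that.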
Internet traffic is just one type of data traffic, although the most prominent one. What stimulates its growth is a complex set of feedback loops operating on different time scales, involving human adoption of new services and improvements in processing, storage, and transmission technologies.
The Information and Communication Technologies (ICT) revolution has been driven by a variety of Moore's laws. The original Moore's Law, formulated by Gordon Moore in 1965, referred just to the number of transistors that could be placed on a single chip, and predicted that this number would continue doubling every year for the next few years. A decade later, based on more data, Moore revised his "law" to predict a doubling of transistors on a chip every 18 months, a prediction that held true for an astonishing quarter of a century. Recently, however, the doubling of transistor counts has slowed down to about once every two years ITRS2006.
The term Moore's Law has come to be used generically for technological advances that occur at predictable and exponential (in the precise mathematical sense of the word, meaning increases by a fixed factor each year) rates. And indeed we have many examples of such Moore's laws, except that in most cases they are not as steady in their growth rates as the original Moore's Law for transistor counts, and often they are far slower. (For an interesting collection of data points from various information technologies, see JCMT.) For example, battery capacities are increasing very slowly, which limits what can be done with wireless systems. And screen resolutions are increasing faster than battery storage capacities, but at far more modest rates than transistor densities.
The three main drivers of the ICT revolution are transmission capacity, processing power, and storage. Semiconductor (DRAM, SRAM, ...) storage densities used to grow at the old Moore's Law pace of about 60% gain per year, and are now at 40-50%, with the pace of advance having slowed down to a doubling every two years. Hard disk densities (and volumes of hard disk storage shipped, see Table 8.1 in CO2002a) used to grow at about 60% per year in the mid-1990s, and then accelerated to a doubling each year, see Groch2003. However, recently there has been a dramatic slowdown in progress in magnetic recording (as well as a cutback in spending on memories during the 2001-2003 tech crash), so that the IDC projections from 2000 reflected in Table 8.1 in CO2002a were not fulfilled. The latest projections EMCIDC2007 are for total digital storage capacity growth of only 35% per year for the rest of this decade. This conflicts with predictions from many sources (academic, medical sector, government, and industry) of storage needs doubling each year for the foreseeable future, and how this conflict gets resolved is an interesting and vital question for many technologies and industries.
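The conflict between 35% annual capacity growth and demand doubling each year compounds rapidly. A minimal sketch of how quickly the two projections diverge from a common base (the four-year horizon is an illustrative choice):

```python
# Compound the two projected growth rates from the same 2006 base:
# 35%/yr capacity growth (EMCIDC2007) vs. the 100%/yr demand growth
# predicted by many sources.
base = 185_000  # PB of worldwide digital storage capacity in 2006 (EMCIDC2007)

for years in range(1, 5):
    capacity = base * 1.35 ** years
    demand = base * 2.00 ** years
    print(f"after {years} yr: demand/capacity = {demand / capacity:.2f}x")
```

After just four years of these rates, projected demand exceeds projected capacity by nearly a factor of five, which is why the resolution of this conflict matters so much.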
Growth in processing power has generally been regarded as following the original Moore's Law prediction of 60% gain every year. For the most powerful computers in the world, those on the Top 500 list Top500, the combined processing power of those 500 machines, at least as measured by the benchmark used there (a benchmark that is unrealistic for most situations), has actually been growing at close to 100% per year. These growth rates were accomplished through combinations of increased clock speeds, elaborate cache hierarchies, and use of more chips. But now many of those avenues for improvement are losing their power, and there is uncertainty about the rate at which effective processing capacity will be increasing. Power consumption, in particular, is an increasingly serious limitation. The main trend now is towards putting more cores (processing elements) on a single chip. This promises huge improvements in pure number-crunching power, but there are serious questions as to what this will mean, since there is a lack of knowledge of how to program these multi-core machines, and the communication bandwidth off the chip will not be growing proportionately to the processing power.
For transmission, growth trends are even less clear. The telecommunications system is very complex, with a variety of interconnected networks and a heterogeneous mix of technologies. However, it appears that growth rates of 50% per year can be sustained without substantial increases in spending.
The main conclusion from the above discussion is that the progress over the next few years of technologies related to communications is harder to predict than over the past decade, say. An important fact to keep in mind, though, is that there are huge potential sources of Internet traffic that already exist. As was already discussed in CO2002b, for example, the volume of hard disk storage has traditionally far outstripped the transmission capacity of long distance networks. The recent study EMCIDC2007 estimates that in 2006, there were 185,000 PB of digital storage capacity in the world. Combined with our estimate of around 2,500 PB of Internet traffic per month in 2006, this implies it would take six years to transmit all the bits on all those storage systems through the current networks at current rates. Clearly most of those bits are redundant or duplicative, but this is a nice thought exercise that demonstrates that just a slight change in the velocity with which information circulates can have a large impact on Internet traffic. (And it is worth keeping in mind that, as has been well known for a long time, and will continue to be true for many years, the least expensive method for shipping large quantities of data is by physical shipment of optical or magnetic media Odl2003b.)
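The six-year figure follows directly from the two estimates quoted above; the arithmetic is simple enough to verify in a few lines:

```python
total_storage_pb = 185_000    # worldwide digital storage capacity, 2006 (EMCIDC2007)
traffic_pb_per_month = 2_500  # estimated worldwide Internet traffic per month, 2006

months = total_storage_pb / traffic_pb_per_month
print(f"{months:.0f} months, about {months / 12:.1f} years")  # 74 months, about 6.2 years
```

So even a small increase in the fraction of stored bits that actually move across the network each month would translate into a large jump in traffic.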
Furthermore, there is a huge volume of broadcast video (over-the-air and through cable) that is far larger than today's Internet traffic. The key question is how quickly this video (as well as new types of video, such as YouTube) will migrate to the Internet Odl2003a. Let us recall that a decade ago, when the first VoIP (voice over IP) systems were fielded, there were widespread concerns that they would overwhelm the Internet. And indeed, the volumes of voice traffic in those days were far larger than the Internet could handle CO1998, and so a rapid migration of voice to VoIP would have crashed the network. But VoIP took about a decade to take over, and by that time Internet traffic dwarfed voice traffic.
The final conclusion from these discussions is that although there are great uncertainties about technology developments over the next few years, Internet traffic growth is likely to be determined more by business models adopted by service providers (and content producers), services that are offered to users, services that users create on their own, and user acceptance of new applications. None of these can be predicted a priori, so careful monitoring of Internet traffic appears called for.