Tag Archives: Exaflood

AT&T’s Exaflood Acquisition Good for Mobile Consumers and Internet Growth

AT&T’s announced purchase of T-Mobile is an exaflood acquisition — a response to the overwhelming proliferation of mobile computers and multimedia content and thus network traffic. The iPhone, iPad, and other mobile devices are pushing networks to their limits, and AT&T literally could not build cell sites (and acquire spectrum) fast enough to meet demand for coverage, capacity, and quality. Buying rather than building new capacity improves service today (or nearly today) — not years from now. It’s a home run for the companies — and for consumers.

We’re nearing 300 million mobile subscribers in the U.S., and Strategy Analytics estimates by 2014 we’ll add an additional 60 million connected devices like tablets, kiosks, remote sensors, medical monitors, and cars. All this means more connectivity, more of the time, for more people. Mobile data traffic on AT&T’s network rocketed 8,000% in the last four years. Remember that just a decade ago there was essentially no wireless data traffic. It was all voice traffic. A few rudimentary text applications existed, but not much more. By year-end 2010, AT&T was carrying around 12 petabytes per month of mobile traffic alone. The company expects another 8 to 10-fold rise over the next five years, when its mobile traffic could reach 150 petabytes per month. (We projected this type of growth in a series of reports and articles over the last decade.)
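A quick back-of-envelope check on these growth figures (the percentages are from the post; the arithmetic is mine):

```python
# 8,000% growth over four years means traffic ended at 81x its starting
# level (the original 1x plus 80x of growth).
total_multiple = 1 + 8000 / 100          # 81x
annual_growth = total_multiple ** (1 / 4) - 1
print(f"Implied compound annual growth: {annual_growth:.0%}")  # 200%
```

That is, traffic tripling every year for four straight years.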

The two companies’ networks and businesses are so complementary that AT&T thinks it can achieve $40 billion in cost savings. That’s more than the $39-billion deal price. Those huge efficiencies should help keep prices low in a market that already boasts the lowest prices in the world (just $0.04 per voice minute versus, say, $0.16 in Europe).

But those who focus only on the price of existing products (like voice minutes) and traditional metrics of “competition,” like how many national service providers there are, will miss the boat. Pushing voice prices down marginally from already low levels is not the paramount objective. Building fourth generation mobile multimedia networks is. Some wonder whether “consolidation of power could eventually lead to higher prices than consumers would otherwise see.” But “otherwise” assumes a future that isn’t going to happen. T-Mobile doesn’t have the spectrum or financial wherewithal to deploy a full 4G network. So the 4G networks of AT&T, Verizon, and Sprint (in addition to Clearwire and LightSquared) would have been competing against the 3G network of T-Mobile. A 3G network can’t compete on price with a 4G network because it can’t offer the same product. In many markets, inferior products can act as partial substitutes for more costly superior products. But in the digital world, next gen products are so much better and cheaper than the previous versions that older products quickly get left behind. Could T-Mobile have milked its 3G network serving mostly voice customers at bargain basement prices? Perhaps. But we already have a number of low-cost, bare-bones mobile voice providers.

The usual worries from the usual suspects in these merger battles go like this: First, assume a perfect market where all products are commodities, capacity is unlimited yet technology doesn’t change, and competitors are many. Then assume a drastic reduction in the number of competitors with no prospect of new market entrants. Then warn that prices could spike. It’s a story that may resemble some world, but not the one in which we live.

The merger’s boost to cell-site density is hugely important and should not be overlooked. Yes, we will simultaneously be deploying lots of new Wi-Fi nodes and femtocells (little mobile nodes in offices and homes), which help achieve greater coverage and capacity, but we still need more macrocells. AT&T’s acquisition will boost its total number of cell sites by 30%. In major markets like New York, San Francisco, and Chicago, the number of AT&T cell sites will grow by 25%-45%. In many areas, total capacity should double.

It’s not easy to build cell sites. You’ve got to find good locations, get local government approvals, acquire (or lease) the sites, plan the network, build the tower and network base station, connect it to your long-haul network with fiber-optic lines, and of course pay for it. In the last 20 years, the number of U.S. cell sites has grown from 5,000 to more than 250,000, but we still don’t have nearly enough. CEO Randall Stephenson says the T-Mobile purchase will achieve almost immediately a network expansion that would have taken five years through AT&T’s existing organic growth plan. Because of the nature of mobile traffic — i.e., it’s mobile and bandwidth is shared — the combination of the two networks should yield a more-than-linear increase in quality improvements. The increased cell-site density will give traffic planners much more flexibility to deliver high-capacity services than if the two companies operated separately.

The U.S. today has the most competitive mobile market in the world (second, perhaps, only to tiny Hong Kong). Yes, it’s true, even after the merger, the U.S. will still have a more “competitive” market than most. But “competition” is often not the most — or even a very — important metric in these fast-moving markets. In periods of undershoot, where a technology is not good enough to meet demand on quantity or quality, you often need integration to optimize the interfaces and the overall experience, a la the hand-in-glove pairing of the iPhone’s hardware, software, and network. Streaming a video to a tiny piece of plastic in your pocket moving at 60 miles per hour — with thousands of other devices competing for the same bandwidth — is not a commodity service. It’s very difficult. It requires millions of things across the network to go just right. These services often take heroic efforts and huge sums of capital just to make the systems work at all.

Over time technologies overshoot, markets modularize, and small price differences matter more. Products that seem inferior but which are “good enough” then begin to disrupt state-of-the-art offerings. This is what happened to the voice minute market over the last 20 years. Voice-over-IP, which initially was just “good enough,” made voice into a commodity. Competition played a big part, though Moore’s law was the chief driver of falling prices. Now that voice is close to free (though still not good enough on many mobile links) and data is king, we see the need for more integration to meet the new challenges of the multimedia exaflood. It’s a never-ending, dynamic cycle. (For much more on this view of technology markets, see Harvard Business School’s Clayton Christensen).

The merger will have its critics, but it seriously accelerates the coming of fourth generation mobile networks and the spread of broadband across America.

— Bret Swanson

Data roaming mischief . . . Another pebble in the digital river?

Mobile communications is among the healthiest of U.S. industries. Through a time of economic peril and now merely uncertainty, mobile innovation hasn’t wavered. It’s been a too-rare bright spot. Huge amounts of infrastructure investment, wildly proliferating software apps, too many devices to count. If anything, the industry is moving so fast on so many fronts that we risk not keeping up with needed capacity.

Mobile, perhaps not coincidentally, has also historically been a quite lightly regulated industry. But now emerging is a slow boil of many small rules, or proposed rules, that could threaten the sector’s success. I’m thinking of the “bill shock” proceeding, in which the FCC is looking at billing practices and various “remedies.” And the failure to settle the D block public safety spectrum issue in a timely manner. And now we have a group of rural mobile providers who want the FCC to set prices in the data roaming market.

You remember that “roaming” is when service provider A pays provider B for access to B’s network so that A’s customers can get service when they are outside A’s service area, or where it has capacity constraints, or for redundancy. These roaming agreements are numerous and have always been privately negotiated. The system works fine.

But now a group of provider A’s, who may not want to build large amounts of new network capacity to meet rising demand for mobile data, like video, Facebook, Twitter, and app downloads, want the FCC to mandate access to B’s networks at regulated prices. And in this case, the B’s have spent many tens of billions of dollars on spectrum and network equipment to provide fast data services, though even these investments can barely keep up with blazing demand.

The FCC has never regulated mobile phone rates, let alone data rates, let alone data roaming rates. And of course mobile voice and data rates have been dropping like rocks. These few rural providers are asking the FCC to step in where it hasn’t before. They are asking the FCC to impose old-time common carrier regulation in a modern competitive market – one in which the FCC has no authority to impose common carrier rules and prices.

In the chart above, we see U.S. info-tech investment in 2010 approached $500 billion. Communications equipment and structures (like cell phone towers) surpassed $105 billion. The fourth generation of mobile networks is just in its infancy. We will need to invest many tens of billions of dollars each year for the foreseeable future both to drive and accommodate Internet innovation, which spreads productivity enhancements and wealth across every sector in the economy.

It is perhaps not surprising that a small number of service providers who don’t invest as much in high-capacity networks might wish to gain artificially cheap access to the networks of the companies who invest tens of billions of dollars per year in their mobile networks alone. Who doesn’t like lower input prices? Who doesn’t like his competitors to do the heavy lifting and surf in his wake? But the also not surprising result of such a policy could be to reduce the amount that everyone invests in new networks. And this is simply an outcome the technology industry, and the entire country, cannot afford. The FCC itself has said that “broadband is the great infrastructure challenge of the early 21st century.”

Economist Michael Mandel has offered a useful analogy:

new regulations [are] like tossing small pebbles into a stream. Each pebble by itself would have very little effect on the flow of the stream. But throw in enough small pebbles and you can make a very effective dam.

Why does this happen? The answer is that each pebble by itself is harmless. But each pebble, by diverting the water into an ever-smaller area, creates a ‘negative externality’ that creates more turbulence and slows the water flow.

Similarly, apparently harmless regulations can create negative externalities that add up over time, by forcing companies to spend time and energy meeting the new requirements. That reduces business flexibility and hurts innovation and growth.

It may be true that none of the proposed new rules for wireless could alone bring down the sector. But keep piling them up, and you can dangerously slow an important economic juggernaut. Price controls for data roaming are a terrible idea.

World Catches On to the Exaflood

Researchers Martin Hilbert and Priscila Lopez add to the growing literature on the data explosion (what we long ago termed the “exaflood”) with a study of analog and digital information storage, transmission, and computation from 1986 through 2007. They found that in 2007, globally, we were able to store 290 exabytes, communicate almost 2 zettabytes, and compute around 6.4 exa-instructions per second (EIPS?) on general purpose computers. The numbers have gotten much, much larger since then. Here’s the Science paper (subscription), which appears alongside an entire special issue, “Dealing With Data,” and here’s a graphic from the Washington Post:

(Thanks to @AdamThierer for flagging the WashPost article.)

Mobile traffic grew 159% in 2010 . . . Tablets giving big boost

Among other findings in the latest version of Cisco’s always useful Internet traffic updates:

  • Mobile data traffic was even higher in 2010 than Cisco had projected in last year’s report. Actual growth was 159% (2.6x) versus projected growth of 149% (2.5x).
  • By 2015, we should see one mobile device per capita . . . worldwide. That means around 7.1 billion mobile devices compared to 7.2 billion people.
  • Mobile tablets (e.g., iPads) are likely to generate as much data traffic in 2015 as all mobile devices worldwide did in 2010.
  • Mobile traffic should grow at an annual compound rate of 92% through 2015. That would mean 26-fold growth between 2010 and 2015.
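Cisco’s headline numbers are internally consistent, as a quick computation shows (figures from the bullets above; the code is just an illustration):

```python
# 92% compound annual growth sustained for five years (2010 -> 2015)
multiple = 1.92 ** 5
print(f"{multiple:.1f}x")        # 26.1x, matching the projected 26-fold growth

# And 159% year-over-year growth is the same thing as a 2.59x multiple,
# which Cisco rounds to 2.6x.
yoy_multiple = 1 + 1.59
print(f"{yoy_multiple:.2f}x")    # 2.59x
```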

Did Phil and Tiger lead to Akamai’s record 3.45 terabit day?

Akamai announced a record peak in traffic volume on its content delivery network on April 9.

In addition to reaching a milestone for peak traffic served this past Friday, the Akamai network also hit a new peak during the same day for video streaming, as well as a near high for total requests served.

  • With online interest in major sporting events – including professional golf and baseball – helping to drive the surge in demand, Akamai delivered its largest ever traffic for high definition video streaming.
  • Over the course of the day, Akamai logged over 500 billion requests for content, a sum equal to serving content to every human once every 20 minutes.
  • At peak, Akamai supported over 12 million requests per second – a rate roughly equivalent to serving content to the entire population of the United States every 30 seconds.
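Those per-capita comparisons check out with simple division. A rough sketch (the population figures are my approximate 2010 assumptions, not from Akamai’s release):

```python
# ~500 billion requests/day spread across the world's population
requests_per_day = 500e9
world_population = 6.9e9                 # rough 2010 estimate (assumed)
servings_per_person_per_day = requests_per_day / world_population   # ~72
minutes_between_servings = 24 * 60 / servings_per_person_per_day
print(f"~{minutes_between_servings:.0f} minutes between servings")  # ~20

# 12 million requests/second measured against the U.S. population
us_population = 309e6                    # rough 2010 estimate (assumed)
print(f"U.S. served every ~{us_population / 12e6:.0f} seconds")
# ~26, i.e. roughly the "30 seconds" Akamai cites
```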

The first question that popped into my mind: Was this the work of Phil, Freddie, Tiger, and Tom? Last Friday I had noted to several friends the spectacular website of The Masters golf tournament and the high quality of its live action video streams. Looks as if lots of others noticed the compelling online video experience as well.

Exa Metrics

Here’s a new exaflood metric for you — tweets per second.

From the Twitter blog:

Folks were tweeting 5,000 times a day in 2007. By 2008, that number was 300,000, and by 2009 it had grown to 2.5 million per day. Tweets grew 1,400% last year to 35 million per day. Today, we are seeing 50 million tweets per day—that’s an average of 600 tweets per second. (Yes, we have TPS reports.)
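The per-second figure follows directly from the daily total:

```python
# 50 million tweets per day averaged over the 86,400 seconds in a day
tweets_per_day = 50_000_000
seconds_per_day = 24 * 60 * 60
tps = tweets_per_day / seconds_per_day
print(f"{tps:.0f} tweets per second")   # 579
```

which Twitter rounds up to “an average of 600 tweets per second.”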

Exa News

A number of interesting new articles and forums deal with our exaflood theme of the past few years.

“Striving to Map the Shape-Shifting Net” – by John Markoff – The New York Times – March 2, 2010

“Data, data, everywhere”The Economist – Special Report on Managing Information – February 25, 2010

“Managing the Exaflood” – American Association for the Advancement of Science – February 19, 2010

“Professors Find Ways to Keep Heads Above ‘Exaflood’ of Data” – Wired Campus – The Chronicle of Higher Education – February 24, 2010

Mobile traffic to grow 39x by 2014

Cisco’s latest Visual Networking Index, this one focusing on mobile data traffic, projects 108% compound growth through 2014.


The Wall Street Journal‘s Digits blog asks, “Could Verizon Handle Apple Tablet Traffic?”

The tablet’s little brother, the iPhone, has already shown how an explosion in data usage can overload a network, in this case AT&T’s. And the iPhone is hardly the kind of data guzzler the tablet is widely expected to be. After all, it’s one thing to squint at movies on a 3.5-inch screen and quite another to watch them in relatively cinematic 10 inches.

“Clearly this is an issue that needs to be fixed,” says Broadpoint Amtech analyst Brian Marshall. “It can grind the networks to a halt.”

The Digital Decade

A bunch of good metrics on the decade that was from Oliver Chiang. Here are a few:

–Number of e-mails sent per day in 2000: 12 billion

–Number of e-mails sent per day in 2009: 247 billion

–Revenues from mobile data services in the first half of 2000: $105 million

–Revenues from mobile data services in the first half of 2009: $19.5 billion

–Number of text messages sent in the U.S. per day in June 2000: 400,000

–Number of text messages sent in the U.S. per day in June 2009: 4.5 billion

–Number of pages indexed by Google in 2000: 1 billion

–Number of pages indexed by Google in 2008: 1 trillion

–Amount of hard-disk space $300 could buy in 2000: 20 to 30 gigabytes

–Amount of hard-disk space $300 could buy in 2009: 2,000 gigabytes (2 terabytes)

Finally . . . another HMI? study!

I loved poring through Berkeley’s 2000 and 2003 studies estimating answers to a very big question: How Much Information? How much digital information do we create and consume? Always lots of useful — and trivial — stuff in those reports. But where has HMI? been these last few years? Finally, UC-San Diego has picked up the torch and run with a new version, HMI? 2009.

So, you are asking, HMI? The UCSD team estimates that in 2008 outside of the workplace Americans consumed 3.6 zettabytes of information. That’s 3.6 x 10^21 bytes, or 3,600 billion billion.
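The unit conversion, for anyone checking the zeros (a trivial sketch of my own):

```python
# 1 zettabyte = 10^21 bytes; "billion billion" = 10^9 * 10^9 = 10^18
consumed_bytes = 3.6 * 10**21
billion_billion = 10**18
ratio = consumed_bytes / billion_billion
print(f"{ratio:,.0f} billion billion bytes")   # 3,600
```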

“HD”Tube: YouTube moves toward 1080p

YouTube is moving toward a 1080p Hi-Def video capability, just as we long predicted.

This video may be “1080p,” but the frame-rate is slow, and the video motion is thus not very smooth. George Ou estimates the bit-rate at 3.7 Mbps, which is not enough for real full-motion HD. But we’re moving quickly in that direction.

Two-year study finds fast changing Web

See our brief review of Arbor Networks’ new two-year study where they captured and analyzed 264 exabytes of Internet traffic. Highlights:

  • Internet traffic growing at least 45% annually.
  • Web video jumped to 52% of all Internet traffic from 42%.
  • P2P, although still substantial, dropped more than any other application.
  • Google, between 2007 and 2009, jumped from outside the top-ten global ISPs by traffic volume to the number 3 spot.
  • Comcast jumped from outside the top-ten to number 6.
  • Content delivery networks (CDNs) are now responsible for around 10% of global Internet traffic.
  • This fast-changing ecosystem is not amenable to rigid rules imposed from a central authority, as would be the case under “net neutrality” regulation.
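One way to make the 45% figure concrete: at that rate, total traffic doubles about every two years. A quick derivation (my arithmetic, using Arbor’s growth rate):

```python
import math

# Doubling time at 45% annual growth: solve (1.45)^t = 2 for t
growth_rate = 0.45
doubling_time_years = math.log(2) / math.log(1 + growth_rate)
print(f"Traffic doubles every {doubling_time_years:.1f} years")   # ~1.9
```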

Arbor’s new Net traffic report: “This is just the beginning…”

See this comprehensive new Web traffic study from Arbor Networks — “the largest study of global Internet traffic since the start of the commercial Internet.” 


Internet is at an inflection point

  • Transition from focus on connectivity to content
  • Old global Internet economic models are evolving
  • New entrants are reshaping definition / value of connectivity

New technologies are reshaping definition of network

  • “Web” / Desktop Applications, Cloud computing, CDN

Changes mean significant new commercial, security and engineering challenges

This is just the beginning…

These conclusions and the data Arbor tracked and reported largely followed our findings, projections, and predictions from two years ago, as well as an update from this spring.

Also see our analysis from last winter highlighting the evolution of content delivery networks — what my colleague George Gilder dubbed “storewidth” back in 1999 — and which Arbor now says is the fastest growing source/transmitter of Net traffic.

Did Cisco just blow $2.9 billion?

Cisco better hope wireless “net neutrality” does not happen. It just bought a company called Starent that helps wireless carriers manage the mobile exaflood.

See this partial description of Starent’s top product:

Intelligence at Work

Key to creating and delivering differentiated services—and meeting subscriber demand—is the ST40’s ability to recognize different traffic flows, which allows it to shape and manage bandwidth, while interacting with applications to a very fine degree. The system does this through its session intelligence that utilizes deep packet inspection (DPI) technology, service steering, and intelligent traffic control to dynamically monitor and control sessions on a per-subscriber/per-flow basis.

The ST40’s interaction with and understanding of key elements within the multimedia call—devices, applications, transport mechanisms, policies—assists in the service creation process by:

Providing a greater degree of information granularity and flexibility for billing, network planning, and usage trend analysis

Sharing information with external application servers that perform value-added processing

Exploiting user-specific attributes to launch unique applications on a per-subscriber basis

Extending mobility management information to non-mobility aware applications

Enabling policy, charging, and Quality of Service (QoS) features

Traffic management. QoS. Deep Packet Inspection. Per service billing. Special features and products. Many of these technologies and features could be outlawed or curtailed under net neutrality. And the whole booming wireless arena could suffer.


YouTube says it now serves up well over a billion videos a day — far more than previously thought.

An Exa-Prize for “Masters of Light”

Holy Swedish silica/on. It’s an exa-prize!

Calling them “Masters of Light,” the Royal Swedish Academy awarded the 2009 Nobel Prize in Physics to Charles Kao, for discoveries central to the development of optical fiber, and to Willard Boyle and George Smith of Bell Labs, for the invention of the charge-coupled device (CCD) digital imager.

Perhaps more than any two discoveries, these technologies are responsible for our current era of dramatically expanding cultural content and commercial opportunities across the Internet. I call this torrent of largely visual data gushing around the Web the “exaflood.” Exa means 10^18, and today monthly Internet traffic in the U.S. tops two exabytes. For all of 2009, global Internet traffic should reach 100 exabytes, equal to the contents of around 5,000,000 Libraries of Congress. By 2015, the U.S. might transmit 1,000 exabytes, the equivalent of two Libraries of Congress every second for the entire year.
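Working backward from those figures gives a feel for the units involved (the ~20-terabyte Library of Congress is implied by the post’s numbers, not stated in it):

```python
exabyte = 10**18
terabyte = 10**12

# 100 exabytes ~ 5,000,000 Libraries of Congress implies ~20 TB per Library
bytes_per_loc = 100 * exabyte / 5_000_000
print(f"Implied Library of Congress: {bytes_per_loc / terabyte:.0f} TB")  # 20

# 1,000 exabytes in a year, measured in Libraries of Congress per second
seconds_per_year = 365 * 24 * 3600
locs_per_second = 1000 * exabyte / bytes_per_loc / seconds_per_year
print(f"~{locs_per_second:.1f} Libraries of Congress per second")   # ~1.6
```

which is indeed on the order of the “two Libraries of Congress every second” cited above.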

Almost all this content is transmitted via fiber optics, where laser light pulsing billions of times a second carries information thousands of miles through astoundingly pure glass (silica). And much of this content is created using CCD imagers, the silicon microchips that turn photons into electrons in your digital cameras, camcorders, mobile phones, and medical devices. The basic science of the breakthroughs involves mastering the delicate but powerful reflective, refractive, and quantum photoelectric properties of both light and one of the world’s simplest and most abundant materials — sand. Also known in different forms as silica and silicon.

The innovations derived from Kao, Boyle, and Smith’s discoveries will continue cascading through global society for decades to come.

Leviathan Spam

Leviathan Spam

Send the bits with lasers and chips
See the bytes with LED lights

Wireless, optical, bandwidth boom
A flood of info, a global zoom

Now comes Lessig
Now comes Wu
To tell us what we cannot do

The Net, they say,
Is under attack
Before we can’t turn back

They know best
These coder kings
So they prohibit a billion things

What is on their list of don’ts?
Most everything we need the most

To make the Web work
We parse and label
We tag the bits to keep the Net stable

The cloud is not magic
It’s routers and switches
It takes a machine to move exadigits

Now Lessig tells us to route is illegal
To manage Net traffic, Wu’s ultimate evil

A New Leash on the Net?

Today, FCC chairman Julius Genachowski proposed new regulations on communications networks. We were among the very first opponents of these so-called “net neutrality” rules when they were proposed in concept back in 2004. Here are a number of our relevant articles over the past few years:

Can Microsoft Grasp the Internet Cloud?

See my new Forbes.com commentary on the Microsoft-Yahoo search partnership:

Ballmer appears now to get it. “The more searches, the more you learn,” he says. “Scale drives knowledge, which can turn around and drive innovation and relevance.”

Microsoft decided in 2008 to build 20 new data centers at a cost of $1 billion each. This was a dramatic commitment to the cloud. Conceived by Bill Gates’s successor, Ray Ozzie, the global platform would serve up a new generation of Web-based Office applications dubbed Azure. It would connect video gamers on its Xbox Live network. And it would host Microsoft’s Hotmail and search applications.

The new Bing search engine earned quick acclaim for relevant searches and better-than-Google pre-packaged details about popular health, transportation, location and news items. But with just 8.4% of the market, Microsoft’s $20 billion infrastructure commitment would be massively underutilized. Meanwhile, Yahoo, which still leads in news, sports and finance content, could not remotely afford to build a similar new search infrastructure to compete with Google and Microsoft. Thus, the combination. Yahoo and Microsoft can share Ballmer’s new global infrastructure.
