Archive for the ‘Broadband’ Category

How can U.S. broadband lag if it generates 2-3 times the traffic of other nations?

Monday, November 24th, 2014

Is the U.S. broadband market healthy or not? This question is central to the efforts to change the way we regulate the Internet. In a short new paper from the American Enterprise Institute, we look at a simple way to gauge whether the U.S. has in fact fallen behind other nations in coverage, speed, and price . . . and whether consumers enjoy access to content. Here’s a summary:

  • Internet traffic volume is an important indicator of broadband health, as it encapsulates and distills the most important broadband factors, such as access, coverage, speed, price, and content availability.
  • US Internet traffic — a measure of the nation’s “digital output” — is two to three times that of most other advanced nations, and the United States generates more Internet traffic per capita and per Internet user than any major nation except South Korea.
  • The US model of broadband investment and innovation — which operates in an environment that is largely free from government interference — has been a dramatic success.
  • Overturning this successful policy by imposing heavy regulation on the Internet puts one of America’s most vital industries at risk.

Twitch Proves the Net Is Working

Wednesday, October 1st, 2014

Below find our Reply Comments in the Federal Communications Commission’s Open Internet proceeding:

September 15, 2014

Twitch Proves the Net Is Working

On August 25, 2014, Amazon announced its acquisition of Twitch for around $1 billion. Twitch  (twitch.tv) is a young but very large website that streams video games and the gamers who play them. The rise of Twitch demonstrates the Net is working and, we believe, also deals a severe blow to a central theory of the Order and NPRM.

The NPRM repeats the theory of the 2010 Open Internet Order that “providers of broadband Internet access service had multiple incentives to limit Internet openness.” The theory advances a concern that small start-up content providers might be discouraged or blocked from opportunities to grow. Neither the Order nor the current NPRM considers or even acknowledges evidence or arguments to the contrary — that broadband service providers (BSPs) may have substantial incentives to promote Internet openness. Nevertheless, the Commission now helpfully seeks comment “to update the record to reflect marketplace, technical, and other changes since the 2010 Open Internet Order was adopted that may have either exacerbated or mitigated broadband providers’ incentives and ability to limit Internet openness. We seek general comment on the Commission’s approach to analyzing broadband providers’ incentives and ability to engage in practices that would limit the open Internet.”

The continued growth of the Internet, and the general health of the U.S. Web, content, app, device, and Internet services markets — all occurring in the absence of Net Neutrality regulation — more than mitigate the Commission’s theory of BSP incentives. While there is scant evidence for the theory of bad BSP behavior, there is abundant evidence that openness generally benefits all players throughout the Internet value chain. The Commission cannot ignore this evidence.

The rise of Twitch is a perfect example. In three short years, Twitch went from brand new start-up to the fourth largest single source of traffic on the Internet. Google had previously signed a term sheet with Twitch, but so great was the momentum of this young, tiny company, that it could command a more attractive deal from Amazon. At the time of its acquisition by Amazon, Twitch said it had 55 million unique monthly viewers (consumers) and more than one million broadcasters (producers), generating 15 billion minutes of content viewed a month. According to measurements by the network scientist and Deepfield CEO Craig Labovitz, only Netflix, Google’s YouTube, and Apple’s iTunes generate more traffic.

The Commission’s theory said providers of video content, because of the large bandwidth requirements compared to other content types, were especially vulnerable to bad BSP behavior. Twitch is just such an online video player, yet it achieved hyper-growth and spectacular financial success in the absence of Net Neutrality rules. A firm that didn’t exist at the time of the 2010 Order is born and blossoms to become an Internet giant, courted by at least two of the world’s very largest Internet companies — all in the short time that courts, commissions, and companies are haggling over the rules. This is just one of many pieces of evidence demonstrating that start-up firms — specifically start-ups that consume massive amounts of bandwidth — are thriving on the Internet.

Another piece of recent evidence bolsters the case that BSPs have incentives to promote, and in fact maintain, openness. In the second quarter of 2014, cable broadband subscribers for the first time ever outnumbered cable TV subscribers. Broadband is now not just the cable industry’s best product, it is its biggest product. It is popular because consumers can access the diverse bounty of the Web and the Net, and subscribers are voting with their feet.

The health of the Internet economy is a major blow to the theory. In an attempted rebuttal, the Commission might argue that although enforceable rules were not in place, BSPs were operating in an environment in which new rules were a possibility. This possibility, the Commission might assert, encouraged good behavior. Perhaps. Yet new rules to combat or discourage anticompetitive or anti-consumer behavior are always on the table. And many general laws and rules already exist to protect competition and consumers no matter the industry. Perhaps the theory is far less powerful than the NPRM assumes.

The theory of future bad behavior continues to be just that. The Commission is grasping at “might be’s.” But the reality of a healthy Internet economy demonstrates the success of the open Internet every day. The Commission should more heavily weight the mountains of accumulating evidence that BSPs have major incentives to promote openness. Similarly, as evidence piles up against it, the Commission should discount its previous theory of BSP behavior. We may argue over the relative incentives for BSPs to constrain or promote Internet openness. But no legitimate rule making can ignore the substantial incentives in favor of openness.

Given the manifest success of the entire value chain, the Chairman’s proposed case-by-case review process, under Section 706, is far preferable to the intrusive omni-regulatory regime of Title II.

Wireless Is Different

The Commission has so far wisely chosen not to apply its heaviest Net Neutrality rules to wireless networks. But it has asked for comment on the proposal that it do so.

A new paper by Jeffrey H. Reed and Nishith D. Tripathi shows just how complex today’s mobile networks are — and how they require even more intensive network management than wired networks. It adds to the overwhelming testimony of the technical community that “wireless is different” and that wireless networks, businesses, and devices would be especially harmed by intrusive Net Neutrality rules.

The number of wireless devices is moving quickly past 10 billion connections. In several years, the Internet of Everything could grow to 30, 50, or even 100 billion devices, nearly all connected wirelessly. The sheer numbers will only exacerbate the existing complexity of wireless networking. “From millisecond to millisecond,” write Reed and Tripathi,

handsets with differing capabilities, consumers with different usage patterns, applications that utilize different aspects and capabilities of both the handset and the network, and content consumption, including video, must be integrated with the network and managed adroitly to deliver a world-class broadband experience for the customer. Now imagine that millisecond to millisecond process happening while the consumer is in motion, while the handsets vary in capability (think flip-phone to smartphone), while the available network changes from 3G to 4G and from one available spectrum band to another, while traffic moves into and out of a cell sector, and while spectrum capacity is limited. This entire process — the integration of all these different variables — is unique to mobile broadband.

Now imagine adding dozens of new types of devices to the network, generating and consuming many types of data, with varied capacity, latency, and jitter requirements. All interacting on and moving between networks using licensed and unlicensed spectrum. All posing increasingly intense challenges of radio interference and data congestion.

Like the example of Twitch, the mobile Internet is a demonstrable success story. It is, however, even more vulnerable to misguided regulation. The burden of proof is on those who would impose regulation to show that new rules would somehow improve wireless from its existing position of strength, and that new rules, contrary to the overwhelming witness of the technical community, would not harm the mobile arena.

Netflix, Mozilla, and Title II

Two of the most prominent and forceful advocates of new Internet regulation are Netflix, the movie and TV-show streaming firm, and Mozilla, maker of the Firefox web browser. Though differing on a few details, each organization has proposed regulating the Internet as a Title II monopoly telephone service.

We admire both organizations for their innovative contributions to the digital universe. Because they are leading the charge for the government to oversee the Internet as never before, however, it is important to understand — and to refute, where warranted — their positions. Here we select and scrutinize just a few of the technical and economic arguments and assertions from their first-round comments.

Mozilla says: the FCC should “recognize a new type of service” — a so-called “remote delivery service,” defined as the connection between an “edge provider” and a broadband ISP’s subscriber. This downstream link would be regulated as a common carrier under Title II.

Mozilla thinks defining a new remote delivery service can both avoid the fraught re-classification of traditional broadband links and also wall off the rest of the Internet from the very real burdens of Title II. It seems to us not just a bad idea substantively, but too clever for its own good. For starters, in the many-to-many world that Mozilla describes, everyone is an edge provider in some sense. This makes it hard to avoid the conclusion that, despite Mozilla’s best intentions, every network link would get swallowed up by Title II. Even Netflix says, correctly, that the “universe of potential edge providers is extremely heterogeneous.”

Mozilla uses an analogy in which a “doorman in a high-end condominium” holds package deliveries for the condo residents. The broadband ISP is the doorman, in Mozilla’s story, and his only job is to forward the packages to the residents. He may not charge the sender of the package to speed the delivery to Mrs. Smith on the 18th floor, nor can he threaten to slow down the package absent payment. But ISPs are not passive doormen or toll booth operators, and their broadband policy statements all commit not to degrade anyone’s service. They invest $60 billion in the U.S. each year to build networks, data centers, software, and services. The analogy isn’t perfect, but an ISP is in reality more like FedEx. It takes a lot of money to build the infrastructure to transport packages, or bits, and customers pay for the service.

One of the motivations behind Mozilla’s “remote delivery service” definition, it says, is to protect everyone else in the ecosystem from the ravages of Title II. Such an admission is a deep self-indictment. It is difficult to see how the proposal is anything more than a tool to regulate one’s business rivals and/or suppliers — a decidedly non-neutral policy.

Mozilla says: a determination that bans prioritization “would not prevent network operators from seeking new revenue models, or enabling services that require higher standards for delivery. It would instead require these services to be separated from the access service and structured as specialized services. So long as such services do not generate congestion or degrade traffic for the access service, they would fall outside the scope of Title II classification proposed in the Mozilla petition.”

The 2010 Open Internet rules addressed this point and made room for specialized or managed services outside the scope of net neutrality. We suppose this is better than not allowing room for special services that might require higher levels of capacity, or lower latency tolerances, or other premium options. We addressed this carve out idea in Reply Comments in November of 2010:

“The Commission should consider several unintended consequences of moving down the path of explicitly defining, and then exempting, particular ‘specialized’ services while choosing to regulate the so-called ‘basic,’ ‘best-effort,’ or ‘entry level’ ‘open Internet.’

“Regulating the ‘basic’ Internet but not ‘specialized’ services will surely push most of the network and application innovation and investment into the unregulated sphere. A ‘specialized’ exemption, although far preferable to a Net Neutrality world without such an exemption, would tend to incentivize both CAS [content, application, and service] providers and ISP service providers to target the ‘specialized’ category and thus shrink the scope of the ‘open Internet.’

“In fact, although specialized services should and will exist, they often will interact with or be based on the ‘basic’ Internet. Finding demarcation lines will be difficult if not impossible. In a world of vast overlap, convergence, integration, and modularity, attempting to decide what is and is not ‘the Internet’ is probably futile and counterproductive. The very genius of the Internet is its ability to connect to, absorb, accommodate, and spawn new networks, applications and services. In a great compliment to its virtues, the definition of the Internet is constantly changing.

“Moreover, a regime of rigid quarantine would not be good for consumers. If a CAS provider or ISP has to build a new physical or logical network, segregate services and software, or develop new products and marketing for a specifically defined ‘specialized’ service, there would be a very large disincentive to develop and offer simple innovations and new services to customers over the regulated ‘basic’ Internet. Perhaps a consumer does not want to spend the extra money to jump to the next tier of specialized service. Perhaps she only wants the service for a specific event or a brief period of time. Perhaps the CAS provider or ISP can far more economically offer a compelling service over the ‘basic’ Internet with just a small technical tweak, where a leap to a full-blown specialized service would require more time and money, and push the service beyond the reach of the consumer. The transactions costs of imposing a ‘specialized’ quarantine would reduce technical and economic flexibility on both CAS providers and ISPs and, most crucially, on consumers.

“Or, as we wrote in our previous Reply Comments about a related circumstance, ‘A prohibition of the voluntary partnerships that are likely to add so much value to all sides of the market – service provider, content creator, and consumer – would incentivize the service provider to close greater portions of its networks to outside content, acquire more content for internal distribution, create more closely held “managed services” that meet the standards of the government’s “exclusions,” and build a new generation of larger, more exclusive “walled gardens” than would otherwise be the case. The result would be to frustrate the objective of the proceeding. The result would be a less open Internet.’

“It is thus possible that a policy seeking to maintain some pure notion of a basic ‘open Internet’ could severely devalue the open Internet the Commission is seeking to preserve.”

Mozilla says: it urges “the Commission to ban paid prioritization and to apply the same open Internet rules to mobile wireless access services as to fixed services.”

Even technicians who have supported robust net neutrality regulation say applying the rules to wireless would be a mistake. The 2010 Open Internet rules exempted wireless. And for good reason. Wireless is a tricky and constrained environment. Wireless technologies use all sorts of prioritization schemes to ration capacity on what are shared networks. Mozilla says it would allow for reasonable network management techniques. But a host of other technical and commercial arrangements could be put in jeopardy. For example, what about “sponsored data” plans where content firms like ESPN could subsidize a user? In January, AT&T announced a sponsored data template, and in the past month T-Mobile has partnered with several digital music providers. The Mozilla and Netflix proposals could ban such partnerships that provide value to all three parties — consumer, network, and content provider.

Mozilla says: “To contend that edge providers offer nothing of value to access service providers would go against the Commission’s core broadband tenets as well as common sense.”

No one contends this.

Mozilla says: failure to enact its favored policy could produce “an outcry from public interest organizations and technology companies citing promises that were broken.”

This is an odd justification for a push to regulate a healthy industry.

Netflix says: “There can be no doubt that Verizon owns and controls the interconnections that mediate how fast Netflix servers respond to a Verizon Internet access customer’s request.”

This is false. As Netflix correctly notes just paragraphs before, “It is called the Inter-net for a reason. That is, the Inter-net comprises interconnections between many autonomous networks.” An inter-connection between two networks means precisely that the two “autonomous” networks have agreed to terms to connect. By its nature, no single entity “owns and controls the interconnections.” It is a partnership. The journey of an Internet data packet, or stream of many packets, moreover, usually takes place over multiple networks, thus traversing several interconnections. In fact, factors outside the ownership and control of last mile ISPs are often most crucial to the quality and speed of Netflix streams (see “Netflix and the Net Neutrality Promotional Vehicle”).

Netflix says: “ISPs, not online content providers, set the universe of available pathways into their networks.”

This is only partially true. Yes, ISPs determine with whom they interconnect. But the existence of other successful networks sets the universe of possible pathways, and the economics and culture of the Net mean broadband ISPs want their customers to reach as much content as possible, so ISPs in general want to connect to lots of other networks. Regardless, Netflix has often chosen to use congested pathways into the broadband ISPs, even though a large number of other well known, capacious pathways (CDNs, transit providers) were also available. In most of the cases when Netflix’s service seemed slow, it was these poor network architecture choices that caused deterioration in “how fast Netflix servers respond[ed]” to an “Internet access customer’s request.”

Netflix says: “There is still one and only one way to reach Comcast’s subscribers: through Comcast.”

Netflix similarly has a monopoly in the market of Netflix customers.

Netflix says: “Prioritization has value only in a congested network.” The ability to prioritize “creates a perverse incentive for ISPs to forego network upgrades in order to give prioritization value.” And in a similar vein, “Prioritization is inherently a zero-sum practice.”

First, it must be said that paid priority is getting far too much attention. It’s not really the key question. We may use prioritization techniques for some applications in the future — HD video conferencing, gaming, remote medical procedures — but most broadband ISPs do not today prioritize much, if any, traffic on their last mile access links. It’s just not the central point of contention so many have made it to be.

Second, priority is a commonplace concept. It’s true, in a world of unlimited supply, priority doesn’t matter. In the real world, it does. We prioritize in every business setting, and in everyday life. We certainly prioritize on the Internet. Voice over IP packets get tagged. Websites and online video providers use content delivery networks (CDNs) for faster delivery. Financial firms build direct fiber links to speed stock market trades. The examples are endless: FedEx’s next morning delivery versus three-day ground. First class versus coach. Airplane versus automobile. Now versus later. It’s crucial that we’re allowed to pay more — and that we’re allowed to pay less when we don’t want or need immediacy.
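To make the point concrete, consider the first example above: voice packets get tagged. The sketch below shows a sending host marking its outbound voice traffic with a DSCP value so that routers along the way can, if their operators choose, queue it ahead of bulk transfers. It is only a minimal illustration, assuming a Linux-style socket API; the destination address and port are placeholders.

```python
# Minimal sketch of DSCP tagging, one everyday prioritization mechanism.
# The destination address and port below are placeholders.
import socket

EF = 46                  # "Expedited Forwarding" DSCP class, commonly used for voice
TOS_VALUE = EF << 2      # DSCP occupies the upper six bits of the IP TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

# Routers along the path *may* use this marking to queue voice ahead of bulk
# traffic; whether they honor it is a matter of network policy, not a guarantee.
sock.sendto(b"rtp-payload-placeholder", ("198.51.100.10", 5004))
```

Whether any given network honors the marking is an engineering and business decision, which is exactly the point: priority is a routine tool, not an aberration.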

Third, the argument is a bit circular. And it’s not supported by good economics. The theory is that ISPs will offer an increasingly dilapidated product to consumers so that they can charge content providers for fast lanes. But consumers do have other choices, and dilapidated products aren’t popular. We have multiple wireline choices, and multiple wireless choices that are increasingly robust substitutes. Are broadband service providers really eager to anger their huge customer base in order to make a few extra bucks from a relatively small number of content providers? The math doesn’t look good.

The FCC NPRM, however, asserted, without empirical or theoretical foundation, that ISPs have an incentive to underinvest, congest the network, and degrade service. The FCC did not contemplate, let alone give ample weight to, counter arguments and facts showing incentives working in just the opposite, and much happier, direction.

If we make broadband a highly regulated industry, however, we can expect less market entry, less competition, less investment, less new capacity. (See the experience of Europe today.) A world of artificial scarcity will prompt more stingy prioritization schemes (rationing) than a world of investment and innovation, though some forms of priority will exist in any world this side of heaven.

Priority, price discrimination, product differentiation — these things actually allow us to match consumers with their needs and to create an economically rational system that can support growth.

Contrary to blanket assertions, there are many small start-ups who might value various forms of paid priority, sponsored data, or premium services. Perhaps these tools will help them launch into markets faster than they otherwise would. They may not have the large in-house data centers and CDN networks of a Google or Netflix, so perhaps they utilize third party CDN services or establish partnerships or buy super-fast connections.

Lastly, priority is not zero-sum. To the extent consumers and businesses are allowed to pay for priority (and save money when we don’t need it), the value of the entire system increases and allows further investment. Don’t force grandma who checks her email once a day to subsidize the affluent round-the-clock video gamer.

Digital Dynamism

Wednesday, November 13th, 2013

See our new 20-page report – Digital Dynamism: Competition in the Internet Ecosystem:

The Internet is altering the communications landscape even faster than most imagined.

Data, apps, and content are delivered by a growing and diverse set of firms and platforms, interconnected in ever more complex ways. The new network, content, and service providers increasingly build their varied businesses on a common foundation — the universal Internet Protocol (IP). We thus witness an interesting phenomenon — the divergence of providers, platforms, services, content, and apps, and the convergence on IP.

The dynamism of the Internet ecosystem is its chief virtue. Infrastructure, services, and content are produced by an ever wider array of firms and platforms in overlapping and constantly shifting markets.

The simple, integrated telephone network, segregated entertainment networks, and early tiered Internet still exist, but have now been eclipsed by a far larger, more powerful phenomenon. A new, horizontal, hyperconnected ecosystem has emerged. It is characterized by large investments, rapid innovation, and extreme product differentiation.

  • Consumers now enjoy at least five distinct, competing modes of broadband connectivity — cable modem, DSL, fiber optic, wireless broadband, and satellite — from at least five types of firms. Widespread wireless Wi-Fi nodes then extend these broadband connections.
  • Firms like Google, Microsoft, Amazon, Apple, Facebook, and Netflix are now major Internet infrastructure providers in the form of massive data centers, fiber networks, content delivery systems, cloud computing clusters, ecommerce and entertainment hubs, network protocols and software, and, in Google’s case, fiber optic access networks. Some also build network devices and operating systems. Each competes to be the hub — or at least a hub — of the consumer’s digital life. So large are these new players that up to 80 percent of network traffic now bypasses the traditional public Internet backbone.
  • Billions of diverse consumer and enterprise devices plug into these networks, from PCs and laptops to smartphones and tablets, from game consoles and flat panel displays to automobiles, web cams, medical devices, and untold sensors and industrial machines.

The communications playing field is continually shifting. Cable disrupted telecom through broadband cable modem services. Mobile is a massively successful business, yet it is cannibalizing wireline services, with further disruptions from Skype and other IP communications apps. Mobile service providers used to control the handset market, but today handsets are mobile computers that wield their own substantial power with consumers. While the old networks typically delivered a single service — voice, video, or data — today’s broadband networks deliver multiple services, with the “Cloud” offering endless possibilities.

Also view the accompanying graphic, showing the progression of network innovation over time: Hyperconnected: The New Network Map.

U.S. Share of Internet Traffic Grows

Thursday, October 10th, 2013

Over the last half decade, during a protracted economic slump, we’ve documented the persistent successes of Digital America — for example the rise of the App Economy. Measuring the health of our tech sectors is important, in part because policy agendas are often based on assertions of market failure (or regulatory failure) and often include comparisons with other nations. Several years ago we developed a simple new metric that we thought better reflected the health of broadband in international comparisons. Instead of measuring broadband using “penetration rates,” or the number of  connections per capita, we thought a much better indicator was actual Internet usage. So we started looking at Internet traffic per capita and per Internet user (see here, here, here, and, for more context, here).

We’ve updated the numbers here, using Cisco’s Visual Networking Index for traffic estimates and Internet user figures from the International Telecommunication Union. And the numbers suggest the U.S. digital economy, and its broadband networks, are healthy and extending their lead internationally. (Patrick Brogan of USTelecom has also done excellent work on this front; see his new update.)
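The metric itself is simple arithmetic: divide a nation's total monthly IP traffic by its population and, separately, by its number of Internet users. Here is a minimal sketch of the calculation; the country names and figures are placeholders for illustration, not the actual Cisco or ITU numbers.

```python
# Sketch of the traffic-per-capita and traffic-per-user metric.
# Inputs are illustrative placeholders, not Cisco VNI or ITU figures.

def traffic_per_person(monthly_traffic_pb, population_m, users_m):
    """Return (GB per capita per month, GB per Internet user per month)."""
    traffic_gb = monthly_traffic_pb * 1_000_000            # petabytes -> gigabytes
    per_capita = traffic_gb / (population_m * 1_000_000)
    per_user = traffic_gb / (users_m * 1_000_000)
    return per_capita, per_user

# Hypothetical entries: (monthly IP traffic in PB, population in millions, users in millions)
countries = {
    "Country A": (18_000, 315, 245),
    "Country B": (5_000, 80, 67),
}

for name, (traffic_pb, pop_m, users_m) in countries.items():
    per_cap, per_user = traffic_per_person(traffic_pb, pop_m, users_m)
    print(f"{name}: {per_cap:.1f} GB/capita/month, {per_user:.1f} GB/user/month")
```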

If we look at regional comparisons of traffic per person, we see North America generates and consumes nearly seven times the world average and around two and a half times that of Western Europe.

Looking at individual nations, and switching to the metric of traffic per user, we find that the U.S. is actually pulling away from the rest of the world. In our previous reports, the U.S. trailed only South Korea, was essentially tied with Canada, and generated around 60-70% more traffic than Western European nations. Now, the U.S. has separated itself from Canada and is generating two to three times the traffic per user of Western Europe and Japan.

Perhaps the most remarkable fact, as Brogan notes, is that the U.S. has nearly caught up with South Korea, which, for the last decade, was a real outlier — far and away the worldwide leader in Internet infrastructure and usage.

Traffic is difficult to measure and its nature and composition can change quickly. There are a number of factors we’ll talk more about later, such as how much of this traffic originates in the U.S. but is destined for foreign lands. Yet these are some of the best numbers we have, and the general magnitudes reinforce the idea that the U.S. digital economy, under a relatively light-touch regulatory model, is performing well.

Discussing Broadband and Economic Growth at AEI

Sunday, September 22nd, 2013

On Tuesday this week, the American Enterprise Institute launched an exciting new project — the Center for Internet, Communications, and Technology. I was happy to participate in the inaugural event, which included talks by CEA chairman Jason Furman and Rep. Greg Walden (R-OR). We discussed broadband’s potential to boost economic productivity and focused on the importance and key questions of wireless spectrum policy. See the video below:

A Decade Later, Net Neutrality Goes To Court

Monday, September 9th, 2013

Today the U.S. Court of Appeals for the D.C. Circuit hears Verizon’s challenge to the Federal Communications Commission’s “Open Internet Order” — better known as “net neutrality.”

Hard to believe, but we’ve been arguing over net neutrality for a decade. I just pulled up some testimony George Gilder and I prepared for a Senate Commerce Committee hearing in April 2004. In it, we asserted that a newish “horizontal layers” regulatory proposal, then circulating among comm-law policy wonks, would become the next big tech policy battlefield. Horizontal layers became net neutrality, the Bush FCC adopted the non-binding Four Principles of an open Internet in 2005, the Obama FCC pushed through actual regulations in 2010, and now today’s court challenge, which argues that the FCC has no authority to regulate the Internet and that, in fact, Congress told the FCC not to regulate the Internet.

Over the years we’ve followed the debate, and often weighed in. Here’s a sampling of our articles, reports, reply comments, and even some doggerel:

— Bret Swanson

U.S. Mobile: Effectively competitive? Probably. Positively healthy? Absolutely.

Tuesday, March 26th, 2013

Each year the Federal Communications Commission is required to report on competition in the mobile phone market. Following Congress’s mandate to determine the level of industry competition, the FCC, for many years, labeled the industry “effectively competitive.” Then, starting a few years ago, the FCC declined to make such a determination. Yes, there had been some consolidation, it was acknowledged, yet the industry was healthier than ever — more subscribers, more devices, more services, lots of innovation. The failure to achieve the “effectively competitive” label was thus a point of contention.

This year’s “CMRS” — commercial mobile radio services — report again fails to make a designation, one way or the other. Yet whatever the report lacks in official labels, it more than makes up in impressive data.

For example, it shows that as of October 2012, 97.2% of Americans have access to three or more mobile providers, and 92.8% have access to four or more. As for mobile broadband data services, 97.8% have access to two or more providers, and 91.6% have access to three or more.

Rural America is also doing well. The FCC finds 87% of rural consumers have access to three or more mobile voice providers, and 69.1% have access to four or more. For mobile broadband, 89.9% have access to two or more providers, while 65.4% enjoy access to three or more.

Call this what you will — to most laypeople, these choices count as robust competition. Yet the FCC has a point when it

refrain[s] from providing any single conclusion because such an assessment would be incomplete and possibly misleading in light of the variations and complexities we observe.

The industry has grown so large, with so many interconnected and dynamic players, it may have outgrown Congress’s request for a specific label.

14. Given the Report’s expansive view of mobile wireless services and its examination of competition across the entire mobile wireless ecosystem, we find that the mobile wireless ecosystem is sufficiently complex and multi-faceted that it would not be meaningful to try to make a single, all-inclusive finding regarding effective competition that adequately encompasses the level of competition in the various interrelated segments, types of services, and vast geographic areas of the mobile wireless industry.

Or as economist George Ford of the Phoenix Center put it,

The statute wants a competitive analysis, but as the Commission correctly points out, competition is not the goal, it [is] the means. Better performance is the goal. When the evidence presented in the Sixteenth Report is viewed in this way, the conclusion to be reached about the mobile industry, at least to me, is obvious: the U.S. mobile wireless industry is performing exceptionally well for consumers, regardless of whether or not it satisfies someone’s arbitrarily-defined standard of “effective competition.”

I’m in good company. Outgoing FCC Chairman Julius Genachowski lists among his proudest achievements that “the U.S. is now the envy of the world in advanced wireless networks, devices, applications, among other areas.”

The report shows that in the last decade, U.S. mobile connections have nearly tripled. The U.S. now has more mobile connections than people.

The report also shows per user data consumption more than doubling year to year.

More important, the proliferation of smartphones, which are powerful mobile computers, is the foundation for a new American software industry widely known as the App Economy. We detailed the short but amazing history of the app and its impact on the economy in our report “Soft Power: Zero to 60 Billion in Four Years.” Likewise, these devices and software applications are changing industries that need changing. Last week, experts testified before Congress about mobile health, or mHealth, and we wrote about the coming health care productivity revolution in “The App-ification of Medicine.”

One factor that still threatens to limit mobile growth is the availability of spectrum. The report details past spectrum allocations that have borne fruit, but the pipeline of future spectrum allocations is uncertain. A more robust commitment to spectrum availability and a free-flowing spectrum market would ensure continued investment in networks, content, and services.

What Congress once called the mobile “phone” industry is now a sprawling global ecosystem and a central driver of economic advance. By most measures, the industry is effectively competitive. By any measure, it’s positively healthy.

— Bret Swanson

The Broadband Rooster

Tuesday, March 12th, 2013

FCC chairman Julius Genachowski opens a new op-ed with a bang:

As Washington continues to wrangle over raising revenue and cutting spending, let’s not forget a crucial third element for reining in the deficit: economic growth. To sustain long-term economic health, America needs growth engines, areas of the economy that hold real promise of major expansion. Few sectors have more job-creating innovation potential than broadband, particularly mobile broadband.

Private-sector innovation in mobile broadband has been extraordinary. But maintaining the creative momentum in wireless networks, devices and apps will need an equally innovative wireless policy, or jobs and growth will be left on the table.

Economic growth is indeed the crucial missing link to employment, opportunity, and healthier government budgets. Technology is the key driver of long term growth, and even during the downturn the broadband economy has delivered. Michael Mandel estimates the “app economy,” for example, has created more than 500,000 jobs in less than five short years of existence.

We emphatically do need policies that will facilitate the next wave of digital innovation and growth. Chairman Genachowski’s top line assessment — that U.S. broadband is a success — is important. It rebuts the many false but persistent claims that U.S. broadband lags the world. Chairman Genachowski’s diagnosis of how we got here and his prescriptions for the future, however, are off the mark.

For example, he suggests U.S. mobile innovation is newer than it really is.

Over the past few years, after trailing Europe and Asia in mobile infrastructure and innovation, the U.S. has regained global leadership in mobile technology.

This American mobile resurgence did not take place in just the last “few years.” It began a little more than a decade ago with smart decisions to:

(1) allow reasonable industry consolidation and relatively free spectrum allocation, after years of forced “competition,” which mandated network duplication and thus underinvestment in coverage and speed (we did in fact trail Europe in some important mobile metrics in the late 1990s and briefly into the 2000s);

(2) refrain from any but the most basic regulation of broadband in general and the mobile market in particular, encouraging experimental innovation; and

(3) finally implement the digital TV / 700 MHz transition in 2007, which put more of the best spectrum into the market.

These policies, among others, encouraged some $165 billion in mobile capital investment between 2001 and 2008 and launched a wave of mobile innovation. Development on the iPhone began in 2004, the iPhone itself arrived in 2007, and the app store in 2008. Google’s Android mobile OS came along in 2009, the year Mr. Genachowski arrived at the FCC. By this time, the American mobile juggernaut had already been in full flight for years, and the foundation was set — the U.S. topped the world in 3G mobile networks and device and software innovation. Wi-Fi, meanwhile surged from 2003 onward, creating an organic network of tens of millions of wireless nodes in homes, offices, and public spaces. Mr. Genachowski gets some points for not impeding the market as aggressively as some other more zealous regulators might have. But taking credit for America’s mobile miracle smacks of the rooster proudly puffing his chest at sunrise.

More important than who gets the credit, however, is determining what policies led to the current success . . . and which are likely to spur future growth. Chairman Genachowski is right to herald the incentive auctions that could unleash hundreds of megahertz of un- and under-used spectrum from the old TV broadcasters. Yet wrangling over the rules of the auctions could stretch on, delaying the process. Worse, the rules themselves could restrict who can bid on or buy new spectrum, effectively allowing the FCC to favor certain firms, technologies, or friends at the expense of the best spectrum allocation. We’ve seen before that centrally planned spectrum allocations don’t work. The fact that the FCC is contemplating such an approach is worrisome. It runs counter to the policies that led to today’s mobile success.

The FCC also has a bad habit of changing the metrics and the rules in the middle of the game. For example, the FCC has been caught changing its “spectrum screen” to fit its needs. The screen attempts to show how much spectrum mobile operators hold in particular markets. During M&A reviews, however, the FCC has changed its screen procedures to make the data fit its opinion.

In a more recent example, Fred Campbell shows that the FCC alters its count of total available commercial spectrum to fit the argument it wants to make from day to day. We’ve shown that the U.S. trails other nations in the sum of currently available spectrum plus spectrum in the pipeline. Below, see a chart from last year showing how the U.S. compares favorably in existing commercially available spectrum but trails severely in pipeline spectrum. Translation: the U.S. did a pretty good job unleashing spectrum in the 1990s through the mid-2000s. But, contrary to Chairman Genachowski’s implication, it has stalled in the last few years.

When the FCC wants to argue that particular companies shouldn’t be allowed to acquire more spectrum (whether through merger or secondary markets), it adopts this view that the U.S. trails in spectrum allocation. Yet when challenged on the more general point that the U.S. lags other nations, the FCC turns around and includes an extra 139 MHz of spectrum in the 2.5 GHz range to avoid the charge it’s fallen behind the curve.

Next, Chairman Genachowski heralds a new spectrum “sharing” policy where private companies would be allowed to access tiny portions of government-owned airwaves. This really is weak tea. The government, depending on how you measure, controls between 60% and 85% of the best spectrum for wireless broadband. It uses very little of it. Yet it refuses to part with meaningful portions, even though it would still be left with more than enough for its important uses — military and otherwise. If they can make it work (I’m skeptical), sharing may offer a marginal benefit. But it does not remotely fit the scale of the challenge.

Along the way, the FCC has been whittling away at mobile’s incentives for investment and its environment of experimentation. Chairman Genachowski, for example, imposed price controls on “data roaming,” even though it’s highly questionable he had the legal authority to do so. The Commission has also, with varied degrees of “success,” been attempting to impose its extralegal net neutrality framework to wireless. And of course the FCC has blocked, altered, and/or discouraged a number of important wireless mergers and secondary spectrum transactions.

Chairman Genachowski’s big picture is a pretty one: broadband innovation is key to economic growth. Look at the brush strokes, however, and there are reasons to believe sloppy and overanxious regulators are threatening to diminish America’s mobile masterpiece.

— Bret Swanson

Broadband Bullfeathers

Friday, December 14th, 2012

Several years ago, some American lawyers and policymakers were looking for ways to boost government control of the Internet. So they launched a campaign to portray U.S. broadband as a pathetic patchwork of tin-cans-and-strings from the 1950s. The implication was that broadband could use a good bit of government “help.”

They initially had some success with a gullible press. The favorite tools were several reports that measured, nation by nation, the number of broadband connections per 100 inhabitants. The U.S. emerged from these reports looking very mediocre. How many times did we read, “The U.S. is 16th in the world in broadband”? Upon inspection, however, the reports weren’t very useful. Among other problems, they were better at measuring household size than broadband health. America, with its larger households, would naturally have fewer residential broadband subscriptions (not broadband users) than nations with smaller households (and thus more households per capita). And as the Phoenix Center demonstrated, rather hilariously, if the U.S. and other nations achieved 100% residential broadband penetration, America would actually fall to 20th from 15th.
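The household-size effect is easy to verify with a little arithmetic. The sketch below uses round, hypothetical household sizes rather than official statistics; it simply shows that, even at identical household penetration, a country with larger households will report fewer residential subscriptions per 100 inhabitants.

```python
# Why subscriptions-per-100-inhabitants tracks household size.
# Household sizes are round placeholders, not official statistics.

def subscriptions_per_100(household_penetration, avg_household_size):
    """Residential subscriptions per 100 inhabitants,
    assuming one subscription per subscribing household."""
    households_per_100_people = 100 / avg_household_size
    return household_penetration * households_per_100_people

# Identical 100% household penetration, different household sizes:
print(subscriptions_per_100(1.0, 2.6))   # larger households  -> ~38.5 per 100
print(subscriptions_per_100(1.0, 2.2))   # smaller households -> ~45.5 per 100
```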

In the fall of 2009, a voluminous report from Harvard’s Berkman Center tried to stitch the supposedly ominous global evidence into a case-closed indictment of U.S. broadband. The Berkman report, however, was a complete bust (see, for example, these thorough critiques: 1, 2, and 3 as well as my brief summary analysis).

Berkman’s statistical analyses had failed on their own terms. Yet it was still important to think about the broadband economy in a larger context. We asked the question, how could U.S. broadband be so backward if so much of the world’s innovation in broadband content, services, and devices was happening here?

To name just a few: cloud computing, YouTube, Twitter, Facebook, Netflix, iPhone, Android, ebooks, app stores, iPad. We also showed that the U.S. generates around 60% more network traffic per capita and per Internet user than Western Europe, the supposed world broadband leader. The examples multiply by the day. As FCC chairman Julius Genachowski likes to remind us, the U.S. now has more 4G LTE wireless subscribers than the rest of the world combined.

Yet here comes a new book with the same general thrust — that the structure of the U.S. communications market is delivering poor information services to American consumers. In several new commentaries summarizing the forthcoming book’s arguments, author Susan Crawford’s key assertion is that U.S. broadband is slow. It’s so bad, she thinks broadband should be a government utility. But is U.S. broadband slow?

According to actual network throughput measured by Akamai, the world’s largest content delivery network, the U.S. ranks in the top ten or 15 across a range of bandwidth metrics. It is ninth in average connection speed, for instance, and 13th in average peak speed. Looking at proportions of populations who enjoy speeds above a certain threshold, Akamai finds the U.S. is seventh in the percentage of connections exceeding 10 megabits per second (Mbps) and 13th in the percentage exceeding 4 Mbps. (See the State of the Internet report, 2Q 2012.)

You may not be impressed with rankings of seventh or 13th. But did you look at the top nations on the list? Hong Kong, South Korea, Latvia, Switzerland, the Netherlands, Japan, etc.

Each one of them is a relatively small, densely populated country. The national rankings are largely artifacts of geography and the size of the jurisdictions observed. Small nations with high population densities fare well. It is far more economical to build high-speed communications links in cities and other relatively dense populations. Accounting for this size factor, the U.S. actually looks amazingly good. Only Canada comes close to the U.S. among geographically larger nations.

But let’s look even further into the data. Akamai also supplies speeds for individual U.S. states. If we merge the tables of nations and states, the U.S. begins to look not like a broadband backwater or even a middling performer but an overwhelming success. Here are the two sets of Akamai data combined into tables that directly compare the successful small nations with their more natural counterparts, the U.S. states (shaded in blue).

Average Broadband Connection Speed — Nine of the top 15 entities are U.S. states.

Average Peak Connection Speed — Ten of the top 15 entities are U.S. states.

Percent of Connections Over 10 Megabits per Second — Ten of the top 15 entities are U.S. states.

Percent of Connections Over 4 Megabits per Second — Ten of the top 16 entities are U.S. states.

Among the 61 ranked entities on these four measures of broadband speed, 39, or almost two-thirds, are U.S. states. American broadband is not “pitifully slow.” In fact, if we were to summarize U.S. broadband, we’d have to say, compared to the rest of the world, it is very fast.
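Anyone with the Akamai tables in hand can reproduce the exercise. The sketch below uses placeholder speed values rather than Akamai's actual figures; the point is the method, which is to pool nations and states into a single ranking and count how many of the top entries are states.

```python
# Sketch of the merge-and-rank exercise. Speeds are placeholder values,
# not the actual figures from Akamai's State of the Internet report.

nations = {"Hong Kong": 10.8, "South Korea": 10.3, "Latvia": 9.1, "Japan": 8.7}
us_states = {"Delaware": 10.9, "New Hampshire": 9.8, "Vermont": 9.5, "Utah": 9.2}

# Merge both sets into one ranking, tagging each entry with its type.
merged = [(speed, name, "state") for name, speed in us_states.items()]
merged += [(speed, name, "nation") for name, speed in nations.items()]
merged.sort(reverse=True)

top = merged[:15]
states_in_top = sum(1 for _, _, kind in top if kind == "state")
print(f"{states_in_top} of the top {len(top)} entities are U.S. states")
for rank, (speed, name, kind) in enumerate(top, 1):
    print(f"{rank:2d}. {name:15s} {speed:4.1f} Mbps ({kind})")
```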

It is true that not every state or region in the U.S. enjoys top speeds. Yes, we need more, better, faster, wider coverage of wired and wireless broadband. In underserved neighborhoods as well as our already advanced areas. We need constant improvement both to accommodate today’s content and services and to drive tomorrow’s innovations. We should not, however, be making broad policy under the illusion that U.S. broadband, taken as a whole, is deficient. The quickest way to make U.S. broadband deficient is probably to enact policies that discourage investment and innovation — such as trying to turn a pretty successful and healthy industry that invests $60 billion a year into a government utility.

— Bret Swanson

The $66-billion Internet Expansion

Thursday, November 8th, 2012

Sixty-six billion dollars over the next three years. That’s AT&T’s new infrastructure plan, announced yesterday. It’s a bold commitment to extend fiber optics and 4G wireless to most of the country and thus dramatically expand the key platform for growth in the modern U.S. economy.

The company specifically will boost its capital investments by an additional $14 billion over previous estimates. This should enable coverage of 300 million Americans (around 97% of the population) with LTE wireless and 75% of AT&T’s residential service area with fast IP broadband. It’s adding 10,000 new cell towers, a thousand distributed antenna systems, and 40,000 “small cells” that augment and extend the wireless network to, for example, heavily trafficked public spaces. Also planned are fiber optic connections to an additional 1 million businesses.

As the company expands its fiber optic and wireless networks — to drive and accommodate the type of growth seen in the chart above — it will be retiring parts of its hundred-year-old copper telephone network. To do this, it will need cooperation from federal and state regulators. This is the end of the phone network, the transition to all Internet, all the time, everywhere.

FCC’s 706 Broadband Report Does Not Compute

Wednesday, August 22nd, 2012

Yesterday the Federal Communications Commission issued 181 pages of metrics demonstrating, to any fair reader, the continuing rapid rise of the U.S. broadband economy — and then concluded, naturally, that “broadband is not yet being deployed to all Americans in a reasonable and timely fashion.” A computer, being fed the data and the conclusion, would, unable to process the logical contradictions, crash.

The report is a response to section 706(b) of the 1996 Telecom Act that asks the FCC to report annually whether broadband “is being deployed . . . in a reasonable and timely fashion.” From 1999 to 2008, the FCC concluded that yes, it was. But now, as more Americans than ever have broadband and use it to an often maniacal extent, the FCC has concluded for the third year in a row that no, broadband deployment is not “reasonable and timely.”

The FCC finds that 19 million Americans, mostly in very rural areas, don’t have access to fixed line terrestrial broadband. But Congress specifically asked the FCC to analyze broadband deployment using “any technology.”

“Any technology” includes DSL, cable modems, fiber-to-the-x, satellite, and of course fixed wireless and mobile. If we include wireless broadband, the unserved number falls to 5.5 million from the FCC’s headline 19 million. Five and a half million is 1.74% of the U.S. population. Not exactly a headline-grabbing figure.

Even if we stipulate the FCC’s framework, data, and analysis, we’re still left with the FCC’s own admission that between June 2010 and June 2011, an additional 7.4 million Americans gained access to fixed broadband service. That dropped the portion of Americans without access to 6% in 2011 from around 8.55% in 2010 — a 30% drop in the unserved population in one year. Most Americans have had broadband for many years, and the rate of deployment will necessarily slow toward the tail-end of any build-out. When most American households are served, there just aren’t very many left to go, and those that have yet to gain access are likely to be in the most difficult-to-serve areas (e.g. “on tops of mountains in the middle of nowhere”). The fact that we still extended broadband access to 7.4 million more Americans in the last year, lowering the unserved population by 30%, even using the FCC’s faulty framework, demonstrates in any rational world that broadband “is being deployed” in a “reasonable and timely fashion.”
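The arithmetic behind those percentages is easy to check. The sketch below uses our own rough population estimate of about 311 million for 2011, which is an assumption on our part rather than a figure taken from the FCC report.

```python
# Quick arithmetic check on the deployment figures cited above.
# The ~311 million population figure is our rough 2011 estimate, not the FCC's.

population = 311_000_000
unserved_2010 = 0.0855 * population      # ~8.55% lacked fixed broadband access
gained_access = 7_400_000                # gained access, June 2010 to June 2011

unserved_2011 = unserved_2010 - gained_access
share_2011 = unserved_2011 / population           # roughly 6% of the population
relative_drop = gained_access / unserved_2010     # a drop of just under 30% in one year

print(f"Unserved share in 2011: {share_2011:.1%}")
print(f"One-year drop in the unserved population: {relative_drop:.0%}")
```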

But this is not the rational world — it’s D.C. in the perpetual political silly season.

One might conclude that because the vast majority of these unserved Americans live in very rural areas — Alaska, Montana, West Virginia — the FCC would, if anything, suggest policies tailored to boost infrastructure investment in these hard-to-reach geographies. We could debate whether these are sound investments and whether the government would do a good job expanding access, but if rural deployment is a problem, then presumably policy should attempt to target and remediate the rural underserved. Commissioner McDowell, however, knows the real impetus for the FCC’s tortured no-confidence vote — its regulatory agenda.

McDowell notes that the report repeatedly mentions the FCC’s net neutrality rules (now being contested in court), which are as far from a pro-broadband policy, let alone a targeted one, as you could imagine. If anything, net neutrality is an impediment to broader, faster, better broadband. But the FCC is using its thumbs-down on broadband deployment to prop up its intrusions into a healthy industry. As McDowell concluded, “the majority has used this process as an opportunity to create a pretext to justify more regulation.”

Misunderstanding the Mobile Ecosystem

Thursday, August 9th, 2012

Mobile communications and computing are among the most innovative and competitive markets in the world. They have created a new world of software and offer dramatic opportunities to improve productivity and creativity across the industrial spectrum.

Last week we published a tech note documenting the rapid growth of mobile and the importance of expanding wireless spectrum availability. More clean spectrum is necessary both to accommodate fast-rising demand and drive future innovations. Expanding spectrum availability might seem uncontroversial. In the report, however, we noted that one obstacle to expanding spectrum availability has been a cramped notion of what constitutes competition in the Internet era. As we wrote:

Opponents of open spectrum auctions and flexible secondary markets often ignore falling prices, expanding choices, and new features available to consumers. Instead they sometimes seek to limit new spectrum availability, or micromanage its allocation or deployment characteristics, charging that a few companies are set to dominate the market. Although the FCC found that 77% of the U.S. population has access to three or more 3G wireless providers, charges of a coming “duopoly” are now common.

This view, however, relies on the old analysis of static utility or commodity markets and ignores the new realities of broadband communications. The new landscape is one of overlapping competitors with overlapping products and services, multi-sided markets, network effects, rapid innovation, falling prices, and unpredictability.

Sure enough, yesterday Sprint CEO Dan Hesse made the duopoly charge and helped show why getting spectrum policy right has been so difficult.

Q: You were a vocal opponent of the AT&T/T-Mobile merger. Are you satisfied you can compete now that the merger did not go through?

A: We’re certainly working very hard. There’s no question that the industry does have an issue with the size of the duopoly of AT&T and Verizon. I believe that over time we’ll see more consolidation in the industry outside of the big two, because the gap in size between two and three is so enormous. Consolidation is healthy for the industry as long as it’s not AT&T and Verizon getting larger.

Hesse goes even further.

Hesse also seemed to be likening Sprint’s struggles in competing with AT&T-Rex and Big Red as a fight against good and evil. Sprint wants to wear the white hat, according to Hesse. “At Sprint, we describe it internally as being the good guys, of doing the right thing,” he said.

This type of thinking is always a danger if you’re trying to make sound policy. Picking winners and losers is inevitably — at best — an arbitrary exercise. Doing so based on some notion of corporate morality is plain silly, but even more reasonable sounding metrics and arguments — like those based on market share — are often just as misleading and harmful.

The mobile Internet ecosystem is growing so fast and changing with such rapidity and unpredictability that making policy based on static and narrow market definitions will likely yield poor policy. As we noted in our report:

It is, for example, worth emphasizing: Google and Apple were not in this business just a few short years ago.

Yet by the fourth quarter of 2011 Apple could boast an amazing 75% of the handset market’s profits. Apple’s iPhone business, it was widely noted after Apple’s historic 2011, is larger than all of Microsoft. In fact, Apple’s non-iPhone products are also larger than Microsoft.

Android, the mobile operating system of Google, has been growing even faster than Apple’s iOS. In December 2011, Google was activating 700,000 Android devices a day, and now, in the summer of 2012, it estimates 900,000 activations per day. From a nearly zero share at the beginning of 2009, Android today boasts roughly a 55% share of the global smartphone OS market.

. . .

Apple’s iPhone changed the structure of the industry in several ways, not least the relationships between mobile service providers and handset makers. Mobile operators used to tell handset makers what to make, how to make it, and what software and firmware could be loaded on it. They would then slap their own brand label on someone else’s phone.

Apple’s quick rise to mobile dominance has been matched by Blackberry maker Research In Motion’s fall. RIM dominated the 2000s with its email software, its QWERTY keyboard, and its popularity with enterprise IT departments. But it couldn’t match Apple’s or Android’s general purpose computing platforms, with user-friendly operating systems, large, bright touch-screens, and creative and diverse app communities.

Sprinkled among these developments were the rise, fall, and resurgence of Motorola, and then its sale to Google; the rise and fall of Palm; the rise of HTC; and the decline of once dominant Nokia.

Apple, Google, Amazon, Microsoft, and others are building cloud ecosystems, sometimes complemented with consumer devices, often tied to Web apps and services, multimedia content, and retail stores. Many of these products and services compete with each other, but they also compete with broadband service providers. Some of these business models rely primarily on hardware, some software, some subscriptions, some advertising. Each of the companies listed above — a computer company, a search company, an ecommerce company, and a software company — is now a major Internet infrastructure company.

As Jeffrey Eisenach concluded in a pathbreaking analysis of the digital ecosystem (“Theories of Broadband Competition”), there may be market concentration in one (or more) layer(s) of the industry (broadly considered), yet prices are falling, access is expanding, products are proliferating, and innovation is as rapid as in any market we know.

The Real Deal on U.S. Broadband

Monday, June 11th, 2012

Is American broadband broken?

Tim Lee thinks so. Where he once leaned against intervention in the broadband marketplace, Lee says four things are leading him to rethink and tilt toward more government control.

First, Lee cites the “voluminous” 2009 Berkman Report. Which is surprising. The report published by Harvard’s Berkman Center may have been voluminous, but it lacked accuracy in its details and persuasiveness in its big-picture take-aways. Berkman used every trick in the book to claim “open access” regulation around the world boosted other nations’ broadband economies and lack of such regulation in the U.S. harmed ours. But the report’s data and methodology were so thoroughly discredited (especially in two detailed reports issued by economists Robert Crandall, Everett Ehrlich, and Jeff Eisenach and Robert Hahn) that the FCC, which commissioned the report, essentially abandoned it. Here was my summary of the economists’ critiques:

The [Berkman] report botched its chief statistical model in half a dozen ways. It used loads of questionable data. It didn’t account for the unique market structure of U.S. broadband. It reversed the arrow of time in its country case studies. It ignored the high-profile history of open access regulation in the U.S. It didn’t conduct the literature review the FCC asked for. It excommunicated Switzerland.

. . .

Berkman’s qualitative analysis was, if possible, just as misleading. It passed along faulty data on broadband speeds and prices. It asserted South Korea’s broadband boom was due to open access regulation, but in fact most of South Korea’s surge happened before it instituted any regulation. The study said Japanese broadband, likewise, is a winner because of regulation. But regulated DSL is declining fast even as facilities-based (unshared, proprietary) fiber-to-the-home is surging.

Berkman also enjoyed comparing broadband speeds of tiny European and Asian countries to the whole U.S. But if we examine individual American states — New York or Arizona, for example — we find many of them outrank most European nations and Europe as a whole. In fact, applying the same Speedtest.com data Berkman used, the U.S. as a whole outpaces Europe as a whole! Comparing small islands of excellence to much larger, more diverse populations or geographies is bound to skew your analysis.

The Berkman report twisted itself in pretzels trying to paint a miserable picture of the U.S. Internet economy and a glowing picture of heavy regulation in foreign nations. Berkman, however, ignored the prima facie evidence of a vibrant U.S. broadband marketplace, manifest in the boom in Web video, mobile devices, the App Economy, cloud computing, and on and on.

How could the bulk of the world’s best broadband apps, services, and sites be developed and achieve their highest successes in the U.S. if American broadband were so slow and thinly deployed? We came up with a metric that seemed to refute the notion that U.S. broadband was lagging, namely, how much network traffic Americans generate vis-à-vis the rest of the world. It turned out the U.S. generates more network traffic per capita and per Internet user than any nation but South Korea and generates about two-thirds more per-user traffic than the closest advanced economy of comparable size, Western Europe.

Berkman based its conclusions almost solely on (incorrect) measures of “broadband penetration” — the number of broadband subscriptions per capita — but that metric turned out to be a better indicator of household size than broadband health. Lee acknowledges the faulty analysis but still assumes “broadband penetration” is the sine qua non measure of Internet health. Maybe we’re not awful, as Berkman claimed, Lee seems to be saying, but even if we correct for their methodological mistakes, U.S. broadband penetration is still just OK. “That matters,” Lee writes,

because a key argument for America’s relatively hands-off approach to broadband regulation has been that giving incumbents free rein would give them incentive to invest more in their networks. The United States is practically the only country to pursue this policy, so if the incentive argument was right, its advocates should have been able to point to statistics showing we’re doing much better than the rest of the world. Instead, the argument has been over just how close to the middle of the pack we are.

No, I don’t agree that the argument has consisted of bickering over whether the U.S. is more or less mediocre. Not at all. I do agree that advocates of government regulation have had to adjust their argument – from “U.S. broadband is awful” to “U.S. broadband is mediocre.” Yet they still hang their hat on “broadband penetration” because most other evidence on the health of the U.S. digital economy is even less supportive of their case.

In each of the last seven years, U.S. broadband providers have invested between $60 and $70 billion in their networks. Overall, the U.S. leads the world in info-tech investment — totaling nearly $500 billion last year. The U.S. now boasts more than 80 million residential broadband links and 200+ million mobile broadband subscribers. U.S. mobile operators have deployed more 4G mobile network capacity than anyone, and Verizon just announced its FiOS fiber service will offer 300 megabit-per-second residential connections — perhaps the fastest large-scale deployment in the world.

Eisenach and Crandall followed up their critique of the Berkman study with a fresh March 2012 analysis of “open access” regulation around the world (this time with Allan Ingraham). They found:

  • “it is clear that copper loop unbundling did not accelerate the deployment or increase the penetration of first-generation broadband networks, and that it had a depressing effect on network investment”
  • “By contrast, it seems clear that platform competition was very important in promoting broadband deployment and uptake in the earlier era of DSL and cable modem competition.”
  • “to the extent new fiber networks are being deployed in Europe, they are largely being deployed by unregulated, non-ILEC carriers, not by the regulated incumbent telecom companies, and not by entrants that have relied on copper-loop unbundling.”

Lee doesn’t mention the incisive criticisms of the Berkman study, nor the voluminous literature, including this latest example, showing that open access policies are ineffective at best and more likely harmful.

In coming posts, I’ll address Lee’s three other worries.

— Bret Swanson

New iPad, Fellow Bandwidth Monsters Hungry for More Spectrum

Tuesday, March 13th, 2012

Last week Apple unveiled its third-generation iPad. Yesterday the company said the LTE versions of the device, which can connect via Verizon and AT&T mobile broadband networks, are sold out.

It took 15 years for laptops to reach 50 million units sold in a year. It took smartphones seven years. For tablets (not including Microsoft’s clunky attempt a decade ago), just two years. Mobile device volumes are astounding. In each of the last five years, global mobile phone sales topped a billion units. Last year smartphones outsold PCs for the first time – 488 million versus 432 million. This year well over 500 million smartphones and perhaps 100 million tablets could be sold.

Smartphones and tablets represent the first fundamentally new consumer computing platforms since the PC, which arrived in the late ’70s and early ’80s. Unlike mere mobile phones, they’ve got serious processing power inside. But their game-changing potency is really based on their capacity to communicate via the Internet. And this power is, of course, dependent on the cloud infrastructure and wireless networks.

But are wireless networks today prepared for this new surge of bandwidth-hungry mobile devices? Probably not. When we started to build 3G mobile networks in the middle of last decade, many thought it was a huge waste. Mobile phones were used for talking, and some texting. They had small low-res screens and were terrible at browsing the Web. What in the world would we do with all this new wireless capacity? Then the iPhone came, and, boom — in big cities we went from laughable overcapacity to severe shortage seemingly overnight. The iPhone’s brilliant screen, its real Web browsing experience, and the world of apps it helped us discover totally changed the game. Wi-Fi helped supply the burgeoning iPhone with bandwidth, and Wi-Fi will continue to grow and play an important role. Yet Credit Suisse, in a 2011 survey of the industry, found that mobile networks overall were running at 80% of capacity and that many network nodes were tapped out.

Today, we are still expanding 3G networks and launching 4G in most cities. Verizon says it offers 4G LTE in 196 cities, while AT&T says it offers 4G LTE in 28 markets (and combined with its HSPA+ networks offers 4G-like speeds to 200 million people in the U.S.). Lots of things affect how fast we can build new networks — from cell site permitting to the fact that these things are expensive ($20 billion worth of wireless infrastructure in the U.S. last year). But another limiting factor is spectrum availability.

Do we have enough radio waves to efficiently and cost-effectively serve these hundreds of millions of increasingly powerful mobile devices, which generate and consume increasingly rich content, with ever more stringent latency requirements, and which depend upon robust access to cloud storage and computing resources?

Capacity is a function of money, network nodes, technology, and radio waves. But spectrum is grossly misallocated. The U.S. government owns 61% of the best airwaves, while mobile broadband providers — where all the action is — own just 10%. Another portion is controlled by the old TV broadcasters, where much of this beachfront spectrum lies fallow or underused.

The key is allowing spectrum to flow to its most valuable uses. Last month Congress finally authorized the FCC to conduct incentive auctions to free up some unused and underused TV spectrum. Good news. But other recent developments discourage us from too much optimism on this front.

In December the FCC and Justice Department vetoed AT&T’s attempt to augment its spectrum and cell-site position via merger with T-Mobile. Now the FCC and DoJ are questioning Verizon’s announced purchase of Spectrum Co. — valuable but unused spectrum owned by a consortium of cable TV companies. The FCC has also threatened to tilt any spectrum auctions so that it decides who can bid, how much bidders can buy, and what buyers may or may not do with their spectrum — pretending Washington knows exactly how this fast-changing industry should be structured, thus reducing the value of spectrum, probably delaying the availability of new spectrum, and possibly slowing the sector’s pace of innovation.

It’s very difficult to see how it’s at all productive for the government to block companies who desperately need more spectrum from buying it from those who don’t want it, don’t need it, or can’t make good use of it. The big argument against AT&T and Verizon’s attempted spectrum purchases is “competition.” But T-Mobile wanted to sell to AT&T because it admitted it didn’t have the financial (or spectrum) wherewithal to build a super expensive 4G network. Apparently the same for the cable companies, who chose to sell to Verizon. Last week Dish Network took another step toward entering the 4G market with the FCC’s approval of spectrum transfers from two defunct companies, TerreStar and DBSD.

Some people say the proliferation of Wi-Fi or the increased use of new wireless technologies that economize on spectrum will make more spectrum availability unnecessary. I agree Wi-Fi is terrific and will keep growing and that software radios, cognitive radios, mesh networks and all the other great technologies that increase the flexibility and power of wireless will make big inroads. So fine, let’s stipulate that perhaps these very real complements will reduce the need for more spectrum at the margin. Then the joke is on the big companies that want to overpay for unnecessary spectrum. We still allow big, rich companies to make mistakes, right? Why, then, do proponents of these complementary technologies still oppose allowing spectrum to flow to its highest use?

Free spectrum auctions would allow lots of companies to access spectrum — upstarts, middle tier, and yes, the big boys, who desperately need more capacity to serve the new iPad.

— Bret Swanson

Is the FCC serious about more wireless spectrum? Apparently not.

Friday, January 13th, 2012

For the third year in a row, FCC chairman Julius Genachowski used his speech at the Consumer Electronics Show in Las Vegas to push for more wireless spectrum. He wants Congress to pass the incentive auction law that would unleash hundreds of megahertz of spectrum to new and higher uses. Most of Congress agrees: we need lots more wireless capacity and spectrum auctions are a good way to get there.

Genachowski, however, wants overarching control of the new spectrum and, by extension, the mobile broadband ecosystem. The FCC wants the authority to micromanage the newly available radio waves — who can buy them, how much they can buy, how they can use them, what content flows over them, what business models can be employed with them. But this is an arena that is growing wildly fast, where new technologies appear every day, and where experimentation is paramount to see which business models work. Auctions are supposed to be a way to get more spectrum into the marketplace, where lots of companies and entrepreneurs can find the best ways to use it to deliver new communications services. “Any restrictions” by Congress on the FCC “would be a real mistake,” said Genachowski. In other words, he doesn’t want Congress to restrict his ability to restrict the mobile business. It seems the liberty of regulators to act without restraint is a higher virtue than the liberty of private actors.

At the end of 2011, the FCC and Justice Department vetoed AT&T’s proposed merger with T-Mobile, a deal that would have immediately expanded 3G mobile capacity across the nation and accelerated AT&T’s next generation 4G rollout by several years. That deal was all about a more effective use of spectrum, more cell towers, more capacity to better serve insatiable smartphone- and tablet-equipped consumers. Now the FCC is holding hostage the spectrum auction bill with its my-way-or-the-highway approach. And one has to ask: Is the FCC really serious about spectrum, mobile capacity, and a healthy broadband Internet?

— Bret Swanson

Why is the FCC playing procedural games?

Wednesday, November 30th, 2011

America is in desperate need of economic growth. But as the U.S. economy limps along, with unemployment stuck at 9%, the Federal Communications Commission is playing procedural tiddlywinks with the nation’s largest infrastructure investor, in the sector of the economy that offers the most promise for innovation and 21st century jobs. In normal times, we might chalk this up to clever Beltway maneuvering. But do we really have the time or money to indulge bureaucratic gamesmanship?

On Thanksgiving Eve, the FCC surprised everyone. It hadn’t yet completed its investigation into the proposed AT&T-T-Mobile wireless merger, and the parties had not had a chance to discuss or rebut the agency’s initial findings. Yet the FCC preempted the normal process by announcing it would send the case to an administrative law judge — essentially a vote of no-confidence in the deal. I say “vote,” but the FCC commissioners hadn’t actually voted on the order.

FCC Chairman Julius Genachowski called AT&T CEO Randall Stephenson, who, on Thanksgiving Day, had to tell investors he was setting aside $4 billion in case Washington blocked the deal.

The deal is already being scrutinized by the Department of Justice, which sued to block the merger last summer. The fact that telecom mergers and acquisitions must negotiate two levels of federal scrutiny, at DoJ and FCC, is already an extra burden on the Internet industry. But when one agency on this dual-track games the system by trying to influence the other track — maybe because the FCC felt AT&T had a good chance of winning its antitrust case — the obstacles to promising economic activity multiply.

After the FCC’s surprise move, AT&T and T-Mobile withdrew their merger application at the FCC. No sense in preparing for an additional hearing before an administrative law judge when they are already deep in preparation for the antitrust trial early next year. Moreover, the terms of the merger agreement are likely to have changed after the companies (perhaps) negotiate conditions with the DoJ. They’d have to refile an updated application anyway. Not so fast, said the FCC. We’re not going to allow AT&T and T-Mobile to withdraw their application. Or if we do allow it, we will do so “with prejudice,” meaning the parties can’t refile a revised application at a later date. On Tuesday the FCC relented — the law is clear: an applicant has the right to withdraw an application without consent from the FCC. But the very fact the FCC initially sought to deny the withdrawal is itself highly unusual. Again, more procedural gamesmanship.

If that weren’t enough, the FCC then said it would release its “findings” in the case — another highly unusual (maybe unprecedented) action. The agency hadn’t completed its process, and there had been no vote on the matter. So the FCC instead released what it calls a “staff report” — a highly critical internal opinion that hadn’t been reviewed by the parties nor approved by the commissioners. We’re eager to analyze the substance of this “staff report,” but the fact the FCC felt the need to shove it out the door was itself remarkable.

It appears the FCC is twisting legal procedure any which way to fit its desired outcome, rather than letting the normal merger process play out. Indeed, “twisting legal procedure” may be too kind. It has now thrown law and procedure out the window and is in full public relations mode. These extralegal PR games tilt the playing field against the companies, against investment and innovation, and against the health of the U.S. economy.

— Bret Swanson

World Broadband Update

Tuesday, June 28th, 2011

The OECD published its annual Communications Outlook last week, and the 390 pages offer a wealth of information on all-things-Internet — fixed line, mobile, data traffic, price comparisons, etc. Among other remarkable findings, OECD notes that:

In 1960, only three countries — Canada, Sweden and the United States — had more than one phone for every four inhabitants. For most of what would become OECD countries a year later, the figure was less than 1 for every 10 inhabitants, and less than 1 in 100 in a couple of cases. At that time, the 84 million telephones in OECD countries represented 93% of the global total. Half a century later there are 1.7 billion telephones in OECD countries and a further 4.1 billion around the world. More than two in every three people on Earth now have a mobile phone.

Very useful stuff. But in recent times the report has also served as a chance for some to misrepresent the relative health of international broadband markets. The common refrain the past several years was that the U.S. had fallen way behind many European and Asian nations in broadband. The mantra that the U.S. is “15th in the world in broadband” — or 16th, 21st, 24th, take your pick — became a sort of common lament. Except it wasn’t true.

As we showed here, the second half of the two-thousand-aughts saw an American broadband boom. The Phoenix Center and others showed that the most cited stat in those previous OECD reports — broadband connections per 100 inhabitants — actually told you more about household size than broadband. And we developed metrics to better capture the overall health of a nation’s Internet market — IP traffic per Internet user and per capita.

Below you’ll see an update of the IP traffic per Internet user chart, built upon Cisco’s most recent (June 1, 2011) Visual Networking Index report. The numbers, as they did last year, show the U.S. leads every region of the world in the amount of IP traffic we generate and consume both in per user and per capita terms. Among nations, only South Korea tops the U.S., and only Canada matches the U.S.

Although Asia contains broadband stalwarts like Korea, Japan, and Singapore, it also has many laggards. If we compare the U.S. to the most uniformly advanced region, Western Europe, we find the U.S. generates 62% more traffic per user. (These figures are based on Cisco’s 2010 traffic estimates and the ITU’s 2010 Internet user numbers.)
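
For readers who want to replicate the arithmetic, the metric is simple division: a region’s total IP traffic divided by its Internet users (or its population). Here is a minimal sketch; the traffic, user, and population figures in it are placeholders for illustration only, not the actual Cisco VNI or ITU numbers.

```python
# A minimal sketch of the traffic-per-user / traffic-per-capita calculation.
# The figures below are placeholders, NOT the actual Cisco VNI traffic
# estimates or ITU user counts discussed above.

regions = {
    # region: (monthly IP traffic in petabytes, Internet users in millions, population in millions)
    "United States":  (7500.0, 240.0, 310.0),
    "Western Europe": (8000.0, 340.0, 400.0),
    "South Korea":    (1600.0,  40.0,  49.0),
}

for name, (traffic_pb, users_m, pop_m) in regions.items():
    per_user_gb = traffic_pb * 1_000_000 / (users_m * 1_000_000)   # 1 PB = 1,000,000 GB
    per_capita_gb = traffic_pb * 1_000_000 / (pop_m * 1_000_000)
    print(f"{name:15s} {per_user_gb:6.1f} GB/user/month   {per_capita_gb:6.1f} GB/person/month")
```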

As we noted last year, it’s not possible for the U.S. to both lead the world by a large margin in Internet usage and lag so far behind in broadband. We think these traffic per user and per capita figures show that our residential, mobile, and business broadband networks are among the world’s most advanced and ubiquitous.

Lots of other quantitative and qualitative evidence — from our smart-phone adoption rates to the breakthrough products and services of world-leading device (Apple), software (Google, Apple), and content companies (Netflix) — reaffirms the fairly obvious fact that the U.S. Internet ecosystem is in fact healthy, vibrant, and growing. Far from lagging, it leads the world in most of the important digital innovation indicators.

— Bret Swanson

One Step Forward, Two Steps Back

Monday, November 22nd, 2010

The FCC’s apparent about-face on Net Neutrality is really perplexing.

Over the past few weeks it looked like the Administration had acknowledged economic reality (and bipartisan Capitol Hill criticism) and turned its focus to investment and jobs. Outgoing NEC Director Larry Summers and Commerce Secretary Gary Locke announced a vast expansion of available wireless spectrum, and FCC chairman Julius Genachowski used his speech to the NARUC state regulators to encourage innovation and employment. Gone were mentions of the old priorities — intrusive new regulations such as Net Neutrality and Title II reclassification of modern broadband as an old telecom service. Finally, it appeared, an already healthy and vibrant Internet sector could stop worrying about these big new government impositions — and years of likely litigation — and get on with building the 21st century digital infrastructure.

But then came word at the end of last week that the FCC would indeed go ahead with its new Net Neutrality regs. Perhaps even issuing them on December 22, just as Congress and the nation take off for Christmas vacation [the FCC now says it will hold its meeting on December 15]. When even a rare economic sunbeam is quickly clouded by yet more heavy-handedness from Washington, is it any wonder unemployment remains so high and growth so low?

Any number of people sympathetic to the economy’s and the Administration’s plight are trying to help. Last week David Leonhardt of the New York Times pointed the way, at least in a broad strategic sense: “One Way to Trim the Deficit: Cultivate Growth.” Yes, economic growth! Remember that old concept? Economist and innovation expert Michael Mandel has suggested a new concept of “countercyclical regulatory policy.” The idea is to lighten regulatory burdens to boost growth in slow times and then, later, when the economy is moving full-steam ahead, apply more oversight to curb excesses. Right now, we should be lightening burdens, Mandel says, not imposing new ones:

it’s really a dumb move to monkey with the vibrant and growing communications sector when the rest of the economy is so weak. It’s as if you have two cars — one running, one in the repair shop — and you decide it’s a good time to rebuild the transmission of the car that actually works because you hear a few squeaks.

Apparently, FCC honchos met with interested parties this morning to discuss what comes next. Unfortunately, at a time when we need real growth, strong growth, exuberant growth! (as Mandel would say), the Administration appears to be saddling an economy-lifting reform (wireless spectrum expansion) with leaden regulation. What’s the point of new wireless spectrum if you massively devalue it with Net Neutrality, open access, and/or Title II?

One step forward, two steps back (ten steps back?) is not an exuberant growth and jobs strategy.

International Broadband Comparison, continued

Thursday, October 14th, 2010

New numbers from Cisco allow us to update our previous comparison of actual Internet usage around the world. We think this is a far more useful metric than the usual “broadband connections per 100 inhabitants” used by the OECD and others to compile the oft-cited world broadband rankings.

What the per capita metric really measures is household size. And because the U.S. has more people in each household than many other nations, we appear worse in those rankings. But as the Phoenix Center has noted, if each OECD nation reached 100% broadband nirvana — i.e., every household in every nation connected — the U.S. would actually fall from 15th to 20th. Residential connections per capita is thus not a very illuminating measure.
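
The arithmetic behind that point is worth spelling out: if broadband is bought by the household, the per-capita figure is capped at 100 divided by the average household size, so countries with smaller households rank higher even when every home is connected. A quick sketch, using rough illustrative household sizes rather than official statistics:

```python
# Why "subscriptions per 100 inhabitants" mostly tracks household size:
# if broadband is bought per household, the ceiling on the per-capita figure
# is 100 divided by average household size. The household sizes below are
# rough illustrative values, not official statistics.

def max_subscriptions_per_100(avg_household_size: float) -> float:
    """Subscriptions per 100 inhabitants if every household buys exactly one line."""
    return 100.0 / avg_household_size

for country, size in [("United States", 2.6), ("Sweden", 2.1), ("South Korea", 2.9)]:
    print(f"{country:14s} household size {size:.1f} -> ceiling of "
          f"{max_subscriptions_per_100(size):.0f} lines per 100 people")
```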

But look at the actual Internet traffic generated and consumed in the U.S.

The U.S. far outpaces every other region of the world. In the second chart, you can see that in fact only one nation, South Korea, generates significantly more Internet traffic per user than the U.S. This is no surprise. South Korea was the first nation to widely deploy fiber-to-the-x and was also the first to deploy 3G mobile, leading to not only robust infrastructure but also a vibrant Internet culture. The U.S. dwarfs most others.

If the U.S. were so far behind in broadband, we could not generate around twice as much network traffic per user as nations we are told far exceed our broadband capacity and connectivity. The U.S. has far to go in a never-ending buildout of its communications infrastructure. But we invest more than other nations, we’ve got better broadband infrastructure overall, and we use broadband more — and more effectively (see the Connectivity Scorecard and The Economist’s Digital Economy rankings) — than almost any other nation.

The conventional wisdom on this one is just plain wrong.

Chronically Critical Broadband Country Comparisons

Friday, March 26th, 2010

With the release of the FCC’s National Broadband Plan, we continue to hear all sorts of depressing stories about the sorry state of American broadband Internet access. But is it true?

International comparisons in such a fast-moving arena as tech and communications are tough. I don’t pretend it is easy to boil down a hugely complex topic to one right answer, but I did have some critical things to say about a major recent report that got way too many things wrong. A new article by that report’s author singled out France as especially advanced compared with the U.S. To cut through all the clutter of conflicting data and competing interpretations on broadband deployment, access, adoption, prices, and speeds, however, maybe a simple chart will help.

Here we compare network usage. Not advertised speeds, which are suspect. Not prices, which can be distorted by the use of purchasing power parity (PPP). Not “penetration,” which is largely a function of income, urbanization, and geography. No, just this: how much data traffic various regions actually create and consume.

If U.S. networks were so backward — too sparse, too slow, too expensive — would Americans be generating 65% more network traffic per capita than their Western European counterparts?

Berkman’s Broadband Bungle

Tuesday, December 22nd, 2009

Professors at a leading research unit put suspect data into a bad model, fail to include crucial variables, and even manufacture the most central variable to deliver the hoped-for outcome.

Climate-gate? No, call it Berkman’s broadband bungle.

In October, Harvard’s Berkman Center for the Internet and Society delivered a report, commissioned by the Federal Communications Commission, comparing international broadband markets and policies. The report was to be a central component of the Administration’s new national broadband Internet policy, arriving in February 2010.

Just one problem. Actually many problems. The report botched its chief statistical model in half a dozen ways. It used loads of questionable data. It didn’t account for the unique market structure of U.S. broadband. It reversed the arrow of time in its country case studies. It ignored the high-profile history of open access regulation in the U.S. It didn’t conduct the literature review the FCC asked for. It excommunicated Switzerland . . . .

See my critique of this big report on international broadband at RealClearMarkets.

What price, broadband?

Thursday, September 3rd, 2009

See this new paper from economists Rob Shapiro and Kevin Hassett showing how artificial limits on varied pricing of broadband could significantly delay broadband adoption.

To the extent that lower-income and middle-income consumers are required to pay a greater share of network costs, we should expect a substantial delay in achieving universal broadband access. Our simulations suggest that spreading the costs equally among all consumers — the minority who use large amounts of bandwidth and the majority who use very little — will significantly slow the rate of adoption at the lower end of the income scale and extend the life of the digital divide.

If costs are shifted more heavily to those who use the most bandwidth and, therefore, are most responsible for driving up the cost of expanding network capabilities, the digital divergence among the races and among income groups can be eliminated much sooner.
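
A toy calculation, with entirely hypothetical numbers, illustrates the mechanism Shapiro and Hassett describe: recovering a fixed network cost through a uniform flat price makes light users carry part of the heavy users’ load, while usage-based pricing lowers the entry price for light (often lower-income) users.

```python
# A toy model of the cost-allocation argument (all numbers are hypothetical).
# A network must recover a fixed monthly cost. Flat pricing splits it evenly
# across subscribers; usage-based pricing splits it in proportion to each
# group's share of total traffic.

network_cost = 1000.0  # monthly cost to recover, in arbitrary units

# (label, number of subscribers, GB consumed per subscriber per month)
groups = [("light users", 80, 10.0), ("heavy users", 20, 200.0)]

total_subs = sum(n for _, n, _ in groups)
total_gb = sum(n * gb for _, n, gb in groups)

for label, n, gb in groups:
    flat = network_cost / total_subs                 # same price for everyone
    usage_based = network_cost * (gb / total_gb)     # price proportional to usage
    print(f"{label:11s}  flat: {flat:6.2f}/month   usage-based: {usage_based:6.2f}/month")
```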

Broadband benefit = $32 billion

Tuesday, July 14th, 2009

We recently estimated the dramatic gains in “consumer bandwidth” — our ability to communicate and take advantage of the Internet. So we note this new study from the Internet Innovation Alliance, written by economists Mark Dutz, Jonathan Orszag, and Robert Willig, that estimates a consumer surplus from U.S. residential broadband Internet access of $32 billion. “Consumer surplus” is the net benefit consumers enjoy, basically the additional value they receive from a product compared to what they pay.

Bandwidth caps: One hundred and one distractions

Thursday, April 30th, 2009

When Cablevision of New York announced this week it would begin offering broadband Internet service of 101 megabits per second for $99 per month, lots of people took notice. Which was the point.

Maybe the 101-megabit product is a good experiment. Maybe it will be successful. Maybe not. One hundred megabits per second is a lot, given today’s applications (and especially given cable’s broadcast tree-and-branch shared network topology). A hundred megabits, for example, could accommodate more than five full-rate broadcast HD TV channels (MPEG-2 at roughly 19 Mbps each), or 10+ more heavily compressed HD streams. It’s difficult to imagine too many households finding a way today to consume that much bandwidth. Tomorrow is another question. The bottom line is that in addition to making a statement, Cablevision is probably mostly targeting the small business market with this product.
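
The back-of-the-envelope math behind those stream counts, assuming commonly cited HD bitrates rather than anything Cablevision has published, looks roughly like this:

```python
# Back-of-the-envelope check on the stream counts above. The per-stream
# bitrates are rough, commonly cited figures, not Cablevision's numbers.

link_mbps = 101

streams = {
    "broadcast-quality HD (MPEG-2, ~19 Mbps)": 19.0,
    "more heavily compressed HD (H.264, ~8 Mbps)": 8.0,
}

for label, mbps in streams.items():
    print(f"{label}: {int(link_mbps // mbps)} simultaneous streams")
```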

Far more perplexing than Cablevision’s strategy, however, was the reaction from groups like the reflexively critical Free Press:

We are encouraged by Cablevision’s plan to set a new high-speed bar of service for the cable industry. . . . this is a long overdue step in the right direction.

Free Press usually blasts any decision whatever by any network or media company. But by praising the 101-megabit experiment, Free Press is acknowledging the perfect legitimacy of charging variable prices for variable products. Pay more, get more. Pay less, and get, more affordably, the type of service that will meet your needs the vast majority of the time.

Bandwidth and QoS: Much ado about something

Friday, April 24th, 2009

The supposed top finding of a new report commissioned by the British telecom regulator Ofcom is that we won’t need any QoS (quality of service) or traffic management to accommodate next generation video services, which are driving Internet traffic at consistently high annual growth rates of between 50% and 60%. TelecomTV One headlined, “Much ado about nothing: Internet CAN take video strain says UK study.” 

But the content of the Analysys Mason (AM) study, entitled “Delivering High Quality Video Services Online,” does not support either (1) the media headline — “Much ado about nothing,” which implies next generation services and brisk traffic growth don’t require much in the way of new technology or new investment to accommodate them — or (2) its own “finding” that QoS and traffic management aren’t needed to deliver next-generation content and services.

For example, AM acknowledges in one of its five key findings in the Executive Summary: 

innovative business models might be limited by regulation: if the ability to develop and deploy novel approaches was limited by new regulation, this might limit the potential for growth in online video services.

In fact, the very first key finding says:

A delay in the migration to 21CN-based bitstream products may have a negative impact on service providers that use current bitstream products, as growth in consumption of video services could be held back due to the prohibitive costs of backhaul capacity to support them on the legacy core network. We believe that the timely migration to 21CN will be important in enabling significant take-up of online video services at prices that are reasonable for consumers.

So very large investments in new technologies and platforms are needed, and new regulations that discourage this investment could delay crucial innovations on the edge. Sounds like much ado about something, something very big.

Apples and Oranges

Friday, April 10th, 2009

Saul Hansell has done some good analysis of the broadband market (as I noted here), and I’m generally a big fan of the NYT’s Bits blog. But this item mixes cable TV apples with switched Internet oranges. And beyond that just misses the whole concept of products and prices.

Questioning whether Time Warner will be successful in its attempt to cap bandwidth usage on its broadband cable modem service — effectively raising the bandwidth pricing issue — Hansell writes:

I tried to explore the marginal costs with [Time Warner's] Mr. Hobbs. When someone decides to spend a day doing nothing but downloading every Jerry Lewis movie from BitTorrent, Time Warner doesn’t have to write a bigger check to anyone. Rather, as best as I can figure it, the costs are all about building the network equipment and buying long-haul bandwidth for peak capacity.

If that is true, the question of what is “fair” is somewhat more abstract than just saying someone who uses more should pay more. After all, people who watch more hours of cable television don’t pay more than those who don’t.

It’s also true that a restaurant patron who finishes his meal doesn’t pay more than someone who leaves half of the same menu item on his plate. If he orders two bowls of soup, he gets more soup. He can’t order one bowl of soup and demand each of his five dining partners also be served for free. Pricing decisions depend upon the product and the granularity that is being offered.

Broadband bridges to where?

Wednesday, April 1st, 2009

See my new commentary on the new $7.2 billion broadband program in the Federal stimulus bill. I conclude that if we’re going to spend taxpayer money at all, we should take advantage of local knowledge:

Many states have already pinpointed the areas most in need of broadband infrastructure. Local companies and entrepreneurs are likely to best know where broadband needs to be deployed – and to aggressively deliver it with the most appropriate, cost-effective technology that meets the needs of the particular market. Using the states as smart conduits is also likely to get the money to network builders more quickly.

And that

After falling seriously behind foreign nations in broadband and in our favored measure of “bandwidth-per-capita” in the early 2000s, the U.S. got its act together and is now on the right path. In the last decade, total U.S. broadband lines grew from around 5 million to over 120 million, while residential broadband grew from under 5 million to 75 million. By far the most important broadband policy point is not to discourage or distort the annual $60+ billion that private companies already invest.

Rare reason in the broadband debate

Thursday, March 12th, 2009

Calm and reasoned discussion in debates over broadband and Internet policy are rare. But Saul Hansell, in a series of posts at the NYTimes Bits blog, does an admirable job surveying international broadband comparisons. Here are parts I and II, with part III on the way. [Update: Here's part III. And here's a good previous post on "broadband stimulus."]

So far Hansell has asked two basic questions: Why is theirs faster? And why is theirs cheaper? “Theirs” being non-American broadband.

His answers: “Their” broadband is not too much faster than American broadband, at least not anymore. And their broadband is cheaper for a complicated set of reasons, but mostly because of government price controls that could hurt future investment and innovation in those nations that practice them.

Ask America. We already tried it. But more on that later.

Hansell makes several nuanced points: (1) broadband speeds depend heavily on population density. The performance and cost of communications technologies are distance-sensitive. It’s much cheaper to deliver fast speeds in Asia’s big cities and Europe’s crowded plains than across America’s expanse. (2) Hansell also points to studies showing some speed-inflation in Europe and Asia. In other words, advertised speeds are often overstated. But most importantly, (3) Hansell echoes my basic point over the last couple years:

. . . Internet speeds in the United States are getting faster. Verizon is wiring half its territory with its FiOS service, which strings fiber optic cable to people’s homes. FiOS now offers 50 Mbps service and has the capacity to offer much faster speeds. As of the end of 2008, 4.1 million homes in the United States had fiber service, which puts the United States right behind Japan, which has brought fiber directly to 8.2 million homes, according to the Fiber to the Home Council. Much of what is called fiber broadband in Korea, Sweden and until recently Japan, only brings the fiber to the basement of apartment buildings or street-corner switch boxes.

AT&T is building out that sort of network for its U-Verse service, running fiber to small switching stations in neighborhoods, so that it can offer much faster DSL with data speeds of up to 25 Mbps and Internet video as well. And cable systems, which cover more than 90 percent of the country, are starting to deploy the next generation of Internet technology called Docsis 3.0. It can offer speeds of 50 Mbps. . . .
