Category Archives: Net Neutrality

“Net Neutrality and Antitrust,” a House committee hearing

On Wednesday, a House Judiciary subcommittee heard testimony on the potential for existing general-purpose antitrust, competition, and consumer protection laws to police the Internet. Until the Federal Communications Commission (FCC) issued its 2015 Title II Order, the Federal Trade Commission (FTC) oversaw these functions. The 2015 rule upended decades’ worth of successful policy, but now that the new FCC is likely to return the Internet to its original status as a Title I information service, Title II advocates are warning that general purpose law and the FTC are not equipped to deal with the Internet. They’re also hoping that individual states will enter the Internet regulation game. I think they are wrong on both counts.

In fact, it’s more important than ever that we govern the sprawling Internet with general purpose laws and economic principles, not the outdated, narrow, vertical silos of a 1934 monopoly telephone law. And certainly not a patchwork of conflicting state laws. The Internet is not just “modernized telephone wires.” It is a broad and deep ecosystem of communications and computing infrastructure; vast, nested layers of software, applications, and content; and increasingly varied services connecting increasingly diverse end-points and industries. General purpose rules are far better suited to this environment than the 80-year-old law written to govern one network, built and operated by one company, to deliver one service.

Over the previous two decades of successful operation under Title I, telecom, cable, and mobile firms in the U.S. invested a total of $1.5 trillion in wired and wireless broadband networks. But over the last two years, since the Title II Order, the rate of investment has slowed. In 2014, the year before the Title II Order, U.S. broadband investment was $78.4 billion, but in 2016 that number had dropped by around 3%, to $76 billion. In the past, annual broadband investment had only dropped during recessions.
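A quick back-of-envelope check of the investment decline, using the dollar totals cited above:

```python
# Annual U.S. broadband investment totals cited in the text, in billions.
invest_2014 = 78.4  # the year before the Title II Order
invest_2016 = 76.0  # two years after

# Percentage change between the two years.
pct_change = (invest_2016 - invest_2014) / invest_2014 * 100
print(f"{pct_change:.1f}%")  # -3.1%, i.e. a drop of around 3%
```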

This is a concern because massive new investments are needed to fuel the next waves of Internet innovation. If we want to quickly and fully deploy new 5G wireless networks over the coming 15 years, for example, we need to extend fiber optic networks deeper into neighborhoods and more broadly across the nation in order to connect millions of new “small cells” that will not only deliver ever more video to our smartphones but also enable autonomous vehicles and the Internet of Things. It’s a project that may cost $250-300 billion, but it would happen far more slowly under Title II, and many marginal investments in marginal geographies might never happen at all.

At the hearing, FTC Commissioner Terrell McSweeny defended the 2015 Title II order, which poached many oversight functions from her own agency. Her reasoning was odd, however. She said that we needed to radically change policy in order to preserve the healthy results of previously successful policy. She said the Internet’s success depended on its openness, and we could sustain that openness only by applying the old telephone regulations, for the first time, to the Internet.

This gets things backwards. In our system, we usually intervene in markets and industries only if we can demonstrate serious and widespread market failures and only if we think a policy can deliver clear improvements that outweigh its possible downsides. In other words, the burden is on the government to prove harm and that it can better manage an industry. The demonstrable success of the Internet made this a tough task for the FCC. In the end, the FCC didn’t perform a market analysis, didn’t show market failures or consumer harm, didn’t show market power, and didn’t perform a cost-benefit analysis of its aggressive new policy. It simply asserted that it knew better how to manage the technology and business of the Internet, compared to engineers and entrepreneurs who had already created one of history’s biggest economic and technical successes.

Commissioner McSweeny also disavowed what had been one of the FTC’s most important new-economy functions and one in which it had developed a good bit of expertise – digital privacy. Under the Title II Order, the FCC snatched from the FTC the power to regulate Internet service providers (ISPs) on matters of digital privacy. Now that the FCC looks to be returning that power to the FTC, however, some states are attempting to regulate Internet privacy themselves. This summer, for example, California legislators tried to impose the Title II Order’s privacy rules on ISPs. Although that bill didn’t pass, you can bet California and other states will be back.

It’s important, therefore, that the FCC reaffirm longstanding U.S. policy – that the Internet is the ultimate form of interstate commerce. Here’s the way we put it in a recent post:

The internet blew apart the old ways of doing things. Internet access and applications are inherently nonlocal services. In this sense, the “cloud” analogy is useful. Telephones used to be registered to a physical street address. Today’s mobile devices go everywhere. Data, services, and apps are hosted in the cloud at multiple locations and serve end users who could be anywhere — likewise for peer-to-peer applications, which connect individual users who are mobile. Along most parameters, it makes no sense to govern the internet locally. Can you imagine 50 different laws governing digital privacy or net neutrality? It would be confusing at best, but more likely debilitating.

The Democratic FCC Chairman Bill Kennard weighed in on this matter in the late 1990s. He was in the middle of the original debate over broadband and argued firmly that high-speed cable modems were subject to a national policy of “unregulation” and should not be swept into the morass of legacy regulation.

In a 1999 speech, he admonished those who would seek to regulate broadband at the local or state level:

“Unfortunately, a number of local franchising authorities have decided not to follow this de-regulatory, pro-competitive approach. Instead, they have begun imposing their own local open access provisions. As I’ve said before, it is in the national interest that we have a national broadband policy. The FCC has the authority to set one, and we have. We have taken a de-regulatory approach, an approach that will let this nascent industry flourish. Disturbed by the effect that the actions of local franchising authorities could have on this policy and on the deployment of broadband, I have asked our general counsel to prepare a brief to be filed in the pending Ninth Circuit case so we can explain to the court why it’s important that we have a national policy.”

In the coming months, the FCC will likely reclassify the internet as a Title I information service. In addition to freeing broadband and mobile from the regulatory straitjacket of the 2015 Title II Order, this will also return oversight responsibility for digital privacy to the Federal Trade Commission, its natural home. The FTC has spent the last decade developing rules governing this important and growing arena and has enforced those rules to protect consumers. States’ efforts to impose their own layer of possibly contradictory rules would only confuse consumers and discourage upstart innovators.

As the internet becomes an ever more important component of all that we do, as its complexity spreads, and as it touches more parts of the economy, this principle will only become more important. Yes, there will be legitimate debates over just where to draw the boundaries. As the internet seeps further into every economic and social act, this does not mean that states will lose all power to govern. But to the extent that Congress, the FCC, and the FTC have the authority to protect the free flow of internet activity against state-based obstacles and fragmentation, they should do so. In its coming order, the FCC should reaffirm the interstate nature of these services.

A return to the Internet’s original status as a Title I information service, protected from state-based fragmentation, merely extends and strengthens the foundation upon which the U.S. invented and built the modern information economy.

Permission Slips for Internet Innovation


See my commentary in The Wall Street Journal this weekend — “Permission Slips for Internet Innovation.”



From the Hollywood archives: Sony questions FCC Internet regulation

Although lots of firms sat out the public debate over Net Neutrality, we’ve learned that many of them strenuously opposed the FCC’s new Internet regulations behind the scenes. The latest example is Sony, which, according to this Daily Caller story, warned that Title II Internet regulation “might put up roadblocks on how we distribute content.” Plumbing internal emails made public by the notorious Sony hack, the Daily Caller found a number of private complaints about the FCC. Sony Pictures Entertainment chief technology officer Spencer Stephens, for example, was adamant:

“The Internet has drawn investment precisely because it isn’t a utility,” Stephens wrote. “My expectation is that prioritized services will mean investment in infrastructure which would expand the size of the pipe.”

Responding to Netflix’s assertions that interconnection disagreements compelled the FCC to enact sweeping regulation, Stephens wrote that “their claims that they have been held to ransom are, IMHO, complete BS.”

More here.

Netflix, Mozilla, Google recant on Net Neutrality

Dept. of You Can’t Make This Stuff Up:

Three of the driving forces behind the 10-year effort to regulate the Internet — Netflix, Mozilla, and Google — have, in the last few days and in their own ways, all recanted their zealous support of Net Neutrality. It may have been helpful to have this information . . . before last week, when the FCC plunged the entire Internet industry into a years-long legal war.

First, on Monday, Netflix announced it had entered into a “sponsored data” deal with an Australian ISP, which violates the principles of “strong Net Neutrality,” Netflix’s preferred and especially robust flavor of regulation.

Then on Wednesday, Netflix CFO David Wells, speaking at an investor conference, said:

“Were we pleased it pushed to Title II? Probably not. We were hoping there might be a non-regulated solution.”

At this week’s huge Mobile World Congress in Barcelona, meanwhile, my AEI colleague Jeff Eisenach reported via Twitter that a Mozilla executive had backtracked:

Mozilla’s Dixon-Thayer is latest #netneutrality advocate to backpedal – “we don’t necessarily favor regulation” #repealtitleII #MWC15MP
3/4/15, 10:44 AM

Add these to the revelations about Google’s newfound reticence. Several weeks ago, in The Wall Street Journal‘s blockbuster exposé, we found out that Google Chairman Eric Schmidt called the White House to protest President Obama’s surprise endorsement of Title II regulation of the Internet. Then, just days before the February 26 vote at the FCC, Google urgently pleaded that the Commission remove the bizarre new regulatory provision known as broadband subscriber access service (BSAS), which would have created out of thin air a hitherto unknown “service” between websites and ISP consumers — in order to regulate that previously nonexistent service. (Ironic, yes, that this BSAS provision was dreamt up by . . . Mozilla.) Google was successful, just 48 hours before the vote, in excising this menacing regulation of a phantom service. But Google and the others are waking up to the fact that Title II and broad Section 706 authority might contain more than a few nasty surprises.

Fred Campbell examined Netflix’s statements over the last year and concluded: “Netflix bluffed. And everybody lost.”

And Yet . . .

The bottom line of these infuriating reversals may actually be a positive for the Internet. These epiphanies — “Holy bit, we just gave the FCC the power to do what!?!” — may wake serious people from the superficial slumber of substance-free advocacy. The epiphanies may give new life to efforts in Congress to find a legislative compromise that would prohibit clear bad behavior (blocking, throttling, etc.) but which would also circumscribe the FCC’s regulatory ambitions and thus allow the Internet to continue on its mostly free and unregulated — and hugely successful — path.

The last refuge of Internet regulators: the theory of the “terminating access monopoly”

Net neutrality activists have deployed a long series of rationales in their quest for government control of the Internet. As each rationale is found wanting, they simply move onto the next, more exotic theory. The debate has gone on so long that they’ve even begun recycling through old theories that were discredited long ago.

In the beginning, the activists argued that there should be no pay for performance anywhere on the Net. We pointed out the most obvious example of a harmful consequence of their proposal: their rules, as originally written, would have banned widely used content delivery networks (CDNs), which speed delivery of packets (for a price).

Then they argued that without strong government rules, broadband service providers would block innovation at the “edges” of the network. But for the last decade, under minimal regulation, we’ve enjoyed an explosion of new technologies, products, and services from content and app firms like YouTube, Facebook, Netflix, Amazon, Twitter, WhatsApp, Etsy, Snapchat, Pinterest, Twitch, and a thousand others. Many of these firms have built businesses worth billions of dollars.

They said we needed new rules because the light-touch regulatory environment had left broadband in the U.S. lagging its international rivals, whose farsighted industrial policies had catapulted them far ahead of America. Oops. Turns out, the U.S. leads the world in broadband. (See my colleague Richard Bennett’s detailed report on global broadband and my own.)

Then they argued that, regardless of how well the U.S. is doing, do you really trust a monopoly to serve consumer needs? We need to stop the broadband monopolist — the cable company. Turns out most Americans have several choices in broadband providers, and the list of options is growing — see FiOS, U-verse, Google Fiber, satellite, broadband wireless from multiple carriers, etc. No, broadband service is not like peanut butter. Because of the massive investments required to build networks, there will never be many dozens of wires running to each home. But neither is broadband a monopoly.

Artificially narrowing the market is the first refuge of nearly all bureaucrats concerned with competition. It’s an easy way to conjure a monopoly in almost any circumstance. My favorite example was the Federal Trade Commission’s initial opposition in 2003 to the merger of Häagen-Dazs (Nestlé) and Godiva (Dreyer’s). The government argued it would harmfully reduce competition in the market for “super premium ice cream.” The relevant market, in the agency’s telling, wasn’t food, or desserts, or sweets, or ice cream, or even premium ice cream, but super premium ice cream.


The U.S. Leads the World in Broadband

See our Wall Street Journal op-ed from December 8, which summarizes our new research on Internet traffic and argues for a continued policy of regulatory humility for the digital economy.

Continue reading here. Or read the text below the fold . . . 


How can U.S. broadband lag if it generates 2-3 times the traffic of other nations?

Is the U.S. broadband market healthy or not? This question is central to the efforts to change the way we regulate the Internet. In a short new paper from the American Enterprise Institute, we look at a simple way to gauge whether the U.S. has in fact fallen behind other nations in coverage, speed, and price . . . and whether consumers enjoy access to content. Here’s a summary:

  • Internet traffic volume is an important indicator of broadband health, as it encapsulates and distills the most important broadband factors, such as access, coverage, speed, price, and content availability.
  • US Internet traffic — a measure of the nation’s “digital output” — is two to three times higher than that of most advanced nations, and the United States generates more Internet traffic per capita and per Internet user than any major nation except South Korea.
  • The US model of broadband investment and innovation — which operates in an environment that is largely free from government interference — has been a dramatic success.
  • Overturning this successful policy by imposing heavy regulation on the Internet puts one of America’s most vital industries at risk.

Interconnection: Arguing for Inefficiency

Last week Level 3 posted some new data from interconnection points with three large broadband service providers. The first column of the chart, with data from last spring, shows lots of congestion between Level 3 and the three BSPs. You might recall the battles of last winter and early spring when Netflix streaming slowed down and it accused Comcast and other BSPs of purposely “throttling” its video traffic. (We wrote about the incident here, here, here, and here.)

The second column of the Level 3 chart, with data from September, shows that traffic with two of the three BSPs is much less congested today. Level 3 says, reasonably, the cause for the change is Netflix’s on-net transit (or paid peering) agreements with Comcast and (presumably) Verizon, in which Netflix and the broadband firms established direct connections with one another. As Level 3 writes, “You might say that it’s good news overall.” And it is: these on-net transit agreements, which have been around for at least 15 years, and which are used by Google, Amazon, Microsoft, all the content delivery networks (CDNs), and many others, make the Net work better and more efficiently, cutting costs for content providers and delivering better, faster, more robust services to consumers.

But Level 3 says despite this apparent improvement, the data really shows the broadband providers demanding “tolls” and that this is bad for the Internet overall. It thinks Netflix and the broadband providers should be forced to employ an indirect A→B→C architecture even when a direct A→C architecture is more efficient.

The Level 3 charts make another probably unintended point. Recall that Netflix, starting around two years ago, began building its own CDN called OpenConnect. Its intention was always to connect directly to the broadband providers (A→C) and to bypass Level 3 and other backbone providers (B). This is exactly what happened. Netflix connected to Comcast, Verizon, and others (although for a small fee, rather than for free, as it had hoped). And it looks like the broadband providers were smart not to build out massive new interconnection capacity with Level 3 to satisfy a peering agreement that was out of balance, and which, as soon as Netflix left, regained balance. It would have been a huge waste (what they used to call stranded investment).
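The efficiency argument can be put in toy-model form. The sketch below is illustrative only; the per-hop cost is a made-up parameter, not a real transit price, and the network names are stand-ins for the A, B, and C of the architectures discussed above.

```python
# Toy comparison of the two interconnection architectures: indirect transit
# through a backbone (A -> B -> C) versus direct paid peering (A -> C).
def delivery_cost(path, per_hop_cost=1.0):
    """Delivery cost grows with the number of inter-network hops traversed."""
    return per_hop_cost * (len(path) - 1)

indirect = ["ContentProvider", "Backbone", "BSP"]  # A -> B -> C transit
direct = ["ContentProvider", "BSP"]                # A -> C paid peering

print(delivery_cost(indirect))  # 2.0
print(delivery_cost(direct))    # 1.0: the direct route traverses fewer hops
```

On this admittedly simplified accounting, forcing traffic back through the indirect route doubles the hops without adding value for consumers, which is the stranded-investment point made above.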

Twitch Proves the Net Is Working

Below find our Reply Comments in the Federal Communications Commission’s Open Internet proceeding:

September 15, 2014

Twitch Proves the Net Is Working

On August 25, 2014, Amazon announced its acquisition of Twitch for around $1 billion. Twitch is a young but very large website that streams video games and the gamers who play them. The rise of Twitch demonstrates the Net is working and, we believe, also deals a severe blow to a central theory of the Order and NPRM.

The NPRM repeats the theory of the 2010 Open Internet Order that “providers of broadband Internet access service had multiple incentives to limit Internet openness.” The theory advances a concern that small start-up content providers might be discouraged or blocked from opportunities to grow. Neither the Order nor the current NPRM considers or even acknowledges evidence or arguments to the contrary — that broadband service providers (BSPs) may have substantial incentives to promote Internet openness. Nevertheless, the Commission now helpfully seeks comment “to update the record to reflect marketplace, technical, and other changes since the 2010 Open Internet Order was adopted that may have either exacerbated or mitigated broadband providers’ incentives and ability to limit Internet openness. We seek general comment on the Commission’s approach to analyzing broadband providers’ incentives and ability to engage in practices that would limit the open Internet.”

The continued growth of the Internet, and the general health of the U.S. Web, content, app, device, and Internet services markets — all occurring in the absence of Net Neutrality regulation — more than mitigate the Commission’s theory of BSP incentives. While there is scant evidence for the theory of bad BSP behavior, there is abundant evidence that openness generally benefits all players throughout the Internet value chain. The Commission cannot ignore this evidence.

The rise of Twitch is a perfect example. In three short years, Twitch went from brand new start-up to the fourth largest single source of traffic on the Internet. Google had previously signed a term sheet with Twitch, but so great was the momentum of this young, tiny company, that it could command a more attractive deal from Amazon. At the time of its acquisition by Amazon, Twitch said it had 55 million unique monthly viewers (consumers) and more than one million broadcasters (producers), generating 15 billion minutes of content viewed a month. According to measurements by the network scientist and Deepfield CEO Craig Labovitz, only Netflix, Google’s YouTube, and Apple’s iTunes generate more traffic.


A Decade Later, Net Neutrality Goes To Court

Today the U.S. Court of Appeals for the D.C. Circuit hears Verizon’s challenge to the Federal Communications Commission’s “Open Internet Order” — better known as “net neutrality.”

Hard to believe, but we’ve been arguing over net neutrality for a decade. I just pulled up some testimony George Gilder and I prepared for a Senate Commerce Committee hearing in April 2004. In it, we asserted that a newish “horizontal layers” regulatory proposal, then circulating among comm-law policy wonks, would become the next big tech policy battlefield. Horizontal layers became net neutrality, the Bush FCC adopted the non-binding Four Principles of an open Internet in 2005, the Obama FCC pushed through actual regulations in 2010, and now today’s court challenge, which argues that the FCC has no authority to regulate the Internet and that, in fact, Congress told the FCC not to regulate the Internet.

Over the years we’ve followed the debate, and often weighed in. Here’s a sampling of our articles, reports, reply comments, and even some doggerel:

— Bret Swanson

Net ‘Neutrality’ or Net Dynamism? Easy Choice.

Consumers beware. A big content company wants to help pay for the sports you love to watch.

ESPN is reportedly talking with one or more mobile service providers about a new arrangement in which the sports giant might agree to pay the mobile providers so that its content doesn’t count against a subscriber’s data cap. People like watching sports on their mobile devices, but web video consumes lots of data and is especially tough on bandwidth-constrained mobile networks. The mobile providers and ESPN have noticed usage slowing as consumers approach their data subscription ceilings, after which they are commonly charged overage fees. ESPN doesn’t like this. It wants people to watch as much as possible. This is how it sells advertising. ESPN wants to help people watch more by, in effect, boosting the amount of data a user may consume — at no cost to the user.

As good a deal as this may be for consumers (and the companies involved), the potential arrangement offends some people’s very particular notion of “network neutrality.” They often have trouble defining what they mean by net neutrality, but they know rule breakers when they see them. Sure enough, long time net neutrality advocate Public Knowledge noted, “This is what a network neutrality violation looks like.”

The basic notion is that all bits on communications networks should be treated the same. No prioritization, no discrimination, and no partnerships between content companies and conduit companies. Over the last decade, however, as we debated net neutrality in great depth and breadth, we would point out that such a notional rule would likely result in many perverse consequences. For example, we noted that, had net neutrality existed at the time, the outlawing of pay-for-prioritization would have banned the rise of content delivery networks (CDNs), which have fundamentally improved the user experience for viewing online content. When challenged in this way, the net neutrality proponents would often reply, Well, we didn’t mean that. Of course that should be allowed. We also would point out that yesterday’s and today’s networks discriminate among bits in all sorts of ways, and that we would continue doing so in the future. Their arguments often deteriorated into a general view that Bad things should be banned. Good things should be allowed. And who do you think would be the arbiter of good and evil? You guessed it.

So what is the argument in the case of ESPN? The idea that ESPN would pay to exempt its bits from data caps apparently offends the abstract all-bits-equal notion. But why is this bad in concrete terms? No one is talking about blocking content. In fact, by paying for a portion of consumers’ data consumption, such an arrangement can boost consumption and consumer choice. Far from blocking content, consumers will enjoy more content. Now I can consume my 2 gigabytes of data plus all the ESPN streaming I want. That’s additive. And if I don’t watch ESPN, then I’m no worse off. But if the mobile company were banned from such an arrangement, it may be forced to raise prices for everyone. Now, because ESPN content is popular and bandwidth-hungry, I, especially as an ESPN non-watcher, am worse off.
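The “additive” point is easy to see in a sketch of how sponsored-data accounting might work. The usage numbers below are invented for illustration; the mechanics are simply that sponsored traffic doesn’t count against the subscriber’s cap.

```python
# Hypothetical monthly usage, in gigabytes. Under a sponsored-data deal,
# only non-sponsored traffic counts against the subscriber's cap.
cap_gb = 2.0
usage_gb = {"espn": 3.5, "web": 0.8, "other_video": 1.0}
sponsored = {"espn"}  # the content company pays for this traffic

metered = sum(gb for app, gb in usage_gb.items() if app not in sponsored)
total = sum(usage_gb.values())

print(round(total, 1))    # 5.3 GB actually consumed
print(round(metered, 1))  # 1.8 GB counted against the 2 GB cap
print(metered <= cap_gb)  # True: the subscriber stays under the cap
```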

So the critics’ real worry is, I suppose, that ESPN, by virtue of its size, could gain an advantage on some other sports content provider who chose not to offer a similar uncapped service. But this is NOT what government policy should be — the micromanagement of prices, products, the structure of markets, and relationships among competitive and cooperative firms. This is what we warned would happen. This is what we said net neutrality was really all about — protecting some firms and punishing others. Where is the consumer in this equation?

These practical and utilitarian arguments about technology and economics are important. Yet they ignore perhaps the biggest point of all: the FCC has no authority to regulate the Internet. The Internet is perhaps the greatest free-flowing, fast-growing, dynamic engine of cultural and economic value we’ve known. The Internet’s great virtue is its ability to change and grow, to foster experimentation and innovation. Diversity in networks, content, services, apps, and business models is a feature, not a bug. Regulation necessarily limits this freedom and diversity, making everything more homogeneous and diminishing the possibilities for entrepreneurship and innovation. Congress has given the FCC no authority to regulate the Internet. The FCC invented this job for itself and is now being challenged in court.

Possible ESPN-mobile partnerships are just the latest reminder of why we don’t want government limiting our choices — and all the possibilities — on the Internet.

— Bret Swanson

FCC’s 706 Broadband Report Does Not Compute

Yesterday the Federal Communications Commission issued 181 pages of metrics demonstrating, to any fair reader, the continuing rapid rise of the U.S. broadband economy — and then concluded, naturally, that “broadband is not yet being deployed to all Americans in a reasonable and timely fashion.” A computer, being fed the data and the conclusion, would, unable to process the logical contradictions, crash.

The report is a response to section 706(b) of the 1996 Telecom Act that asks the FCC to report annually whether broadband “is being deployed . . . in a reasonable and timely fashion.” From 1999 to 2008, the FCC concluded that yes, it was. But now, as more Americans than ever have broadband and use it to an often maniacal extent, the FCC has concluded for the third year in a row that no, broadband deployment is not “reasonable and timely.”

The FCC finds that 19 million Americans, mostly in very rural areas, don’t have access to fixed-line terrestrial broadband. But Congress specifically asked the FCC to analyze broadband deployment using “any technology.”

“Any technology” includes DSL, cable modems, fiber-to-the-x, satellite, and of course fixed wireless and mobile. If we include wireless broadband, the unserved number falls to 5.5 million from the FCC’s headline 19 million. Five and a half million is 1.74% of the U.S. population. Not exactly a headline-grabbing figure.

Even if we stipulate the FCC’s framework, data, and analysis, we’re still left with the FCC’s own admission that between June 2010 and June 2011, an additional 7.4 million Americans gained access to fixed broadband service. That dropped the portion of Americans without access to 6% in 2011 from around 8.55% in 2010 — a 30% drop in the unserved population in one year. Most Americans have had broadband for many years, and the rate of deployment will necessarily slow toward the tail-end of any build-out. When most American households are served, there just aren’t very many left to go, and those that have yet to gain access are likely to be in the most difficult-to-serve areas (e.g., “on tops of mountains in the middle of nowhere”). The fact that we still added broadband access for 7.4 million Americans in the last year, lowering the unserved population by 30%, even using the FCC’s faulty framework, demonstrates in any rational world that broadband “is being deployed” in a “reasonable and timely fashion.”
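The percentages above can be reproduced directly from the report’s own figures. The U.S. population number below is an approximation for the period, used only to check the arithmetic:

```python
# Unserved-population arithmetic from the FCC's own figures.
us_population_m = 316.0    # approximate U.S. population, in millions
unserved_any_tech_m = 5.5  # millions unserved once wireless is counted
print(f"{unserved_any_tech_m / us_population_m:.2%}")  # 1.74%

# Year-over-year change in the share without fixed broadband access.
unserved_2010 = 8.55  # percent of Americans, June 2010
unserved_2011 = 6.0   # percent of Americans, June 2011
drop = (unserved_2010 - unserved_2011) / unserved_2010
print(f"{drop:.0%}")  # 30% one-year drop in the unserved population
```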

But this is not the rational world — it’s D.C. in the perpetual political silly season.

One might conclude that because the vast majority of these unserved Americans live in very rural areas — Alaska, Montana, West Virginia — the FCC would, if anything, suggest policies tailored to boost infrastructure investment in these hard-to-reach geographies. We could debate whether these are sound investments and whether the government would do a good job expanding access, but if rural deployment is a problem, then presumably policy should attempt to target and remediate the rural underserved. Commissioner McDowell, however, knows the real impetus for the FCC’s tortured no-confidence vote — its regulatory agenda.

McDowell notes that the report repeatedly mentions the FCC’s net neutrality rules (now being contested in court), which are as far from a pro-broadband policy, let alone a targeted one, as you could imagine. If anything, net neutrality is an impediment to broader, faster, better broadband. But the FCC is using its thumbs-down on broadband deployment to prop up its intrusions into a healthy industry. As McDowell concluded, “the majority has used this process as an opportunity to create a pretext to justify more regulation.”

Akamai CEO Exposes FCC’s Confused “Paid Priority” Prohibition

In the wake of the FCC’s net neutrality Order, published on December 23, several of us have focused on the Commission’s confused and contradictory treatment of “paid prioritization.” In the Order, the FCC explicitly permits some forms of paid priority on the Internet but strongly discourages other forms.

From the beginning — that is, since the advent of the net neutrality concept early last decade — I argued that a strict neutrality regime would have outlawed, among other important technologies, CDNs, which prioritized traffic and made (make!) the Web video revolution possible.

So I took particular notice of this new interview (sub. required) with Akamai CEO Paul Sagan in the February 2011 issue of MIT’s Technology Review:

TR: You’re making copies of videos and other Web content and distributing them from strategic points, on the fly.

Paul Sagan: Or routes that are picked on the fly, to route around problematic conditions in real time. You could use Boston [as an analogy]. How do you want to cross the Charles to, say, go to Fenway from Cambridge? There are a lot of bridges you can take. The Internet protocol, though, would probably always tell you to take the Mass. Ave. bridge, or the BU Bridge, which is under construction right now and is the wrong answer. But it would just keep trying. The Internet can’t ever figure that out — it doesn’t. And we do.

There it is. Akamai and other content delivery networks (CDNs), including Google, which has built its own CDN-like network, “route around” “the Internet,” which “can’t ever figure . . . out” the fastest path needed for robust packet delivery. And they do so for a price. In other words: paid priority. Content companies, edge innovators, basement bloggers, and poor non-profits who don’t pay don’t get the advantages of CDN fast lanes. (more…)
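The route-around behavior Sagan describes can be sketched in a few lines. This is a hypothetical illustration of the general technique, not Akamai’s actual algorithm: plain IP routing keeps sending traffic down one precomputed “shortest” path, while a CDN overlay probes candidate routes in real time and picks whichever is fastest right now.

```python
# Hypothetical sketch of CDN-style dynamic route selection (not Akamai's code).
# IP routing chooses one path by static metric; a CDN overlay instead measures
# candidate routes continuously and routes around congested links.

def ip_route(routes):
    """Static choice: always the path with the lowest advertised metric."""
    return min(routes, key=lambda r: r["metric"])

def cdn_route(routes, measured_latency_ms):
    """Dynamic choice: the path with the lowest latency measured right now."""
    return min(routes, key=lambda r: measured_latency_ms[r["name"]])

# Sagan's Boston analogy: several bridges across the Charles.
routes = [
    {"name": "mass-ave-bridge", "metric": 1},  # shortest on paper
    {"name": "bu-bridge",       "metric": 2},
    {"name": "longfellow",      "metric": 3},
]

# Real-time probes: the "shortest" path is congested (under construction).
probes = {"mass-ave-bridge": 180.0, "bu-bridge": 250.0, "longfellow": 22.0}

print(ip_route(routes)["name"])           # mass-ave-bridge
print(cdn_route(routes, probes)["name"])  # longfellow
```

The design point is simply that the overlay’s choice changes as conditions change, which is exactly what static destination-based forwarding “can’t ever figure out.”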

Did the FCC order get lots worse in last two weeks?

So, here we are. Today the FCC voted 3-2 to issue new rules governing the Internet. I expect the order to be struck down by the courts and/or Congress. Meantime, a few observations:

  • The order appears to be more intrusive on the topic of “paid prioritization” than was Chairman Genachowski’s outline earlier this month. (Keep in mind, we haven’t seen the text. The FCC Commissioners themselves only got access to the text at 11:42 p.m. last night.)
  • If this is true, and the “nondiscrimination” ban goes further than a simple reasonableness test (which itself would be subject to tumultuous legal wrangling), then the Net Neutrality order could cause more problems than I wrote about in this December 7 column.
  • A prohibition or restriction on “paid prioritization” is a silly rule that belies a deep misunderstanding of how our networks operate today and how they will need to operate tomorrow. Here’s how I described it in recent FCC comments:

In September 2010, a new network company that had operated in stealth mode for the previous 24 months, digging ditches and boring tunnels, emerged on the scene. As Forbes magazine described it, this tiny new company, Spread Networks:

“spent the last two years secretly digging a gopher hole from Chicago to New York, usurping the erstwhile fastest paths. Spread’s one-inch cable is the latest weapon in the technology arms race among Wall Street houses that use algorithms to make lightning-fast trades. Every day these outfits control bigger stakes of the markets – up to 70% now. ‘Anybody pinging both markets has to be on this line, or they’re dead,’ says Jon A. Najarian, cofounder of OptionMonster, which tracks high-frequency trading.

“Spread’s advantage lies in its route, which makes nearly a straight line from a data center in Chicago’s South Loop to a building across the street from Nasdaq’s servers in Carteret, N.J. Older routes largely follow railroad rights-of-way through Indiana, Ohio and Pennsylvania. At 825 miles and 13.3 milliseconds, Spread’s circuit shaves 100 miles and 3 milliseconds off of the previous route of lowest latency, engineer-talk for length of delay.”

Why spend an estimated $300 million on an apparently duplicative route when numerous seemingly similar networks already exist? Because, Spread says, three milliseconds matters.

Spread offers guaranteed latency on its dark fiber product of no more than 13.33 milliseconds. Its managed wave product is guaranteed at no more than 15.75 milliseconds. It says competitors’ routes between Chicago and New York range from 16 to 20 milliseconds. We don’t know if Spread will succeed financially. But Spread is yet another demonstration that latency is of enormous and increasing importance. From entertainment to finance to medicine, the old saw is truer than ever: time is money. It can even mean life or death.
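Those quoted figures are consistent with simple light-in-fiber arithmetic. A rough check (assuming a typical fiber refractive index of about 1.47, so light in glass travels at roughly two-thirds the speed of light in vacuum; real routes add equipment and regeneration overhead):

```python
# Back-of-envelope check of Spread Networks' quoted latencies.
# Illustrative arithmetic only; actual figures include equipment overhead.
C = 299_792_458        # speed of light in vacuum, m/s
N = 1.47               # typical refractive index of optical fiber (assumed)
MILES_TO_M = 1609.344

def fiber_rtt_ms(miles):
    """Round-trip propagation delay over a fiber route of the given length."""
    one_way_s = miles * MILES_TO_M / (C / N)
    return 2 * one_way_s * 1000

print(round(fiber_rtt_ms(825), 1))  # ≈ 13.0 ms, close to the quoted 13.3 ms
print(round(fiber_rtt_ms(925), 1))  # ≈ 14.6 ms for a route ~100 miles longer
```

Propagation delay alone explains why a 100-mile-shorter route buys roughly 3 milliseconds round-trip once the older routes’ extra detours and equipment are counted: at these scales, distance is destiny.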

A policy implication arises. The Spread service is, of course, a form of “paid prioritization.” Companies are paying “eight to 10 times the going rate” to get their bits where they want them, when they want them. It is not only a demonstration of the heroic technical feats required to increase the power and diversity of our networks. It is also a prime example that numerous network users want to and will pay money to achieve better service.

One way to achieve better service is to deploy more capacity on certain links. But capacity is not always the problem. As Spread shows, another way to achieve better service is to build an entirely new 750-mile fiber route through mountains to minimize laser light delay. Or we might deploy a network of server caches that store non-realtime data closer to the end points of networks, as many Content Delivery Networks (CDNs) have done. But when we can’t build a new fiber route or store data – say, when we need to get real-time packets from point to point over the existing network – yet another option might be to route packets more efficiently with sophisticated QoS technologies. Each of these solutions fits a particular situation. They take advantage of, or submit to, the technological and economic trade-offs of the moment or the era. They are all legitimate options. Policy simply must allow for the diversity and flexibility of technical and economic options – including paid prioritization – needed to manage networks and deliver value to end-users.
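The QoS option mentioned above can be sketched minimally as strict-priority packet scheduling, where latency-sensitive traffic is dequeued ahead of bulk traffic sharing the same link. This is an illustration of the general technique, not any particular vendor’s implementation (real QoS schemes such as DiffServ are considerably richer):

```python
import heapq

# Minimal strict-priority link scheduler: lower priority number = sent first.
class PriorityLink:
    def __init__(self):
        self._queue = []
        self._seq = 0  # tie-breaker preserving FIFO order within a class

    def enqueue(self, packet, priority):
        heapq.heappush(self._queue, (priority, self._seq, packet))
        self._seq += 1

    def transmit(self):
        """Send the highest-priority packet waiting on the link."""
        _, _, packet = heapq.heappop(self._queue)
        return packet

link = PriorityLink()
link.enqueue("bulk-backup-chunk", priority=2)
link.enqueue("voip-frame", priority=0)      # real-time traffic goes first
link.enqueue("video-segment", priority=1)

print(link.transmit())  # voip-frame
print(link.transmit())  # video-segment
print(link.transmit())  # bulk-backup-chunk
```

The point for policy is that this kind of scheduling is an ordinary engineering tool for delivering real-time service over a shared link, not inherently an act of discrimination against anyone’s content.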

Depending on how far the FCC is willing to take these misguided restrictions, it could actually lead to the very outcomes most reviled by “open Internet” fanatics — that is, more industry concentration, more “walled gardens,” more closed networks. Here’s how I described the possible effect of restrictions on the important voluntary network management tools and business partnerships needed to deliver robust multimedia services:

There has also been discussion of an exemption for “specialized services.” Like wireless, it is important that such specialized services avoid the possible innovation-sapping effects of a Net Neutrality regulatory regime. But the Commission should consider several unintended consequences of moving down the path of explicitly defining, and then exempting, particular “specialized” services while choosing to regulate the so-called “basic,” “best-effort,” or “entry level” “open Internet.”

Regulating the “basic” Internet but not “specialized” services will surely push most of the network and application innovation and investment into the unregulated sphere. A “specialized” exemption, although far preferable to a Net Neutrality world without such an exemption, would tend to incentivize both CAS providers and ISPs to target the “specialized” category and thus shrink the scope of the “open Internet.”

In fact, although specialized services should and will exist, they often will interact with or be based on the “basic” Internet. Finding demarcation lines will be difficult if not impossible. In a world of vast overlap, convergence, integration, and modularity, attempting to decide what is and is not “the Internet” is probably futile and counterproductive. The very genius of the Internet is its ability to connect to, absorb, accommodate, and spawn new networks, applications and services. In a great compliment to its virtues, the definition of the Internet is constantly changing. Moreover, a regime of rigid quarantine would not be good for consumers. If a CAS provider or ISP has to build a new physical or logical network, segregate services and software, or develop new products and marketing for a specifically defined “specialized” service, there would be a very large disincentive to develop and offer simple innovations and new services to customers over the regulated “basic” Internet. Perhaps a consumer does not want to spend the extra money to jump to the next tier of specialized service. Perhaps she only wants the service for a specific event or a brief period of time. Perhaps the CAS provider or ISP can far more economically offer a compelling service over the “basic” Internet with just a small technical tweak, where a leap to a full-blown specialized service would require more time and money, and push the service beyond the reach of the consumer. The transactions costs of imposing a “specialized” quarantine would reduce technical and economic flexibility on both CAS providers and ISPs and, most crucially, on consumers.

Or, as we wrote in our previous Reply Comments about a related circumstance, “A prohibition of the voluntary partnerships that are likely to add so much value to all sides of the market – service provider, content creator, and consumer – would incentivize the service provider to close greater portions of its networks to outside content, acquire more content for internal distribution, create more closely held ‘managed services’ that meet the standards of the government’s ‘exclusions,’ and build a new generation of larger, more exclusive ‘walled gardens’ than would otherwise be the case. The result would be to frustrate the objective of the proceeding. The result would be a less open Internet.”

It is thus possible that a policy seeking to maintain some pure notion of a basic “open Internet” could severely devalue the open Internet the Commission is seeking to preserve.

All this said, the FCC’s legal standing is so tenuous, and this order so rooted in reasoning already rejected by the courts, that I believe today’s Net Neutrality rule will be overturned. Thus, despite the numerous substantive and procedural errors committed on this “darkest day of the year,” I still expect the Internet to “survive and thrive.”

The Internet Survives, and Thrives, For Now

See my analysis of the FCC’s new “net neutrality” policy at RealClearMarkets:

Despite the Federal Communications Commission’s “net neutrality” announcement this week, the American Internet economy is likely to survive and thrive. That’s because the new proposal offered by FCC chairman Julius Genachowski is lacking almost all the worst ideas considered over the last few years. No one has warned more persistently than I against the dangers of over-regulating the Internet in the name of “net neutrality.”

In a better world, policy makers would heed my friend Andy Kessler’s advice to shutter the FCC. But back on earth this new compromise should, for the near-term at least, cap Washington’s mischief in the digital realm.

. . .

The Level 3-Comcast clash showed what many of us have said all along: “net neutrality” was a purposely ill-defined catch-all for any grievance in the digital realm. No more. With the FCC offering some definition, however imperfect, businesses will now mostly have to slug it out in a dynamic and tumultuous technology arena, instead of running to the press and politicians.

FCC Proposal Not Terrible. Internet Likely to Survive and Thrive.

The FCC appears to have taken the worst proposals for regulating the Internet off the table. This is good news for an already healthy sector. And given info-tech’s huge share of U.S. investment, it’s good news for the American economy as a whole, which needs all the help it can get.

In a speech this morning, FCC chair Julius Genachowski outlined a proposal he hopes the other commissioners will approve at their December 21 meeting. The proposal, which comes more than a year after the FCC issued its Notice of Proposed Rule Making into “Preserving the Open Internet,” appears mostly to codify the “Four Principles” that were agreed to by all parties five years ago. Namely:

  • No blocking of lawful data, websites, applications, services, or attached devices.
  • Transparency. Consumers should know what the services and policies of their providers are, and what they mean.
  • A prohibition of “unreasonable discrimination,” which essentially means service providers must offer their products at similar rates and terms to similarly situated customers.
  • Importantly, broadband providers can manage their networks and use new technologies to provide fast, robust services. Also, there appears to be even more flexibility for wireless networks, though we don’t yet know the details.

(All the broad-brush concepts outlined today will need closer scrutiny when detailed language is unveiled, and as with every government regulation, implementation and enforcement can always yield unpredictable results. One also must worry about precedent and a new platform for future regulation. Even if today’s proposal isn’t too harmful, does the new framework open a regulatory can of worms?)

So, what appears to be off the table? Most of the worst proposals that have been flying around over the last year, like . . .

  • Reclassification of broadband as an old “telecom service” under Title II of the Communications Act of 1934, which could have pierced the no-government seal on the Internet in a very damaging way, unleashing all kinds of complex and antiquated rules on the modern Net.
  • Price controls.
  • Rigid nondiscrimination rules that would have barred important network technologies and business models.
  • Bans of quality-of-service (QoS) technologies and techniques, tiered pricing, or voluntary relationships between ISPs and content/application/service (CAS) providers.
  • Open access mandates, requiring networks to share their assets.

Many of us have long questioned whether formal government action in this arena is necessary. The Internet ecosystem is healthy. It’s growing and generating an almost dizzying array of new products and services on diverse networks and devices. Communications networks are more open than ever. Facebook on your BlackBerry. Netflix on your iPad. Twitter on your TV. The oft-cited world broadband comparisons, which say the U.S. ranks 15th, or even 26th, are misleading. Those reports mostly measure household size, not broadband health. Using new data from Cisco, we estimate the U.S. generates and consumes more network traffic per user and per capita than any nation but South Korea. (Canada and the U.S. are about equal.) American Internet use is twice that of many nations we are told far outpace the U.S. in broadband. Heavy-handed regulation would have severely depressed investment and innovation in a vibrant industry. All for nothing.

Lots of smart lawyers doubt the FCC has the authority to issue even the relatively modest rules it outlined today. They’re probably right, and the question will no doubt be litigated (yet again), if Congress does not act first. But with Congress now divided politically, the case remains that Mr. Genachowski’s proposal is likely the near-term ceiling on regulation. Policy might get better than today’s proposal, but it’s not likely to get any worse. From what I see today, that’s a win for the Internet, and for the U.S. economy.

— Bret Swanson

One Step Forward, Two Steps Back

The FCC’s apparent about-face on Net Neutrality is really perplexing.

Over the past few weeks it looked like the Administration had acknowledged economic reality (and bipartisan Capitol Hill criticism) and turned its focus to investment and jobs. Outgoing NEC Director Larry Summers and Commerce Secretary Gary Locke announced a vast expansion of available wireless spectrum, and FCC chairman Julius Genachowski used his speech to the NARUC state regulators to encourage innovation and employment. Gone were mentions of the old priorities — intrusive new regulations such as Net Neutrality and Title II reclassification of modern broadband as an old telecom service. Finally, it appeared, an already healthy and vibrant Internet sector could stop worrying about these big new government impositions — and years of likely litigation — and get on with building the 21st century digital infrastructure.

But then came word at the end of last week that the FCC would indeed go ahead with its new Net Neutrality regs. Perhaps even issuing them on December 22, just as Congress and the nation take off for Christmas vacation [the FCC now says it will hold its meeting on December 15]. When even a rare economic sunbeam is quickly clouded by yet more heavy-handedness from Washington, is it any wonder unemployment remains so high and growth so low?

Any number of people sympathetic to the economy’s and the Administration’s plight are trying to help. Last week David Leonhardt of the New York Times pointed the way, at least in a broad strategic sense: “One Way to Trim the Deficit: Cultivate Growth.” Yes, economic growth! Remember that old concept? Economist and innovation expert Michael Mandel has suggested a new concept of “countercyclical regulatory policy.” The idea is to lighten regulatory burdens to boost growth in slow times and then, later, when the economy is moving full-steam ahead, apply more oversight to curb excesses. Right now, we should be lightening burdens, Mandel says, not imposing new ones:

it’s really a dumb move to monkey with the vibrant and growing communications sector when the rest of the economy is so weak. It’s as if you have two cars — one running, one in the repair shop — and you decide it’s a good time to rebuild the transmission of the car that actually works because you hear a few squeaks.

Apparently, FCC honchos met with interested parties this morning to discuss what comes next. Unfortunately, at a time when we need real growth, strong growth, exuberant growth! (as Mandel would say), the Administration appears to be saddling an economy-lifting reform (wireless spectrum expansion) with leaden regulation. What’s the point of new wireless spectrum if you massively devalue it with Net Neutrality, open access, and/or Title II?

One step forward, two steps back (ten steps back?) is not an exuberant growth and jobs strategy.

The Regulatory Threat to Web Video

See our commentary responding to Revision3 CEO Jim Louderback’s calls for Internet regulation.

What we have here is “mission creep.” First, Net Neutrality was about an “open Internet” where no websites were blocked or degraded. But as soon as the whole industry agreed to these perfectly reasonable Open Web principles, Net Neutrality became an exercise in micromanagement of network technologies and broadband business plans. Now, Louderback wants to go even further and regulate prices. But there’s still more! He also wants to regulate the products that broadband providers can offer.

“In the Matter of Preserving the Open Internet”

Here were my comments in the FCC’s Notice of Proposed Rule Making on “Preserving the Open Internet” — better known as “Net Neutrality”:

A Net Neutrality regime will not make the Internet more “open.” The Internet is already very open. More people create and access more content and applications than ever before. And with the existing Four Principles in place, the Internet will remain open. In fact, a Net Neutrality regime could close off large portions of the Internet for many consumers. By intruding in technical infrastructure decisions and discouraging investment, Net Neutrality could decrease network capacity, connectivity, and robustness; it could increase prices; it could slow the cycle of innovation; and thus shut the window to the Web on millions of consumers. Net Neutrality is not about openness. It is far more accurate to say it is about closing off experimentation, innovation, and opportunity.

A Victory For the Free Web

After yesterday’s federal court ruling against the FCC’s overreaching net neutrality regulations, which we have dedicated considerable time and effort to combating for the last seven years, Holman Jenkins says it well:

Hooray. We live in a nation of laws and elected leaders, not a nation of unelected leaders making up rules for the rest of us as they go along, whether in response to besieging lobbyists or the latest bandwagon circling the block hauled by Washington’s permanent “public interest” community.

This was the reassuring message yesterday from the D.C. Circuit Court of Appeals aimed at the Federal Communications Commission. Bottom line: The FCC can abandon its ideological pursuit of the “net neutrality” bogeyman, and get on with making the world safe for the iPad.

The court ruled in considerable detail that there’s no statutory basis for the FCC’s ambition to annex the Internet, which has grown and thrived under nobody’s control.

. . .

So rather than focusing on new excuses to mess with network providers, the FCC should tackle two duties unambiguously before it: Figure out how to liberate the nation’s wireless spectrum (over which it has clear statutory authority) to flow to more market-oriented uses, whether broadband or broadcast, while also making sure taxpayers get adequately paid as the current system of licensed TV and radio spectrum inevitably evolves into something else.

Second: Under its media ownership hat, admit that such regulation, which inhibits the merger of TV stations with each other and with newspapers, is disastrously hindering our nation’s news-reporting resources and brands from reshaping themselves to meet the opportunities and challenges of the digital age. (Willy nilly, this would also help solve the spectrum problem as broadcasters voluntarily redeployed theirs to more profitable uses.)
