Tag Archives: FCC

20 Good Questions

Wyoming wireless operator Brett Glass has 20 questions for the FCC on Net Neutrality. Some examples:

1. I operate a public Internet kiosk which, to protect its security and integrity, has no way for the user to insert or connect storage devices. The FCC’s policy statement says that a provider of Internet service must allow users to run applications of their choice, which presumably includes uploading and downloading. Will I be penalized if I do not allow file uploads and downloads on that machine?

4. I operate a wireless hotspot in my coffeehouse. I block P2P traffic to prevent one user from ruining the experience for my other customers. Do the FCC rules say that I must stop doing this?

6. I am a cellular carrier who offers Internet services to users of cell phones. Due to spectrum limitations, multimedia streaming by more than a few users would consume all of the bandwidth we have available not only for data but also for voice calls. May we restrict these protocols to avoid running out of bandwidth and to avoid disruption to telephone calls (some of which may be E911 calls or other urgent traffic)?

7. I am a wireless ISP operating on unlicensed spectrum. Because the bands are crowded and spectrum is scarce, I must limit each user’s bandwidth and duty cycle. Rather than imposing hard limits or overage charges, I would like to set an implicit limit by prohibiting P2P, with full disclosure that I am doing so. Is this permitted under the FCC’s rules?

14. I am an ISP that accelerates users’ Web browsing by rerouting requests for Web pages to a Web cache (a device which speeds up Web browsing, conceived by the same people who developed the World Wide Web) and then to special Internet connections which are asymmetrical (that is, they have more downstream bandwidth than upstream bandwidth). The result is faster and more economical Web browsing for our users. Will the FCC say that our network “discriminates” by handling Web traffic in this special way to improve users’ experience?

15. We are an ISP that improves the quality of VoIP by prioritizing VoIP packets and sending them through a different Internet connection than other traffic. This technique prevents users from experiencing problems with their telephone conversations and ensures that emergency calls will get through. Is this a violation of the FCC’s rules?

18. We’re an ISP that serves several large law offices as well as other customers. We are thinking of renting a direct  “fast pipe” to a legal research database to shorten the attorneys’ response times when they search the database. Would accelerating just this traffic for the benefit of these customers be considered “discrimination?”

19. We’re a wireless ISP. Most of our customers are connected to us using “point-to-multipoint” radios; that is, the customers’ connections share a single antenna at our end. However, some high volume customers ask to buy dedicated point-to-point connections to get better performance. Do these connections, which are engineered by virtually all wireless ISPs for high bandwidth customers, run afoul of the FCC’s rules against “discrimination?”
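Several of these questions (6, 7, and 19) turn on per-user bandwidth limits. The kind of cap question 7 alludes to is commonly implemented as a token bucket. Here is a minimal sketch; the rates, the fake clock, and the class name are illustrative assumptions, not anything from Glass's filing:

```python
import time

class TokenBucket:
    """Toy per-user rate limiter of the kind a WISP might use to cap bandwidth.

    `rate` is tokens (bytes) refilled per second; `capacity` bounds bursts.
    Purely illustrative -- real limiters run in the kernel or on the radio.
    """
    def __init__(self, rate, capacity, now=time.monotonic):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.now = now
        self.last = now()

    def allow(self, nbytes):
        # Refill tokens for the time elapsed since the last check.
        t = self.now()
        self.tokens = min(self.capacity, self.tokens + (t - self.last) * self.rate)
        self.last = t
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False

# Deterministic demo with a fake clock: 1,000 bytes/s, 2,000-byte bursts.
clock = iter([0.0, 0.0, 0.0, 1.0])
bucket = TokenBucket(rate=1000, capacity=2000, now=lambda: next(clock))
results = [bucket.allow(1500),   # burst fits in the full bucket
           bucket.allow(1500),   # rejected: only 500 tokens left
           bucket.allow(1400)]   # allowed: one second later, 1,500 tokens
print(results)  # [True, False, True]
```

Production systems enforce this in the network stack (e.g., Linux traffic control) rather than in application code; the sketch only shows the accounting that makes a "soft" cap possible without hard cutoffs or overage charges.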

Common Sense of Amazonian Proportions

Amazon’s Paul Misener gets all reasonable in his comments on the FCC’s proposed net neutrality rules:

With this win-win-win goal in mind, and consistent with the principle of maintaining an open Internet, Amazon respectfully suggests that the FCC’s proposed rules be extended to allow broadband Internet access service providers to favor some content so long as no harm is done to other content.

Importantly, we note that the Internet has long been interconnected with private networks and edge caches that enhance the performance of some Internet content in comparison with other Internet content, and that these performance improvements are paid for by some but not all providers of content.  The reason why these arrangements are acceptable from a public policy perspective is simple:  the performance of other content is not disfavored, i.e., other content is not harmed.

Reading 15,000 documents so you don’t have to

For those of you not wishing to sift through 15,000 comments submitted to the FCC for its Net Neutrality proposed rule making, let me recommend what — so far — is the best technical filing I’ve read. It comes from Richard Bennett and Rob Atkinson of the Information Technology and Innovation Foundation.

Also very useful is a new post by George Ou on content delivery and paid peering, with important policy implications.

These are among the least discussed — but most important — items in the whole Net Neutrality debate.

Separately, from the FCC’s “Open Internet” meeting at MIT last week, see summaries of each panelist’s remarks: Opening Presentations, Panel 1, Panel 2.

Berkman’s Broadband Bungle

Professors at a leading research unit put suspect data into a bad model, fail to include crucial variables, and even manufacture the most central variable to deliver the hoped-for outcome.

Climate-gate? No, call it Berkman’s broadband bungle.

In October, Harvard’s Berkman Center for Internet & Society delivered a report, commissioned by the Federal Communications Commission, comparing international broadband markets and policies. The report was to be a central component of the Administration’s new national broadband Internet policy, arriving in February 2010.

Just one problem. Actually many problems. The report botched its chief statistical model in half a dozen ways. It used loads of questionable data. It didn’t account for the unique market structure of U.S. broadband. It reversed the arrow of time in its country case studies. It ignored the high-profile history of open access regulation in the U.S. It didn’t conduct the literature review the FCC asked for. It excommunicated Switzerland . . . .

See my critique of this big report on international broadband at RealClearMarkets.

Preparing to Pounce: D.C. angles for another industry

As you’ve no doubt heard, Washington D.C. is angling for a takeover of the . . . U.S. telecom industry?!

That’s right: broadband, routers, switches, data centers, software apps, Web video, mobile phones, the Internet. As if its agenda weren’t full enough, the government is preparing a dramatic centralization of authority over our healthiest, most dynamic, high-growth industry.

Two weeks ago, FCC chairman Julius Genachowski proposed new “net neutrality” regulations, which he will detail on October 22. Then on Friday, Yochai Benkler of Harvard’s Berkman Center published an FCC-commissioned report on international broadband comparisons. The voluminous survey serves up data from around the world on broadband penetration rates, speeds, and prices. But the real purpose of the report is to make a single point: foreign “open access” broadband regulation, good; American broadband competition, bad. These two tracks — “net neutrality” and “open access,” combined with a review of the U.S. wireless industry and other investigations — lead straight to an unprecedented government intrusion into America’s vibrant Internet industry.

Benkler and his team of investigators can be commended for the effort that went into what was no doubt a substantial undertaking. The report, however,

  • misses all kinds of important distinctions among national broadband markets, histories, and evolutions;
  • uses lots of suspect data;
  • underplays caveats and ignores some important statistical problems;
  • focuses too much on some metrics, not enough on others;
  • completely bungles America’s own broadband policy history; and
  • draws broad and overly certain policy conclusions about a still-young, dynamic, complex Internet ecosystem.

The gaping, jaw-dropping irony of the report was its failure even to mention the chief outcome of America’s previous open-access regime: the telecom/tech crash of 2000-02. We tried this before. And it didn’t work! The Great Telecom Crash of 2000-02 was to that industry what the Great Panic of 2008 was to the financial industry. A deeply painful and historic plunge. In the Great Telecom Crash, U.S. tech and telecom companies lost some $3 trillion in market value and one million jobs. The harsh open access policies (mandated network sharing, price controls) that Benkler lauds in his new report were a main culprit. But in Benkler’s 231-page report on open access policies, there is no mention of the Great Crash.

Neutrality for thee, but not for me

In Monday’s Wall Street Journal, I address the once-again raging topic of “net neutrality” regulation of the Web. On September 21, new FCC chair Julius Genachowski proposed more formal neutrality regulations. Then on September 25, AT&T accused Google of violating the very neutrality rules the search company has sought for others. The gist of the complaint was that the new Google Voice service does not connect all phone calls the way other phone companies are required to do. Not an earthshaking matter in itself, but a good example of the perils of neutrality regulation.

As the Journal wrote in its own editorial on Saturday:

Our own view is that the rules requiring traditional phone companies to connect these calls should be scrapped for everyone rather than extended to Google. In today’s telecom marketplace, where the overwhelming majority of phone customers have multiple carriers to choose from, these regulations are obsolete. But Google has set itself up for this political blowback.

Last week FCC Chairman Julius Genachowski proposed new rules for regulating Internet operators and gave assurances that “this is not about government regulation of the Internet.” But this dispute highlights the regulatory creep that net neutrality mandates make inevitable. Content providers like Google want to dabble in the phone business, while the phone companies want to sell services and applications.

The coming convergence will make it increasingly difficult to distinguish among providers of broadband pipes, network services and applications. Once net neutrality is unleashed, it’s hard to see how anything connected with the Internet will be safe from regulation.

Several years ago, all sides agreed to broad principles that prohibit blocking Web sites or applications. But I have argued that more detailed and formal regulations governing such a dynamic arena of technology and changing business models would stifle innovation.

Broadband to the home, office, and to a growing array of diverse mobile devices has been a rare bright spot in this dismal economy. Since net neutrality regulation was first proposed in early 2004, consumer bandwidth per capita in the U.S. grew to 3 megabits per second from just 262 kilobits per second, and monthly U.S. Internet traffic increased to two billion gigabytes from 170 million gigabytes — both 10-fold leaps. New wired and wireless innovations and services are booming.

All without net neutrality regulation.

The proposed FCC regulations could go well beyond the existing (and uncontroversial) non-blocking principles. A new “Fifth Principle,” if codified, could prohibit “discrimination” not just among applications and services but even at the level of data packets traversing the Net. But traffic management of packets is used across the Web to ensure robust service and security.

As network traffic, content, and outlets proliferate and diversify, Washington wants to apply rigid, top-down rules. But the network requirements of email and high-definition video are very different. Real-time video conferencing requires more network rigor than stored content like YouTube videos. Wireless traffic patterns are more unpredictable than those of residential networks because cellphone users are, well, mobile. And the next generation of video cloud computing — what I call the exacloud — will impose the most severe constraints yet on network capacity and packet delay.
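The differing "network rigor" these traffic classes need is typically met with class-based queuing: latency-sensitive packets are dequeued ahead of bulk transfers. A toy sketch of the idea in Python — the class names and priority numbers here are illustrative assumptions, not real DSCP code points or any carrier's actual scheduler:

```python
import heapq
from itertools import count

# Lower number = higher priority. Real-time classes (voice, conferencing)
# go ahead of streaming and bulk transfers.
PRIORITY = {"voip": 0, "conferencing": 1, "streaming": 2, "bulk": 3}

class Scheduler:
    """Minimal strict-priority packet scheduler."""
    def __init__(self):
        self._queue = []
        self._seq = count()  # preserves FIFO order within a class

    def enqueue(self, traffic_class, payload):
        heapq.heappush(self._queue,
                       (PRIORITY[traffic_class], next(self._seq), payload))

    def dequeue(self):
        _, _, payload = heapq.heappop(self._queue)
        return payload

sched = Scheduler()
sched.enqueue("bulk", "youtube-chunk-1")
sched.enqueue("voip", "call-frame-1")
sched.enqueue("streaming", "movie-chunk-1")
sched.enqueue("voip", "call-frame-2")

order = [sched.dequeue() for _ in range(4)]
print(order)
# ['call-frame-1', 'call-frame-2', 'movie-chunk-1', 'youtube-chunk-1']
```

The point of the sketch is the policy question underneath it: a strict non-discrimination rule at the packet level would treat exactly this kind of reordering — the thing that keeps a phone call intelligible while a download runs — as suspect.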

Or if you think entertainment unimportant, consider the implications for cybersecurity. The very network technologies that ensure a rich video experience are used to kill dangerous “botnets” and combat cybercrime.

And what about low-income consumers? If network service providers can’t partner with content companies, offer value-added services, or charge high-end users more money for consuming more bandwidth, low-end consumers will be forced to pay higher prices. Net neutrality would thus frustrate the Administration’s goal of 100% broadband.

Health care, energy, jobs, debt, and economic growth are rightly earning most of the policy attention these days. But regulation of the Net would undermine the key global platform that underlies better performance in each of these crucial areas. Washington may be bailing out every industry that doesn’t work, but that’s no reason to add new constraints to one that manifestly does.

— Bret Swanson

Does Google Voice violate neutrality?

This is the ironic but very legitimate question AT&T is asking.

As Adam Thierer writes,

Whatever you think about this messy dispute between AT&T and Google about how to classify web-based telephony apps for regulatory purposes — in this case, Google Voice — the key issue not to lose sight of here is that we are inching ever closer to FCC regulation of web-based apps!  Again, this is the point we have stressed here again and again and again and again when opposing Net neutrality mandates: If you open the door to regulation on one layer of the Net, you open up the door to the eventual regulation of all layers of the Net.

George Gilder and I made this point in Senate testimony five and a half years ago. Advocates of big new regulations on the Internet should be careful what they wish for.

End-to-end? Or end to innovation?

In what is sure to be a substantial contribution to both the technical and policy debates over Net Neutrality, Richard Bennett of the Information Technology and Innovation Foundation has written a terrific piece of technology history and forward-looking analysis. In “Designed for Change: End-to-End Arguments, Internet Innovation, and the Net Neutrality Debate,” Bennett concludes:

Arguments for freezing the Internet into a simplistic regulatory straitjacket often have a distinctly emotional character that frequently borders on manipulation.

The Internet is a wonderful system. It represents a new standard of global cooperation and enables forms of interaction never before possible. Thanks to the Internet, societies around the world reap the benefits of access to information, opportunities for collaboration, and modes of communication that weren’t conceivable to the public a few years ago. It’s such a wonderful system that we have to strive very hard not to make it into a fetish object, imbued with magical powers and beyond the realm of dispassionate analysis, criticism, and improvement.

At the end of the day, the Internet is simply a machine. It was built the way it was largely by a series of accidents, and it could easily have evolved along completely different lines with no loss of value to the public. Instead of separating TCP from IP in the way that they did, the academics in Palo Alto who adapted the CYCLADES architecture to the ARPANET infrastructure could have taken a different tack: They could have left them combined as a single architectural unit providing different retransmission policies (a reliable TCP-like policy and an unreliable UDP-like policy) or they could have chosen a different protocol such as Watson’s Delta-t or Pouzin’s CYCLADES TS. Had the academics gone in either of these directions, we could still have a World Wide Web and all the social networks it enables, perhaps with greater resiliency.

The glue that holds the Internet together is not any particular protocol or software implementation: first and foremost, it’s the agreements between operators of Autonomous Systems to meet and share packets at Internet Exchange Centers and their willingness to work together. These agreements are slowly evolving from a blanket pact to cross boundaries with no particular regard for QoS into a richer system that may someday preserve delivery requirements on a large scale. Such agreements are entirely consistent with the structure of the IP packet, the needs of new applications, user empowerment, and “tussle.”

The Internet’s fundamental vibrancy is the sandbox created by the designers of the first datagram networks that permitted network service enhancements to be built and tested without destabilizing the network or exposing it to unnecessary hazards. We don’t fully utilize the potential of the network to rise to new challenges if we confine innovations to the sandbox instead of moving them to the parts of the network infrastructure where they can do the most good once they’re proven. The real meaning of end-to-end lies in the dynamism it bestows on the Internet by supporting innovation not just in applications but in fundamental network services. The Internet was designed for continual improvement: There is no reason not to continue down that path.

Leviathan Spam

Send the bits with lasers and chips
See the bytes with LED lights

Wireless, optical, bandwidth boom
A flood of info, a global zoom

Now comes Lessig
Now comes Wu
To tell us what we cannot do

The Net, they say,
Is under attack
Before we can’t turn back

They know best
These coder kings
So they prohibit a billion things

What is on their list of don’ts?
Most everything we need the most

To make the Web work
We parse and label
We tag the bits to keep the Net stable

The cloud is not magic
It’s routers and switches
It takes a machine to move exadigits

Now Lessig tells us to route is illegal
To manage Net traffic, Wu’s ultimate evil

A New Leash on the Net?

Today, FCC chairman Julius Genachowski proposed new regulations on communications networks. We were among the very first opponents of these so-called “net neutrality” rules when they were first proposed in concept back in 2004. Here are a number of our relevant articles over the past few years:

Info-tech = recovery

In testimony before Congress’s Joint Economic Committee today, Fed chairman Ben Bernanke noted that

In contrast to the somewhat better news in the household sector, the available indicators of business investment remain extremely weak.

But it is these key business sectors that are most important for a U.S. — and global — economic recovery. As important as stabilization of the housing sector is, we are not going to be led out of the recession by another housing boom. Nor should we desire that. We need real productivity-enhancing innovation, which is largely enabled by non-real estate investment and entrepreneurship.

Among the myriad policy actions being taken in Washington this year is a potential overhaul of our communications strategy, under the aegis of the FCC’s new Broadband “Notice of Inquiry.” The first goal of this plan should be to encourage continued investment in leading-edge information technologies. Broadband in particular makes businesses in every sector more productive, and it connects an ever larger number of citizens — especially those who may be struggling the most in this tough economy — to the wider world, improving their prospects for education, health, and new jobs in emerging industries.

Information and communications technology (ICT) accounts for an astounding 43% of non-structure U.S. capital investment, totaling $455 billion in 2008. In this new FCC communications policy review, we should do everything possible to keep this huge source of American growth rolling. Any policy obstacles thrown into the path of our information industries would not only reduce this crucial component of absolute capital investment, which is already under strain, but also diminish and delay all the positive cascading follow-on effects of a more networked workforce and world.
