A Third Way on Network Neutrality


It is not often that a relatively technical telecommunications policy issue receives as much attention as “network neutrality.” The central question is whether broadband network providers — for example, cable and telephone companies — can prioritize the data they transmit to give an advantage to the most important or most profitable traffic. Traditionally, consumer Internet service has been largely wide open, with no preference given to one kind of traffic over another; this is called “best-efforts” service. The net neutrality movement is an effort to preserve this open system, through government regulation, in the broadband age and beyond.

To its supporters, net neutrality is a way of protecting innovation by ensuring that all Internet traffic is treated equally. To its opponents, it is a threat to innovation because it inhibits network providers who believe that the capital raised by charging for “tiered service” would enable major improvements in broadband infrastructure. In reality, both sides are partially right, even as they portray one another as misguided or pernicious. But there is another position — a “third way,” so to speak — that will enable the development of enhanced networks while at the same time ensuring a robust, open, best-efforts Internet.

Today’s Internet blossomed in an age of narrowband, dial-up connections and an “end-to-end” open architecture. This architecture allowed all application developers to make their innovations available to the world by placing software on a publicly accessible server. It enabled companies like Google and eBay to come out of nowhere — a garage, if you will — to find profits and success, and to contribute greatly to the Internet economy.

But while this “neutral” Internet promises an open platform for garage innovators, it also represents a dubious platform for deploying applications that require assurances about the quality of service — for example, a two-way video application, such as telemedicine, which needs high speeds and low latency (delay). Moreover, a completely open architecture is vulnerable to threats such as viruses and denial-of-service attacks. To address these issues, broadband providers want to offer upgraded, enhanced networks for applications that would fail to perform effectively if offered via the ordinary, wide-open Internet. In essence, by tagging content, or by hosting it on their own dedicated servers, broadband providers would ensure that their own data packets, or those from companies paying for this service, get preferential treatment and reach subscribers faster than content delivered over the best-efforts Internet.
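
To make the mechanics of “tagging” concrete, the sketch below shows one widely used technique, Differentiated Services (DiffServ) marking, in which the sender sets a code point in each packet’s IP header so that routers along the path can place the packet in a higher-priority queue. This is offered only as a generic illustration of the technique, not as a description of any particular provider’s system; the destination address and port are placeholders, and whether the marking is honored is always up to the network.

    # A minimal sketch of packet-level prioritization via DiffServ (DSCP) marking.
    # Works on most Unix-like platforms; the address and port below are placeholders.
    import socket

    # DSCP "Expedited Forwarding" (value 46) is conventionally used for
    # latency-sensitive traffic such as voice or two-way video. The DSCP value
    # occupies the upper six bits of the IP ToS byte, hence the shift by two.
    DSCP_EF = 46
    TOS_VALUE = DSCP_EF << 2  # 0xB8

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

    # Packets sent on this socket now carry the EF marking; a router configured
    # to honor it can forward them ahead of unmarked, best-efforts traffic.
    sock.sendto(b"latency-sensitive payload", ("192.0.2.10", 5004))

Whether such markings are honored, and whether honoring them for a fee should be permitted, is precisely what the tiering debate described below is about.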

Until last year, some broadband network providers were required to treat broadband traffic on a “common carrier” (i.e., non-discriminatory) basis. But after the Supreme Court, in the Brand X decision, upheld the Federal Communications Commission’s (FCC) classification of cable broadband service as an “information service” rather than a “telecommunications service” under the Telecommunications Act, and after the FCC ruled that DSL broadband service provided by telephone companies also fell into this lightly regulated category, broadband network providers were freed to move away from open and neutral networks toward networks that provide enhanced services for an additional fee.

In recent months, network neutrality has become a hotly contested and emotional policy debate. On one side are strident proponents of regulation, who claim that any violation of the best-efforts principle endangers the Internet. For Stanford Law Professor Lawrence Lessig, for example, the advent of enhanced Internet networks and the demise of end-to-end open architecture portend a permanent shift in the character of the Internet, a major blow to the kind of freedom and innovation that have characterized it to date. On the other side are those who focus on the Internet’s unregulated history and fear that any regulation will stifle its further development. For example, tech guru George Gilder has said a “net neutrality measure would just put a stop” to new investment in network infrastructure.

Within the business community, many information technology companies, particularly those providing applications and services (including Google, Yahoo, Amazon, and eBay), strongly support net neutrality regulation that would preclude network operators from offering enhanced networks. In contrast, telecommunications companies (including the Bells and cable companies) strongly oppose net neutrality regulation; they speak instead of the smarter “Internet of the Future.”

Both sides in the debate have advocacy and public policy groups supporting their positions. Both sides have major editorial pages in their corner (with the New York Times pulling for net neutrality, the Washington Post against it). Both sides have snazzy websites (like SaveTheInternet.com and HandsOff.org). Politically, most liberals seem to support net neutrality regulation, while conservatives are split — with the Christian Coalition and the American Conservative Union, for example, pitted against each other. There are divisions among libertarians, too, with passionate bloggers and activist groups on both sides of the debate.

While there is some political crossover in Congress, the issue has largely played out along party lines, with many Republicans opposing net neutrality regulation and many Democrats supporting it. Each side hurls epithets at the other, claiming that the other side’s position would destroy the Internet as we know it. It is time to find a better, bipartisan way forward.

The Contours of the Debate

Properly understood, there are three distinct issues at the heart of the network neutrality debate: transparency, blocking, and tiering.

Transparency. This issue relates to how clearly broadband providers state the policies that govern the uses of their networks. To date, this concern has not received much attention, but it is likely to grow in importance as broadband networks become more differentiated and adopt increasingly varied usage policies.

Blocking. This issue concerns whether broadband providers can block or degrade consumer access to certain applications and content. When these concerns first materialized, then-FCC Chairman Michael Powell set forth the concept of “Internet freedom,” calling on all providers to allow access to applications and devices that did not harm the network. Over time, most (if not all) of the major broadband providers, including Verizon and AT&T, and the major telecom and cable trade associations (the U.S. Telecommunications Association and the National Cable Television Association) publicly committed not to degrade or block Internet traffic. Subsequently, the FCC adopted a slightly revised version of these freedoms in a major policy statement on broadband in 2005. Today, most policy observers agree that any effort to block or degrade traffic — unless justified by a legitimate business purpose (such as protecting the network) — should be illegal. Moreover, the FCC arguably already has authority to address such practices — as it did when it halted the blocking of Vonage’s Voice over Internet Protocol (VoIP) service by Madison River Communications, a rural telephone company.

Tiering. The aspect of network neutrality that currently attracts the lion’s share of attention is the question of tiering — that is, whether broadband providers should have the right to charge application and content providers higher fees for a higher quality of network service, and whether they can provide higher quality of service guarantees for their own applications than for rival ones. As all parties in this debate agree, broadband operators should be able to charge consumers for different levels of broadband service. The controversy over “tiering” is thus whether broadband operators should be able to charge application and content providers different rates for different levels of service — like charging higher tolls to ride on faster lanes.

In each of these three areas, the polarized state of the network neutrality debate leads each side to dismiss the other’s reasonable concerns and obscures the contours of a sensible solution. To find a better way forward, we need first to weigh the factual and economic claims each side makes.

The Proponents’ View

To listen to some of the more strident proponents of net neutrality, any violation of the best-efforts principle is sacrilege. For example, the liberal group MoveOn.org warns that “Internet freedom is under attack as Congress pushes a law that would give companies like AT&T the power to control what you do online.”

The concerns articulated by proponents of net neutrality touch on an important issue: an Internet where an innovator has to ask permission (and pay potentially significant fees) before deploying a new technology threatens the Internet’s golden goose of allowing innovation over an open platform. As the CEOs of several major Internet and information technology companies, including Google, Microsoft and Intel, put it, “innovation without permission” represents “the essence of the Internet.”

But proponents of net neutrality also overlook a series of important concerns and unintended consequences that could flow from mandating a single-tiered Internet. For starters, consider the fact that investment in broadband networks is an extraordinarily expensive undertaking. The network providers can, and in our view should, continue to be allowed to recoup their investment, not only by charging their customers for different levels of service, but also by finding other opportunities to earn revenue from providers of broadband-intensive applications.

The desire of the network operators to find new revenue opportunities can be explained by the concept of price discrimination. For any company to invest a significant amount of money in a fixed-cost asset (such as building a movie theater, developing a blockbuster drug, or deploying a broadband network), there needs to be an expected payoff; charging different customers different prices, according to how much they value the service, is often the most practical way to recover such large fixed costs. Price discrimination gets a bad name in part because it sounds sinister (as does anything with “discrimination” in the name). Indeed, former Clinton administration Secretary of Labor Robert Reich argues that while discriminatory pricing on the Internet may be “efficient, it’s not democratic.” This argument, however, would also justify bans on all sorts of price discrimination arrangements, such as the airlines’ practice of providing first- and business-class service to customers who are willing to pay more. In the context of net neutrality, discriminatory pricing has gotten a bad name in part because network owners have failed to describe their pursuit of new revenue opportunities in consumer-friendly terms. A case in point is how AT&T CEO Ed Whitacre (then CEO of SBC) described his view of the applications that travel over his company’s broadband network:

Now what [Google and other Internet content providers] would like to do is use my pipes free, but I ain’t going to let them do that because we have spent this capital and we have to have a return on it. So there’s going to have to be some mechanism for these people who use these pipes to pay for the portion they’re using. Why should they be allowed to use my pipes?

The trouble with Whitacre’s rationale for charging application providers is not just that it is bad public relations (this statement alone helped to fuel regulatory concerns) but that it is both wrong and bad business. The notion that Google and other application providers receive a “free ride” mischaracterizes how the Internet works. Google and other Internet companies pay fees to upload their data onto the public Internet; they are no more free-riding than a driver who drives on a public road and pays gas taxes. Likewise, broadband customers pay fees to access Google and other Internet applications. (If anyone is free-riding on AT&T’s networks, it is the relatively small number of bandwidth hogs, like those who share or download large media files, who account for a large share of bandwidth consumed without paying extra to support their extra network use.) Moreover, without the Googles of the world — who make broadband networks more valuable by enhancing their functionality — the AT&Ts of the world would have to charge less for broadband Internet access.

In a more sensible (and tactful) move, Richard Notebaert, Qwest’s CEO, recently explained that he views Google and Amazon as valued customers whose applications enhance the value of Qwest’s DSL offering to consumers. He proceeded to explain that Qwest should also be able to offer premium services, for additional fees, that guarantee certain levels of service — much as FedEx offers L.L. Bean expedited service for holiday shipping. To date, few such deals have been announced, but one can readily imagine win-win deals where a video applications provider contracts for guaranteed delivery speeds (say, 5 megabits per second) to all broadband customers — even if a particular broadband subscriber only pays for a lower level of bandwidth for best-efforts Internet access (say, 512 kilobits per second). Yet to the net neutrality crowd, even such win-win deals threaten to introduce a new online oligarchy.
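
The arithmetic behind such a deal helps explain why it can be attractive to both sides. The sketch below compares delivery times at the two speeds mentioned above, using an assumed 100-megabyte file chosen purely for illustration.

    # Rough delivery-time comparison at the two speeds discussed above.
    # The 100 MB file size is an assumption chosen for illustration.
    FILE_SIZE_MBIT = 100 * 8            # 100 megabytes expressed in megabits

    best_efforts_mbps = 0.512           # 512 kilobits per second
    guaranteed_mbps = 5.0               # 5 megabits per second

    t_best_efforts = FILE_SIZE_MBIT / best_efforts_mbps   # about 1,560 seconds
    t_guaranteed = FILE_SIZE_MBIT / guaranteed_mbps        # 160 seconds

    print(f"Best-efforts (512 kbps): {t_best_efforts / 60:.0f} minutes")   # ~26 minutes
    print(f"Guaranteed (5 Mbps):     {t_guaranteed / 60:.1f} minutes")     # ~2.7 minutes

The point is not the particular numbers but the gap between them: the premium tier delivers in a few minutes what the best-efforts tier delivers in the better part of half an hour.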

The Opponents’ View

Opponents of net neutrality focus on the Internet’s historically unregulated nature; they argue that regulation will strangle its development and prevent the super-fast “Internet of the Future” from taking shape. They defend their position by pointing out that in many markets, companies offer tiers of service differentiated by price. As Randy May of the Progress and Freedom Foundation notes, a company that wants to deliver physical content to a customer can use the lower-cost “best-efforts” option of U.S. Postal Service (USPS) first-class mail, or it can pay more to use USPS “Express Mail” or a host of private shippers like UPS and FedEx. Unlike the current broadband market, however, prices for “best-efforts” mail service are regulated (i.e., the 39-cent stamp), while the market for express delivery services is relatively unregulated and competitive.

Opponents of any network neutrality regulation often maintain that competition between broadband providers is a sufficient check on the possibility of anticompetitive conduct. Unfortunately, the current reality of the broadband market is that in most local markets there are only two principal competitors — the incumbent telephone companies (with their DSL offering) and the incumbent cable companies (with their cable modem offering). Indeed, for the foreseeable future, the so-called “last mile” of broadband service will remain, for most consumers and in most places, at best a duopoly, and in some places a monopoly. To be sure, the FCC reports that 75 percent of zip codes have three or more broadband providers. However, the inclusion of satellite broadband services in this measure skews the actual competitiveness of the market, as satellite is generally not a viable substitute for DSL or cable modem service because of higher prices and slower speeds. Consequently, the reality is that most Americans have a choice between only two (or fewer) providers of broadband service.

For some critics of network neutrality regulation, the intensity of the current competition between cable and DSL providers compensates for the fact that the broadband market is a duopoly. And indeed, with only about 35 percent of all households currently subscribing to broadband, cable and telephone companies are vigorously seeking to attract new customers. But once the vast majority of households have adopted broadband, a market with only two dominant providers could easily become a market where the providers are able to exercise their market power in ways that threaten Internet innovation.

Another argument invoked by network neutrality critics is that broadband competition, even if not truly here yet, is certainly going to emerge. That may be so, but it is critical to acknowledge that it is far from clear when, or even if, a “third broadband pipe” (such as wireless, satellite, or broadband-over-power-lines) will emerge as an effective competitor to cable and phone companies. Notably, even under the best of circumstances, it will not be easy for any such provider to emerge and deploy the expensive, essentially duplicative networks necessary to compete with the entrenched incumbents, particularly when some customers will be reluctant, in the face of significant costs and hassles associated with switching broadband providers, to move from an established incumbent to a new entrant. For wireless broadband providers in particular, the circumstances are far from ideal, as spectrum policy continues to restrict the available spectrum that can be used by would-be wireless broadband providers.

Opponents of net neutrality regulation rightly point out that broadband providers — Ed Whitacre’s rhetoric aside — benefit from lots of applications that ride on their networks and therefore have no incentive to block or treat them unfairly. Those who make this argument must recognize, however, that there are exceptions to this general principle, such as when a company’s revenue stream can be endangered by some of the applications it allows. In the case of Internet telephony, to return to an example mentioned earlier, Madison River Communications resorted to the extreme tactic of blocking Vonage’s VoIP service. For Madison River, its interest in protecting its own existing voice-based revenues overrode its interest in providing a more valuable broadband service. Going forward, as Internet-based video options take off, it is quite possible that cable providers (and telephone companies offering video services) may face similar incentives to restrict video-over-Internet offerings. Consequently, with a limited level of competition and a plausible risk of market power abuses, the case for regulatory oversight cannot be categorically dismissed.

Looking beyond the Madison River case, the anticompetitive tactics that incumbent providers might use are not limited to the ability to block competitive applications, but also include other means of placing rival services at a distinct disadvantage. In particular, incumbent broadband providers are likely to face the temptation to invest resources into a bigger, “pay-to-play” pipe — that is, new infrastructure with more bandwidth and higher use charges — while keeping (or diminishing) their existing best-effort networks at a level that would make many voice or video Internet services a low-quality offering. This would allow incumbents to protect their core businesses (video for cable companies, voice for phone companies) from Internet competition. It would also potentially give broadband providers an incentive to confine the open, best-efforts Internet to a dirt side road while the new, more robust, pay-to-play system becomes the long-sought information superhighway.

Extreme Proposals

The current legislative landscape, reflecting the polarized state of debate, largely focuses on extreme approaches to the issue. For example, one bill — the Communications Opportunity, Promotion, and Enhancement Act (H.R. 5252), sponsored by Representative Joe Barton (R.-Tex.) and passed by the House in June 2006 — might well provide less regulatory oversight than exists under current law. In particular, by providing only specifically limited regulatory authority to the FCC, the bill arguably cuts back on the existing scope of the FCC’s “ancillary jurisdiction” authority to regulate broadband providers. Moreover, by establishing a prescribed regulatory regime, the Barton bill risks limiting the scope of available antitrust oversight under the Supreme Court’s Trinko decision. A similar bill under consideration in the Senate — the Communications, Consumer’s Choice, and Broadband Deployment Act (S. 2686), sponsored by Senator Ted Stevens (R.-Alaska) — likewise provides limited regulatory oversight.

In contrast, bills calling for more aggressive network neutrality regulation would place excessive restrictions on the freedom of broadband providers. For example, a bill sponsored by Representative Ed Markey (D.-Mass.), the Network Neutrality Act (H.R. 5273), would limit the opportunity of broadband providers to offer and charge for higher quality of service levels. A similar bill in the Senate — the Internet Freedom Preservation Act (S. 2917), sponsored by Senators Snowe (R.-Me.) and Dorgan (D.-N.D.) — prohibits the fee-based prioritization of Internet traffic. Another, the Internet Non-Discrimination Act (S. 2360), sponsored by Senator Ron Wyden (D.-Ore.), would ban any varying levels (or tiers) of Internet service available to content or service providers. In practice, this legislation would prohibit a broadband provider from offering special treatment to any application — even if such arrangements facilitate the development of an entirely new product or service (such as those requiring guaranteed levels of service). To Senator Wyden, such a trade-off is warranted because “creating a two-tiered system could have a chilling effect on small mom and pop businesses that can’t afford the priority lane, leaving these smaller businesses no hope of competing against the Wal-Marts of the world.”

The many other bills and amendments recently debated in Congress generally represent similarly extreme approaches — with one side allowing an unfettered right of broadband providers to prioritize traffic on their networks (the Barton approach) and the other side prohibiting any prioritization of traffic (the Markey approach).

A Moderate Proposal

What is missing from this debate is a sensible, centrist solution, one that would allow broadband providers to offer and charge for enhanced network services while providing for some form of regulatory oversight to ensure that the current broadband providers do not abuse their market power. Such an approach would also assure that a reasonably sized, open, and best-efforts Internet pipe is available for innovators. This “third way” should have three prongs: effective consumer protection measures, sound competition policy oversight, and conditioned tax incentives.

Consumer Protection. As described above, it is likely that there will be increasing concerns as to whether broadband usage policies are transparent — that is, clearly delineated and well understood. To the extent that they are, it is quite possible that the most effective protection for consumers will be their own vigilance about what services network providers offer them. To facilitate such vigilance, all providers should be required to state clearly to what extent content and services enjoy preferential delivery opportunities and to what extent limitations exist on the ability of consumers to access the content and services of their choice.

Once broadband providers post policies specifying their service offerings, the FCC will be well-positioned to monitor whether firms comply in practice with their own stated policies. Indeed, a notice and monitoring regime would mirror the Federal Trade Commission’s (FTC) approach to Internet privacy, which encourages firms to be clear about their privacy policies and penalizes those firms that fail to comply with them. When it comes to broadband usage policies (and unlike the FTC privacy regime), the posting of a firm’s policies should be required, not left to each company’s own discretion.

Also important to protecting consumers would be requiring any firm selling “broadband Internet access” to make available a basic level of open, best-efforts Internet access. There are many legitimate reasons for providing differential levels of service, performance, pricing, and prioritization in the broadband environment. But it is critical — both in terms of satisfying consumer expectations and in facilitating innovation by upstart firms — that some not insignificant portion of the broadband bandwidth be available on a best-efforts basis.

Over time, we believe that the level of best-efforts broadband access will evolve. At present, the FCC defines the level of broadband access as 200 kilobits per second or greater in at least one direction. This definition is already out-of-date and will become more so over time. It will be critical that the FCC develop an evolving measure of broadband access that providers will deliver on a best-efforts basis. At present, there is no ready formula for defining this level of broadband, but going forward, the level of bandwidth and associated latency should be defined with an eye to supporting the basic uses of the Internet as they evolve over time. As an initial matter, what can now be called “broadband” should be closer to 2 megabits per second download speed rather than the current 200 kilobits per second. (According to the FCC, as of mid-2005, 54 percent of high-speed lines provided speeds of at least 2.5 megabits per second in the faster direction, almost always the download direction.) Under the regime we propose, network providers with market power that do not meet that FCC-defined requirement would be prohibited from calling their services “broadband.”

Competition Policy. The second prong of this proposed regime is to charge the FCC with an after-the-fact competition policy enforcement mandate akin to the antitrust laws. This approach differs from the agency’s standard before-the-fact rulemaking mission (as well as the approach of the Markey and Wyden bills, which emphasize before-the-fact rules). The problem with rules that limit behavior before the fact is that they often sweep broadly and address speculative harms. Moreover, such rules create incentives for gamesmanship, such as an effort to have a video-over-Internet service classified as a “cable service” and thus outside the scope of any network neutrality regulations. By contrast, an after-the-fact approach provides regulatory flexibility, viewing discriminatory conduct by providers with market power with a degree of skepticism, but judging such conduct on a case-by-case basis.

As a starting point for such oversight, the FCC should rely on the set of policy principles that then-Chairman Powell announced in his 2004 “Internet Freedoms” pronouncement and that the agency adapted in its major 2005 broadband policy statement. These principles recognize that it is indeed possible that incumbent broadband providers would respond to Internet-enabled applications such as VoIP service with such tactics as slowing down the service, giving precedence to their own similar offerings, or charging competitors uncompetitive rates to send data over their managed networks. By promising prompt enforcement and meaningful consequences for anticompetitive conduct, the FCC can ensure that upstarts like Vonage succeed or fail on their own merits.

To appreciate how this proposed model would work in practice, consider the following hypothetical. Imagine an allegation by Amazon.com that the Barnes & Noble website was receiving a quality-of-service guarantee not offered to Amazon on electronic book downloads. To remedy this state of affairs, Amazon could commence an FCC proceeding — one that would be governed by strict time limits — alleging that the selective offering of this quality-of-service guarantee was anticompetitive. To the extent that the broadband provider could justify the preferential arrangement as serving a legitimate business purpose — say, that there was only sufficient bandwidth to provide this service to one of the two firms — it could be upheld. And to the extent that a broadband provider could not exist at all unless it created certain preferential arrangements, that might be a viable defense against the charge that it was excluding competition. If a broadband provider could not offer a convincing justification, the practice would be condemned and the FCC would be authorized not only to enjoin the anticompetitive practice, but to penalize the firm that took the condemned action.

Notably, this model of competition-protecting regulation would allow quality-of-service assurances to be offered for payment, but such assurances would have to be offered universally unless a firm has a legitimate business purpose for offering them only on an exclusive basis. Significantly, this standard of reasonable access to prioritized service delivery (even for a fee) would also apply to the level of prioritization that existing broadband providers give to their own affiliated applications (say, their VoIP product). Admittedly, monitoring the access arrangements a company gives to its own services may well present regulatory challenges. To the extent that such challenges cannot be addressed on a case-by-case basis, it may be necessary to adopt more aggressive forms of oversight (such as accounting safeguards or the use of benchmarks to “impute” the terms and conditions offered to an incumbent’s affiliated service).

As under the antitrust laws, it is possible that certain discriminatory practices will come to be categorically condemned. Such a condemnation, however, should only come with a better understanding of the actual effect of the practice at issue. Even port blocking, for example, might be defensible under certain circumstances. Where a new entrant to the broadband market blocked traffic, there might be reason to believe that such behavior reflected a legitimate business purpose. For example, Clearwire, an upstart wireless broadband provider, reportedly decided to block rival VoIP services because doing so enabled it to receive funding from Bell Canada in return for Bell Canada’s exclusive right to offer VoIP services on Clearwire’s network. To the extent that Clearwire would not be able to operate a wireless broadband service at all without such funding, consumers are better off with the presence of a competitor — even one who blocks rival VoIP offerings — than with no broadband competitor at all.

In short, this proposed regime envisions that the FCC can superintend an antitrust model of regulation. This model would require that the FCC manage all relevant proceedings on an expedited basis, so that a firm that suspected discrimination in favor of a competitor could commence a proceeding to challenge that practice and be assured of a timely response. As we noted at the outset, the FCC arguably possesses the authority today (under its ancillary jurisdiction) to implement this model of regulation, but it would be prudent for Congress to confirm this authority and specifically embrace this form of regulation.

In recommending this new regime, we recognize that it envisions a different role for the FCC than its traditional regulatory function. Given its lack of experience in these areas, we acknowledge that it is an open question whether or not the FCC can perform this new role effectively. Consequently, this proposal may well require significant institutional reforms of how the agency operates. Moreover, because it is possible that even a reformed FCC will be unable to perform this role effectively, Congress should focus on the agency’s institutional limitations, closely monitor its performance in carrying out the duties we propose for it, and, if necessary, consider assigning these functions to a different agency, such as the Federal Trade Commission.

Depreciation and Tax Incentives. Investments in broadband networks exhibit what economists call positive externalities — that is, they generate economic and social benefits greater than those captured by the company making the investment. For example, a widely deployed 20 megabits per second network could enable a whole host of applications, such as telemedicine, telecommuting, distance learning, and others. In markets where the social benefits (or costs) differ from private ones, it is not uncommon for policymakers to respond with taxes or tax incentives. To give a familiar example, because companies cannot capture all the positive returns from conducting research and development, Congress created the R&D tax credit.

To spur more ubiquitous, high-speed broadband deployment, Congress should do something similar today. We suggest two actions. First, Congress should allow companies investing in broadband networks to expense new broadband investments in the first year. Currently, companies must depreciate telecommunications network investments over a period of 15 years. Allowing companies to write off the investment in the first year reduces the costs of making these investments and encourages faster deployment of higher speed networks. Other nations have used this approach successfully to spur deployment of advanced telecommunications infrastructures. For example, the Japanese government allowed NTT to rapidly write off the cost of its new fiber broadband networks. The Korean government did the same. And just recently, the Canadian government boosted by 50 percent its tax incentives for investments in broadband, Internet, and other data network infrastructure equipment.
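
To see why the timing of deductions matters so much, consider a rough back-of-the-envelope calculation. The sketch below assumes a $1 billion investment, a 35 percent tax rate, and an 8 percent discount rate (all illustrative assumptions, not figures of record) and compares the present value of the tax savings under first-year expensing with the value under 15-year straight-line depreciation.

    # Illustrative comparison of first-year expensing vs. 15-year depreciation.
    # All figures below are assumptions chosen purely for illustration.
    investment = 1_000_000_000   # $1 billion of network investment
    tax_rate = 0.35              # assumed corporate tax rate
    discount_rate = 0.08         # assumed cost of capital

    # First-year expensing: the entire deduction is taken immediately.
    npv_expensing = investment * tax_rate

    # 15-year straight-line depreciation: equal deductions, discounted over time.
    annual_deduction = investment / 15
    npv_depreciation = sum(
        (annual_deduction * tax_rate) / (1 + discount_rate) ** year
        for year in range(1, 16)
    )

    print(f"Tax savings, expensed in year one: ${npv_expensing:,.0f}")      # $350,000,000
    print(f"Tax savings, depreciated (NPV):    ${npv_depreciation:,.0f}")   # roughly $200,000,000

Under these assumptions, roughly $150 million of the investment’s tax value evaporates simply because the deductions arrive later; first-year expensing returns that value to the firm and, in principle, to further network investment.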

Second, Congress should extend the current temporary moratorium on federal, state, and local broadband-specific taxes and make it contingent upon broadband providers offering the level of open, best-efforts Internet service defined by the FCC. Taxing broadband is a bit like our national policy on smoking: we want people to smoke less, yet we subsidize tobacco farmers to grow tobacco. In the case of broadband, we want people to use more and faster broadband, but we sometimes tax them when they do. (By contrast, some countries, like Austria and Sweden, have even allowed individual consumers to deduct broadband expenses from their taxes.)

Both of these incentives — first year expensing and a broadband tax moratorium — would be linked to the behavior of broadband companies. To be able to sell untaxed broadband (and to market it as “broadband”), providers would have to offer a best-efforts, open Internet data pipe to their customers in line with the FCC definition. To avoid having to pay broadband taxes, the companies would need to continue to expand their open broadband pipe to meet the evolving FCC definition.

Thinking Ahead

The Internet has evolved over time and will continue to do so. To say, as the New York Times did in an editorial, that charging for higher quality-of-service assurances would endanger the democratic character of the Internet is a considerable overstatement. As the Washington Post stated in its own editorial on the subject, the Internet is a very democratic medium, but not one without advantages for the major players. Nonetheless, there is a reasonable concern that the changing nature of the Internet could threaten the development and deployment of new services and content offerings. Such changes, however, are not necessarily imminent, and the adoption of overly aggressive prophylactic rules could limit the opportunity for broadband providers to capture revenues to support their continuing infrastructure investments, as well as give rise to unintended consequences (such as costly and slow legal proceedings). Only a more focused and carefully tailored regulatory response will ensure that the Internet remains an open platform for innovation and a dynamic medium.

It is worth noting that the concerns that animate the network neutrality debate are in no small part driven by the relative lack of broadband competition and the low levels of available bandwidth in the United States. Unlike some other nations, such as France and Japan, which employed a “line-sharing” model that let multiple DSL competitors use the incumbent’s infrastructure, the United States pursued a different strategy. The issue of net neutrality is largely moot in these other nations because consumers enjoy both a greater level of competition and more bandwidth than in the United States. In essence, the network neutrality rules now being discussed are a short-term substitute for the longer-term imperative: more robust competition in broadband markets and the building of higher-speed, best-efforts data pipes.

For a long-term solution, policymakers should focus on promoting the entry of new providers into the broadband marketplace, particularly those using wireless spectrum, and adopting policies to boost the bandwidth of best-efforts broadband connections. And while we await the slow salutary effects of such reforms, policymakers should endorse a sensible approach on net neutrality — one that protects consumers, promotes innovation, and patrols against anti-competitive behavior.

Robert D. Atkinson and Philip J. Weiser, “A Third Way on Network Neutrality,” The New Atlantis, Number 13, Summer 2006, pp. 47-60.
