Citing an incident where Comcast apparently degraded its service to a handful of bandwidth-intensive online users to relieve traffic congestion and provide bandwidth to its other customers, a coalition of “public interest” groups and individuals petitioned the FCC to forbid Internet Service Providers (ISPs) from managing congestion on their networks. However, as the often-cited “tragedy of the commons” teaches, when individuals acting in their own interest overuse a shared, finite resource, everyone suffers; others therefore maintain that accommodating a few bandwidth hogs may well harm the vast majority of other online users. This ConsumerGram is the second in a series on whether the ability of ISPs to manage their networks helps or harms consumers.
Context of the Problem
The Internet is a network of private networks linking users to each other and to a variety of content sources through assorted applications. Both its capabilities and the nature of its predominant use have been dramatically transformed in recent years. Different kinds of content (including voice, data, graphics, audio files and video files) require differing amounts of network capacity, and users have different “appetites” for different kinds of content. Most Internet users are content to combine sending and receiving a few emails a day with a bit of website surfing for information. Others are more intensive users, engaging in file transfers and downloading more bandwidth-intensive video and audio files through popular applications like YouTube.
There is a group of users, often referred to as “bandwidth hogs,” who use a disproportionate amount of network capacity. The “Hogs” make up about 5% of all users, yet often employ P2P technology to transfer large video files (some of them copyright-protected). In doing so, they account for more than half of all Internet traffic as measured by bandwidth. These high-volume users are not charged for this rate of usage. Rather, they pay the same flat fee as every other consumer, benefiting from broadly averaged rates that spread total network costs more or less equally among all users. The result is network congestion that can disrupt and delay all network traffic in the absence of some type of traffic management.
The problem is not without precedent and has arisen in a variety of different contexts. How to resolve the issue is now being debated at the FCC, in Congress, in the press and among countless bloggers where a variety of perspectives on both the problem and solution are offered.
Overuse of Limited Resources
It is now forty years since a phenomenon described at least as far back as Aristotle came to be broadly known as the “Tragedy of the Commons,” an outcome resulting from circumstances in which individuals, acting to maximize their own welfare, impose costs on others. A frequent example of the tragedy is the practice of early New England settlers of setting aside grazing areas for community members to use at their discretion as sources of food for their livestock. An early illustration of the economic distress that “All You Can Eat” (AYCE) pricing schemes can produce, these commons were quickly destroyed by overgrazing.
The tragedy is the result of a conflict between individual welfare and community welfare. The potential for the tragedy arises where the common good and individuals acting in their own self interest collide. While the term tragedy may in some circumstances be an overstatement, in others it is not and in all cases suggests the potential for serious problems.
The tragedy often arises in the use of resources that are public property – waterways, highways, the environment, radio spectrum, public lands and natural resources, for example. But it may also arise in disputes over the use of private property where demand exceeds supply, particularly where rules of use are not clearly spelled out and enforced – parking lots, waiting rooms, grocery stores, restaurants, arenas and golf courses. Elements of the tragedy have marked the depletion of animal populations, including salmon fishing in the North Atlantic, buffalo hunting on the Great Plains, whaling on the high seas and crab harvesting in the Chesapeake Bay. Especially vulnerable to the tragedy are services provided by common user networks, like electricity, sewage, water, telecommunications, roads, waterways, airways, radio waves and, most critically at the moment, the Internet.
Causes of the Tragedy
A telling indicator of the threat of the tragedy is demand exceeding supply, that is, the emergence of a shortage. In some cases the resource is fixed and nonrenewable, so that no amount of effort on the supply side can increase capacity. In other cases the mismatch is the result of demand growing faster than supply.
The circumstances and causes are straightforward. In the typical case there is a mismatch between private costs and public costs, or so-called externalities. For many goods and services, the costs imposed by individual choices are predominantly borne by the individual making the choices. But, increasingly, costs imposed by the individual are borne by others. That is certainly the case here, where average users, high-volume users and very high-volume users generally pay the same amount for access.
In simplest terms, the tragedy is the result of failure to ration a scarce, valued resource and instead to rely on the “social conscience” of individuals to counterbalance their innate tendencies to maximize individual welfare. The tragedy may be avoided of course if supply grows faster than demand, so that there is always a bit of excess capacity available for all.
Remedies for the Tragedy
Several remedies are available. Many have been tried. One is simply to ignore the problem and to let it persist. In an imperfect world, not every problem warrants the expenditure of time and resources needed to correct it. The costs of fixing the problem may exceed the benefits. In the context of the debate over whether to permit network owners to manage traffic congestion, some advocates support that position, without rationalizing beyond general statements of the costs of any remedy.
Advocates of “net neutrality” appear content with the current Internet access price structure, which resembles a system of tax and subsidy in which some users are granted “free rides” paid for by other users. While such rate structures are considered appropriate in other contexts – subsidies for seniors, low-income households, rural users, handicapped users, and others – thus far nobody has claimed that “bandwidth hogs” are a disadvantaged group that should be accorded privileged treatment.
Some have suggested imposing what has been called in other contexts “usage sensitive pricing” (USP), the essence of which is paying for bandwidth used per time period. Under such a scheme, ISPs would calculate a cost per unit of use, measure usage for each subscriber, and then bill each subscriber a fee based on cost per unit of use times the number of units used. USP methods are common in other sectors. We buy gasoline by the gallon; electricity by the kWh; transport by the mile; hamburger by the pound; natural gas by the cubic foot; and, in many cases, telephone service by the minute.
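The arithmetic of such a scheme is simple. As a sketch only (the per-gigabyte rate and base fee below are hypothetical illustrations, not figures from any actual ISP tariff):

```python
def usp_bill(gb_used: float, rate_per_gb: float, base_fee: float = 0.0) -> float:
    """Usage-sensitive price: a base access fee plus
    cost per unit of use times the number of units used."""
    return base_fee + rate_per_gb * gb_used

# A light user and a "hog" under a hypothetical $0.25/GB rate with a $10 base fee:
light_user = usp_bill(20, 0.25, base_fee=10.0)    # 20 GB in the month
heavy_user = usp_bill(800, 0.25, base_fee=10.0)   # 800 GB in the month
```

Under flat-rate AYCE pricing both subscribers would pay the same amount; under USP the heavy user's bill scales with the costs that user actually imposes on the network.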
While often favored by some efficiency-oriented economists, USP pricing rules are despised by most online consumers, who prefer not to have to meter their own usage. Even the efficiency aspects are questioned by some, who call attention to: a) the transaction costs associated with metering and billing for each user; and b) the fact that not every minute or unit of bandwidth consumed has the same cost, inasmuch as peak usage by all is more costly than off-peak usage when there is spare capacity available.
A solution in between AYCE and USP is “block rate pricing” which resembles some old utility tariffs in which suppliers offer service blocks containing different amounts of guaranteed usage for successively higher block rates. Many ISPs currently offer a version of “block rate pricing.” Most versions do not, as yet, contain a “Super Premium Block” with cost sensitive rates designed for consumption by the hogs.
If hog demands were “merit” goods like education, vaccinations or national defense, a case could be made for some form of public financing to underwrite capacity growth commensurate with demand. While some net neutrality advocates (and supporters of municipally owned or financed broadband networks) seem to argue something close to that, there is little prospect for building a national consensus for doing so.
Admirers of principles set out by Nobel Laureate, Professor Ronald Coase, might suggest that low volume users who would benefit from throttling back usage by the “Hogs” might be willing to pay them to do so. While total welfare might well be increased by such a scheme, it seems neither an equitable nor politically attractive solution.
It may be that one or more non-price rationing or traffic management techniques are the preferred solutions. It is notable that the problem appears in essential respects to be a “peak-load” problem suffered intensively during certain days and day-parts. Indeed, there is substantial excess capacity for most of the 168 hours in the week. That suggests that demand-shifting techniques might be preferred. A corollary suggests a variety of traffic “shaping” methods commonly used by private network managers in university, corporate or other large institutional settings. The tragedy of the commons does not depend on who owns the network, but rather on how its use is managed.
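One widely used shaping mechanism in such institutional settings is the token bucket, which caps a user's sustained rate while still permitting short bursts. The sketch below is a minimal illustration of the general technique, not a description of any particular ISP's practice; rates and sizes are arbitrary:

```python
class TokenBucket:
    """Minimal token-bucket traffic shaper: tokens accrue at `rate`
    units per second up to `capacity`; a packet of `size` units is
    admitted only if enough tokens are available, otherwise it must
    be queued or dropped (i.e., the sender is throttled to `rate`)."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity   # start with a full bucket (burst allowance)
        self.last = 0.0          # timestamp of the previous check

    def allow(self, size: float, now: float) -> bool:
        # Refill tokens for the elapsed interval, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= size:
            self.tokens -= size
            return True
        return False
```

A bucket sized generously would never touch ordinary email-and-browsing traffic, while a sustained P2P transfer would quickly drain it and be held to the refill rate, which is precisely the peak-load relief the text describes.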
A Tragedy of the Commons involving the Internet is not imminent. But the symptoms are here and clear. There are indications that the demand for bandwidth occasioned by new applications and the preferences of some users for bandwidth-intensive content – video in particular – may exacerbate the problem in the future. Experts disagree on the relative rates of demand and supply growth, with some forecasting that past trends will continue. Others are less sanguine and support their views of an imminent demand explosion by citing different usage patterns among generations, the prospect of newer, more bandwidth-intensive applications, and the near-certainty of an explosion of interconnected devices occasioned by the next generation of televisions, mobile phones, household appliances and other equipment.
Unless and until opponents of reasonable network management practices develop a case for regulation that rests on careful empirical analysis rather than on fear, conjecture, imagination and worst-case speculation about the behavior of ISPs, the best approach is to permit all management techniques that cannot be shown to be publicly detrimental. Meanwhile, government should diligently enforce consumer protection and antitrust rules designed to prevent anti-competitive behavior.
To be sure, high-bandwidth users are consumers, too, and deserve fair consideration of their preferences. They are not, however, entitled to burden the rest of us and estop reasonable measures to manage private networks.