Technology is neither good nor bad. It’s also not neutral. Network neutrality, a political rallying cry meant to motivate free-speech, free-culture, and innovation advocates, was reportedly betrayed by Google following the release of a Verizon-Google policy document on network management/neutrality. What the document reveals is that the two corporations, facing a (seemingly) impotent FCC, have gotten the ball rolling by suggesting a set of policies that the FCC could use in developing a network neutrality framework. Unfortunately, there has been little even-handed analysis of this document from the advocates of network neutrality; instead we have witnessed vitriol and over-the-top rhetoric. This is disappointing. While sensational headlines attract readers, they do little to actually inform the public about network neutrality in a detailed, granular, reasonable fashion. Verizon-Google have provided advocates with an opportunity to pointedly articulate their views while the public is watching, and this is not an opportunity that should be squandered with bitter and unproductive criticism.
I’m intending this to be the first of a few posts on network neutrality. In this post, I exclusively work through the principles suggested by Verizon-Google. In this first, and preliminary, analysis I will draw on existing American regulatory language and lessons that might be drawn from the Canadian experience surrounding network management. My overall sense of the document published by Verizon-Google is that, in many ways, it’s very conservative insofar as it adheres to dominant North American regulatory approaches. My key suggestion is that instead of rejecting the principles in their entirety we should carefully consider each in turn. During my examination, I hope to identify what principles and/or their elements could be usefully taken up into a government-backed regulatory framework that recognizes the technical, social, and economic potentials of America’s broadband networks.
Before jumping into my discussion of the proposed Verizon-Google principles, I want to provide some background to the network neutrality discussion underway in the US. This background will, ideally, provide newcomers to the net neutrality discussion with a basic understanding of the political lay of the land that preceded the Verizon-Google policy framework. I want to make clear that I’m not providing a fully comprehensive contextualization, but a basic outline to assist you in placing the policy framework in relation to ongoing processes.
Since a federal appeals court ruled against the FCC in their case against Comcast’s usage of deep packet inspection equipment, the American telecommunications regulator has been struggling. After its defeat in court, the FCC quickly announced its ‘third way’. This is an effort to realign how broadband carriers are regulated in the US. The carriers are presently classified as ‘information services’ instead of ‘telecommunications services’, which limits the FCC’s ability to adjudicate how ISPs actually manage their services. To draw ‘information services’ more significantly into the FCC’s regulatory fold, Chairman Julius Genachowski has proposed that the transmission of broadband Internet access is a telecommunications service, though the actual content that is transmitted is outside of the FCC’s purview. The third way has been incredibly poorly received by major telecommunications carriers and has, in part, been responsible for closed-door meetings between the FCC and net neutrality stakeholders. These meetings were meant to establish a regulatory framework that met network neutrality principles while moderating FCC regulation.
Many of the folks involved in network neutrality are the same people deeply invested in the copyfights of the past decade; Lessig, the EFF, CDT, and similar groups have witnessed the negative consequences of industry-driven back room dealings for copyright extension.  While some public interest groups attended the closed-door network neutrality meetings, their involvement was, reputedly, fairly minor. Hopefully as time goes on, more light will be shed on the actual suggestions and compromises proposed in these meetings between public advocates, their corporate counterparts, and the FCC staff in attendance.
While the FCC-driven meetings were ongoing, Verizon and Google had their own private negotiations on what a national broadband policy might look like. This policy was published August 9, 2010 after a weekend of rumors; Edward Wyatt at the New York Times broke a story (“Google and Verizon Near Deal on Web Pay Tiers“) suggesting that Google would pay for ‘special carriage’ on Verizon’s network, and in return Google’s services would be faster than those of their competitors. This fee-for-carriage suggestion was denounced by Google, but may have led to a premature release of the Verizon-Google policy document we have today.
As a policy position paper, the document by Verizon-Google has been incredibly effective in energizing discussion around network practices and reinvigorating the discussion in the public eye. The actual framework that was released is helpful, insofar as there are some decent elements, but clearly it needs revision.
For the rest of this post, I will be performing brief and tentative analyses of each principle of the Verizon-Google document. This will often see me refer to prior FCC policies, Canadian regulatory decisions, and academic works around network management and power relations. It’s not intended to be fully comprehensive, but an early effort to collect my thoughts. If you don’t have the time, or desire, to read through these analyses in detail feel free to jump to the end where I’ve tried to briefly summarize my positions. You’ll lose some of the context of the argument, but should leave with a working understanding of my present positions on each principle. I’ll state up front that I’m neither entirely opposed, nor entirely in favour of what Verizon-Google have provided; I’m most interested in picking up their ‘homework assignment’ (as described by Vint Cerf) and playing with the results rather than trying to independently assert a set of principles around network neutrality.
Principle One: Consumer Protections
A broadband Internet access service provider would be prohibited from preventing users of its broadband Internet access service from: sending and receiving lawful content of their choice; running lawful applications and using lawful services of their choice; and connecting their choice of legal devices that do not harm the network or service, facilitate theft of service, or harm other users of the service.

There have been serious concerns about the focus on ‘lawful’ in this principle, as there should be. Does this mean that service providers would be justified in throttling, blocking, or otherwise degrading delivery of ‘unlawful content’? How would the differentiation between lawful and non-lawful content types be identified? What constitutes a lawful application and service; is this a reference to some kind of sanctioned and non-sanctioned set of application protocols?
There are considerable concerns around the integration of ‘unlawful’ throughout this principle. Specifically, there are worries that this could lead to systematic blocking of ‘bad’ content and applications. Rather than (exclusively) directing vehement anger towards the corporate giants that have included this in their framework, however, perhaps we should consider the source of this principle. In FCC 05-151, approved in 2005, the FCC outlined the four ‘Internet freedoms’. In principle one, the Commission adopts the principle;
To encourage broadband deployment and preserve and promote the open and interconnected nature of the public Internet, consumers are entitled to access the lawful Internet content of their choice.
This first principle, as written by the FCC, recognizes that consumers are only entitled to access lawful content. The addition in the Verizon-Google proposal is to extend ‘content’ to applications and services as well. Per Carterfone, consumers can attach devices, and make use of the network, so long as the attachments and uses do not damage the network itself. The language “any lawful device” in the Carterfone decision permits the attachment of answering machines, fax machines, and modems to the network at the ends. Applying the principle of charity, I presume that including the language ‘services and applications’ in the Verizon-Google document is intended to clarify the rules laid down in Carterfone. A serious concern, however, is that neither the FCC nor the Verizon-Google policy framework extend the lessons of Carterfone to wireless networks; principle six of the Verizon-Google framework is an attempt to forbear regulation of wireless networks and the FCC has historically been unwilling to extend Carterfone to wireless Voice over Internet Protocol (VoIP) services. Thus, the policy framework issued August 9, 2010 can be seen as integrating the FCC’s already existing position into the corporate-created document.
What can we take away from this principle then? I would suggest that the principle is conservative, insofar as it closely adheres to earlier regulations set forth by the FCC. While we can continue to be worried about ‘lawful content’ in an era where network surveillance practices might be deployed to discriminate between lawful and unlawful content, and ‘harmless’ versus ‘harmful’ application types, the principle established by Verizon-Google isn’t itself pushing the bar very far. Concerns around this principle speak to already existing worries and concerns around network management, concerns derived from existing FCC policies. While there is good reason to be involved in a discussion about ‘lawful content’ and ‘lawful applications’, we need to remind ourselves that this isn’t a novel form of language being assumed by Verizon-Google.
Principle Two: Non-Discrimination Requirement
In providing broadband Internet access service, a provider would be prohibited from engaging in undue discrimination against any lawful Internet content, application, or service in a manner that causes meaningful harm to competition or to users. Prioritization of Internet traffic would be presumed inconsistent with the non-discrimination standard, but the presumption could be rebutted.
Attention must be paid to the phrase ‘meaningful harm to competition or to users’. Adding small amounts of delay to content delivery times can seriously impact the likelihood that users will use a service and/or continue to receive content from the ‘slow’ source. Not only can this potentially cause visitors to never return to your product/site – perhaps instead going to products and services provided by the ISP that are guaranteed to be fast – but in the case of websites it can impact your visibility via lower Google PageRank ratings. Slow speeds can have real economic impacts.
The Canadian network neutrality/traffic management hearings included language bordering on what is included in the Verizon-Google principle. Specifically, when writing about delaying or slowing down Internet traffic, the CRTC notes (n.126-127) that;
… use of an ITMP [Internet Traffic Management Practice] resulting in the noticeable degradation of time-sensitive Internet traffic will require prior Commission approval under section 36 of the Act.
With respect to non-time-sensitive traffic, the Commission considers that the use of ITMPs that delay such traffic does not require approval under section 36 of the Act. However, the Commission is of the view that non-time-sensitive traffic may be slowed down to such an extent that it amounts to blocking the content and therefore controlling the content and influencing the meaning and purpose. In such a case, section 36 of the Act would be engaged and prior Commission approval would be required.
If we assume that even rudimentary policy learning or interpretation might occur, then the Verizon-Google principle could be read as articulating something resembling what the CRTC has already established. Small content creators don’t exactly love the CRTC decision, nor do large content creators like the CBC, but adopting something like the Canadian approach would, again, be relatively conservative in the context of North American telecommunications regulation.
Critical commentators are, however, rightfully concerned over the last sentence of the principle. Under what possible conditions could it be non-discriminatory for certain Internet traffic to be prioritized? Wouldn’t such an action add too much ‘intelligence’ to the network, undermining end-to-end arguments?
Perhaps, but not necessarily. At the past two Canadian Telecommunication Summits, pro- and anti-DPI advocates have suggested that a compromise position might be that traffic prioritization is permissible in a network architecture where the user has control over how their own traffic is prioritized. This is a relatively benign approach to traffic management, one that is (arguably) empowering where accompanied by clear user education and accessible user-interfaces. Prioritization is less desirable when the telecommunications carrier makes a unilateral decision, without accepting input from a user base that is substantively drawn into the service provider’s decision-making framework. It is this unilateral decision capacity that has commentators (rightfully) worried; carriers aren’t terribly well known for their active engagement with their customer bases.
While an ideal might be to strip out this last sentence, I almost wonder if having it there is helpful. Carriers have spoken of their prioritization/deprioritization of particular traffic-types; ‘bulk’ traffic is given a lower priority than traffic that is jitter-sensitive. As I understand it, the concern is that particular applications (e.g. Verizon’s own VoIP solution) will be prioritized, rather than particular application-types (e.g. VoIP in general, which would include Verizon’s solution, Skype, and other VoIP providers). Perhaps we could ‘simply’ rewrite the sentence in a way to differentiate between application prioritization (bad and not allowed) and application-type prioritization (not necessarily bad, and potentially permissible). Such a distinction would permit prioritization, and were the service provider required to appear before the FCC before implementing the prioritization, some ex ante oversight could be performed. Further, such prioritization schemes could be required to come up for independent review periodically. Such reviews would be aimed at preventing new application-types entering the market from being set at a competitive disadvantage on the basis that other application-types receive benefits from packet prioritization.
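The application versus application-type distinction can be made concrete with a small sketch. The following is a minimal, hypothetical Python model of a class-based scheduler (the class names, apps, and priority values are my own illustrative assumptions, not anything drawn from the Verizon-Google document): packets are dequeued by traffic class, so all VoIP flows are treated identically regardless of which provider’s application generated them.

```python
import heapq
import itertools

# Priority by application-TYPE: every VoIP flow shares one class,
# no matter which provider's application generated the packet.
CLASS_PRIORITY = {"voip": 0, "web": 1, "bulk": 2}  # lower number = dequeued first


class ClassBasedScheduler:
    """Dequeues packets by traffic class; FIFO within a class."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker preserves FIFO order

    def enqueue(self, packet):
        prio = CLASS_PRIORITY.get(packet["class"], max(CLASS_PRIORITY.values()))
        heapq.heappush(self._heap, (prio, next(self._counter), packet))

    def dequeue(self):
        return heapq.heappop(self._heap)[2]


sched = ClassBasedScheduler()
sched.enqueue({"class": "bulk", "app": "backup-service"})
sched.enqueue({"class": "voip", "app": "skype"})
sched.enqueue({"class": "voip", "app": "verizon-voip"})

# Both VoIP packets leave before the bulk packet, and neither VoIP
# provider is favoured over the other within the class.
first, second, third = sched.dequeue(), sched.dequeue(), sched.dequeue()
```

The point of the sketch is that prioritization here keys on `packet["class"]` alone; a scheduler that instead keyed on `packet["app"]` would be the discriminatory, application-specific case the principle should prohibit.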
Principle Three: Transparency
Providers of broadband Internet access service would be required to disclose accurate and relevant information in plain language about the characteristics and capabilities of their offerings, their broadband network management, and other practices necessary for consumers and other users to make informed choices.
The transparency principle, again, is relatively conservative. It parallels the requirements of the Office of the Privacy Commissioner of Canada (OPC) concerning the use of deep packet inspection, where ISPs are required to note how the technology is used in their respective networks, the FCC’s own principle of transparency, and the position on transparency assumed by the CRTC.
In the case of the FCC, their proposed ‘sixth principle’ reads as follows;
…providers of broadband Internet access must be transparent about their network management practices.
Finally, the CRTC has a more detailed account of transparency as it relates to traffic management practices, stating that ISPs must disclose five elements of technical management systems to consumers. Specifically, ISPs must disclose:
- why ITMPs are being introduced;
- who is affected by the ITMP;
- when the Internet management will occur;
- what type of Internet management (e.g. application, class of application, protocol) is subject to management; and
- how the ITMP will affect a user’s Internet experience, including the specific impact on speeds.
This said, while the principle as outlined by Verizon-Google leaves room for improvement, it also extends the sixth principle established by the FCC. As such, I (again) suggest that this element of the corporate framework is conservative because it hews closely to existing or proposed transparency principles amongst North American regulators.
Principle Four: Network Management
Broadband Internet access service providers are permitted to engage in reasonable network management. Reasonable network management includes any technically sound practice: to reduce or mitigate the effects of congestion on its network; to ensure network security or integrity; to address traffic that is unwanted by or harmful to users, the provider’s network, or the Internet; to ensure service quality to a subscriber; to provide services or capabilities consistent with a consumer’s choices; that is consistent with the technical requirements, standards, or best practices adopted by an independent, widely recognized Internet community governance initiative or standard-setting organization; to prioritize general classes or types of Internet traffic, based on latency; or otherwise to manage the daily operation of its network.
Network management is an interesting issue, and while the principle is ‘conservative’ we should question how it is structured on two grounds: first, with respect to the use of an independent (non-FCC) body to determine best practices and standards, and second, in the sense that ‘reasonable network management’ procedures are policy-driven rather than technically oriented. Both of these suggestions are contentious, and so I spend a bit of time here speaking to both points.
Under the Canadian decision, ISPs can manage traffic (i.e. engage in network management practices) to ensure network security or protect network integrity. Economic management techniques – those where consumers are billed for excessive usage – are preferred, but technical measures can be deployed in limited fashions as required. The policy principle provided by Verizon-Google captures the issues of security and congestion addressed by the CRTC. Charitably, we can read ‘unwanted traffic’ as referring to email spam, virus-laden packets, and other harmful data transmissions coming to, and trying to exit, the ISP network. Such actions are already commonplace amongst many (most? all?) Western ISPs and are helpful because they protect ‘the ends’ from harm while preserving the network’s overall capacities.
The reliance on a standards-setting organization can be read as good – bodies such as the IETF are reputable – or bad – if these bodies are taken to mean American-only ISP/content provider ‘standards’ groups. Concerns have been trumpeted that the latter groups are the referent in this policy principle, but I still haven’t seen actual evidence that this is, indeed, the referent. A related concern is that, per this principle, were an ISP in compliance with a standards body they would be free from direct FCC regulation. This is true, to a point: at the moment, the FCC’s control over the direct technical capacities of most networks is limited, insofar as Internet governance bodies are already international groups that (often) escape any particular nation’s all-encompassing sovereign power. Mueller (along with his various colleagues) has written a considerable amount on Internet governance; he has argued that, contra Goldsmith and Wu, nation-states cannot entirely assert their sovereign power in the control of national networks in light of the expanded number of partners in governing global digital networks. Nation-states, and their various institutional organs, can exert considerable influence but not absolute sovereignty over the technical infrastructure of the Internet and expect full integration with the rest of the ‘net.
Prioritizing particular kinds of content could be problematic, but it is equally likely to be helpful. If an ISP actively works to reduce jitter resulting from economically unmanageable congestion then, so long as such prioritization schemas are made public and conform with international best-practices, they can be understood as appropriate, or at least acceptable. Note that this shouldn’t mean that technical measures should permanently be used to manage congestion; shifts to DOCSIS 3.0 and fiber are preferable long-term solutions to managing congestion towards the last mile (where congestion is often most prominent and problematic) but limited technical resolutions may be required as capital expenditures are mobilized to improve the physical network.
From this, I suggest that what Verizon-Google is proposing in this principle is somewhat conservative, and would be entirely conservative if the principle recognized the FCC’s involvement in regulating network management practices. I’ll address a possible division of FCC/international bodies’ responsibilities in a minute, but will ‘tease’ you by stating that granting international bodies ultimate responsibility over the technical elements of network management practices doesn’t necessarily herald the end of the Internet. This statement is made in light of the fact that non-governmental technical bodies already govern various facets of the Internet’s existing infrastructure through the standards setting process.
Before discussing a possible FCC/international bodies division of labor, however, I need to distinguish between the terminology of ‘reasonable network management’ and ‘network management’. I agree with an element of Paul Ohm’s paper that interrogates ISP practices in the US. Ohm identifies reasonable network management as having gained prominence in America following a 2004 speech by Chairman Powell, and the FCC has since adopted reasonable network management as a policy position. While ‘network management’ is a technical issue – Ohm recognizes it as referring “to the activities, methods, procedures, and tools that pertain to the operation, administration, maintenance, and provisioning of networked systems” (1462) – ‘reasonable’ network management is a broader, policy-informed, management apparatus. Specifically, Ohm argues that “it describes not an engineering principle, but a policy conclusion made by weighing the legitimate technological and business goals of network management with what society deems reasonable in light of many principles, including privacy” (1461).
If we accept the division of ‘network management’ and ‘reasonable network management’ as outlined by Ohm, then there is a concern that standards bodies would, in fact, be incapable of establishing ‘reasonable’ network management standards. They could establish network management standards, but without an insight into the realities of particular ISPs and content providers’ relationships, and the economic models underlying these parties, the international groups would be unable to pointedly provide granular international standards.
In light of this potential difficulty I suggest that the policy and economic factors of ‘reasonable network management’ could be kept entirely within the purview of the FCC, while the technical facets of ‘network management’ could be put under FCC purview on a probationary basis. On this basis, where novel management approaches are used those techniques would be regulated by the FCC until an appropriate international technical body came to a conclusion on whether the novel approach adhered to international best practices. The FCC could engage in a consultation, or related, process to integrate those standards into national policy, which would (effectively) see the FCC engage in policy learning/harmonization in technical issues with the global Internet governance community.
This suggestion creates a ‘two-track’ approach to regulation; one that lets America assert its norms and values in management practices, and another that limits over-exuberant novel management techniques while still enabling a flexible technical networking culture. In sum, the two-track approach would see the US retain national/regional sovereignty over non-technical issues – privacy, economics, free speech and so forth – and permit existing international governance bodies to develop the best practices for a functioning Internet community.
Principle Five: Additional Online Services
A provider that offers a broadband Internet access service complying with the above principles could offer any other additional or differentiated services. Such other services would have to be distinguishable in scope and purpose from broadband Internet access service, but could make use of or access Internet content, applications or services and could include traffic prioritization. The FCC would publish an annual report on the effect of these additional services, and immediately report if it finds at any time that these services threaten the meaningful availability of broadband Internet access services or have been devised or promoted in a manner designed to evade these consumer protections.
The additional services proviso has resulted in considerable worries; would this create a two-track Internet, a ‘public’ and a ‘private’ Internet? Would there be price differentials between the services made available over these two Internets?
I’ve already identified a problem with the prior network management principle; let’s assume that the dual track approach is acceptable and so ISPs are prevented from gaming the system to prioritize and deprioritize traffic in a relatively ad hoc manner. I approach the principle of additional online services in two parts: first, from the point of offering ‘other services’, and second, concerning the FCC’s (lack of) regulatory power enshrined in this principle.
Novel bandwidth provision for specialized services happens right now; if you have IPTV coming into your home then your service provider either has, or soon will, segregate a portion of the bandwidth coming into your home to prioritize your IPTV traffic. Rogers and Shaw, Canadian ISPs, have publicly noted that they differentiate bandwidth in their networks so that certain portions are available to different traffic-types. Bandwidth is already provisioned to guarantee certain services at the expense of others.
The wording, ‘clearly differentiated services’, noted in this principle may see some of that aggregate bandwidth provisioned to provide instant-on services, such as a dedicated secure line to your bank or links to an ISP-hosted home monitoring/security system. Such ‘discrete’ uses of the network are not necessarily bad and, in fact, you can imagine that various consumers would welcome the ability to set priorities on various services or receive ‘specialty’ services that are not available over the top. This said, a very real concern surrounding bandwidth segregation and provisioning can be read through Winseck’s work on “netscapes of power”, where a service provider uses their institutional power to impact content/service availability for economic gains. Such differentiation subtly pushes consumers to the service providers’ own offerings in lieu of ‘slower’ third-party, often over the top, offerings.
The FCC should step in whenever a netscape of power manifests. This said, a netscape is not necessarily established through the provision of ISP-specific services; such services can be complementary with non-ISP, over the top, services. In the language of Jonathan Zittrain, the ISP-exclusive feature might be the equivalent of an ‘appliance-use’ of the network that competes with ‘generatively-derived’ web systems. Zittrain worries that appliance-like systems (e.g. limited-use hardware/software interactions) threaten the ‘generativity’ of the Internet itself. Generativity is defined as a “system’s capacity to produce unanticipated change through unfiltered contributions from broad and varied audiences” (70) but, so long as the ‘public Internet’ is made unconditionally available, I suggest that the generative Internet can peacefully cohabitate with the appliance-Internet.
Let me introduce an example of an instance where appliance-Internet and generative-Internet are arguably not cohabitating. This lack of successful cohabitation results in what appears as a netscape of power, and indicates the value of establishing clear rules for rebalancing the appliance- and generative-Internet. Many Canadians are excited that Netflix, a streaming video service, is finally coming to Canada. Unfortunately, almost immediately after the service was announced one of Canada’s largest ISPs significantly reduced the monthly data caps available to its users. This will reduce the amount of content that Canadians using that ISP can receive from Netflix, ‘encouraging’ those consumers to use the ISP’s own content systems that do not count towards a monthly data cap. This is an example of a netscape of power because an ISP is creating a soft wall around its provisions and encouraging the use of in-house content provision at the expense of Netflix. Arguably, this is a case where an appliance – cable TV offerings – is at odds with the generative Internet. The appliance/generative balance is potentially skewed in this case.
Given the worrying appearance of the imbalance between appliance/generative bandwidth provisions, a regulator should investigate this scenario, possibly on anti-competition grounds. Recognizing that these (anti)competitive activities happen in a converged marketplace, the FCC could avoid the present Canadian situation by developing a heuristic for determining whether the ‘appliance-Internet’ was being used to limit the possibilities of the ‘generative-Internet’. Such a heuristic would permit carriers to provide their ‘clearly differentiated services’ while setting clear conditions on how those services operate in relation to generative Internet offerings. Wherever and whenever a netscape was identified the ISP might be forced to adjust their appliance/generative balance. My contention, here, is that a balance is possible. That ISPs want to offer unique services is not necessarily bad in itself, but such services must be carefully watched and regulated.
Of course, this assumes that the FCC would have a role in adjudicating the appliance-Internet, and the principle outlined by Verizon-Google attempts to forbear that kind of interference. A report is not the same as regulation; the FCC needs to retain regulatory power to prevent the creation of semi-walled gardens, where consumers can venture out beyond an ISP’s walls but at significant economic or temporal cost. Thus, while appliance and generative networks can potentially function alongside one another without significant difficulties, regulatory oversight must be retained to ensure that the relationship is acceptable.
Principle Six: Wireless Broadband
Because of the unique technical and operational characteristics of wireless networks, and the competitive and still-developing nature of wireless broadband services, only the transparency principle would apply to wireless broadband at this time. The U.S. Government Accountability Office would report to Congress annually on the continued development and robustness of wireless broadband Internet access services.
Anyone who is surprised to see this principle is either new to network policy discussions or has remained willfully ignorant of the ever-present discussions around regulating wireless. ISPs want to keep regulatory authorities out of their markets for as long as possible, and this principle is just another articulation of this desire. With this in mind, it’s important to note that regulators have generally been hesitant to get involved in regulating wireless broadband in North America. It was roughly nine months after Canada’s traffic management hearings that wireless was drawn into the wireline management framework. The initial forbearance of regulation on wireless caused considerable concern in Canada – Canadians, like their American counterparts, recognize that wireless is the future of broadband markets – but such forbearance was (relatively) quickly reversed. Any principles established by the FCC that include forbearance on wireless could see the same rapid reversal.
Thus, I would suggest that the Verizon-Google principle is conservative. This isn't to say that such conservatism is necessarily a good thing – nor is it necessarily indicative that I agree with ISP concerns about spectrum scarcity – but that the conservatism is understandable. Ideally, should a principle resembling the Verizon-Google proposal for the wireless market make its way into a regulatory framework, it would include a proviso that the issue of wireless regulation would be taken up again within a clearly stated period of time. This might let the FCC conduct its own investigations into how it wants to approach the wireless environment, effectively buying it some breathing room without permanently committing (or being committed) to wireless forbearance.
Principle Seven: Case-by-Case Enforcement
The FCC would enforce the consumer protection and nondiscrimination requirements through case-by-case adjudication, but would have no rulemaking authority with respect to those provisions. Parties would be encouraged to use non-governmental dispute resolution processes established by independent, widely-recognized Internet community governance initiatives, and the FCC would be directed to give appropriate deference to decisions or advisory opinions of such groups. The FCC could grant injunctive relief for violations of the consumer protection and non-discrimination provisions. The FCC could impose a forfeiture of up to $2,000,000 for knowing violations of the consumer-protection or non-discrimination provisions. The proposed framework would not affect rights or obligations under existing Federal or State laws that generally apply to businesses, and would not create any new private right of action.
Principle seven has been heavily criticized, and rightly so. This said, for all of the problems inherent in maintaining that the FCC must limit their regulation of ISPs, some of the suggestions in this principle could adhere to my earlier division between what the FCC might be responsible for and what international standards bodies might be involved in.
The FCC requires rulemaking authority, the capacity to define what constitutes 'meaningful harm', and its full set of regulatory tools to respond to violations of consumer protection laws. I would note that this is also an area where the FTC's Bureau of Consumer Protection might get involved, as implicitly recognized in the principle's last sentence. The suggestion that the FCC should effectively abandon these roles, rulemaking in particular, makes much of this principle a non-starter.
Having made this claim, however, the position that the FCC would be directed to "give appropriate deference to decisions or advisory opinions of such [independent, widely-recognized Internet community governance] groups" isn't necessarily bad. If we adopt the division of responsibilities between the FCC and international bodies that I previously articulated in Principle Four (Network Management), a suitable division of labor might be achieved. To remind you, this division saw the FCC regulating the norms and values governing ISPs' 'reasonable' network management, accompanied by limited regulation of non-standardized technical management processes. Technical deference would be given to international groups like the IETF after they established technical standards; such 'formalized' standards would then be harmonized with FCC policies concerning the appropriate technical management of networks within the US. Under this schema an appropriate balance between international groups and the FCC could be struck.
It is important that any independent governance group be international, given that this prevents America's service providers from assuming the technical policy reins themselves. Further, by separating 'reasonable' from standardized network management practices we might avoid situations where 'reasonable network practices' (i.e. policy and business considerations merged with the technical realities of the day) become ingrained in the independent policy standards that emerge. Thus, the position that the FCC give deference to an "independent, widely-recognized Internet community governance" group could be massaged. Whether such massaging is desirable, however, is a question extending beyond my efforts here.
Principle Eight: Regulatory Authority
The FCC would have exclusive authority to oversee broadband Internet access service, but would not have any authority over Internet software applications, content or services. Regulatory authorities would not be permitted to regulate broadband Internet access service.
The FCC's own third way is an effort to extend the definition of 'access' to include the transmission of broadband Internet access as a telecommunications service. Under the third way, this means that where an ISP did the equivalent of slowing down a telephone call (let's not get started on how ugly metaphors will probably get under a third way approach…) the FCC could step in whenever such delays meaningfully impacted the delivery of the telecommunications service. This, in effect, would apply common carrier provisions to ISP services and enable the FCC to stop ISPs from engaging in unjust or unreasonable practices towards services and applications. Under the third way, however, the FCC would still be prevented from regulating subscription rates or applying various other Title II regulatory tools.
With this in mind, we can see how Principle Eight is designed to stop the third way in its tracks. As I read it, by narrowing what 'access' refers to, the Verizon-Google framework attempts to circumvent the FCC's reclassification of broadband providers from 'pure' information services to information services with limited common carrier requirements. The principle is incredibly important to Verizon (probably less so to Google) if the third way is to be terminated. Given the FCC's loss to Comcast, the third way is essential if the regulator is to gain power over how providers manage their networks. I can see nothing in this principle that should be maintained, save that the FCC should continue to have exclusive authority to oversee broadband Internet access services.
Principle Nine: Broadband Access for Americans
Broadband Internet access would be eligible for Federal universal service fund support to spur deployment in unserved areas and to support programs to encourage broadband adoption by low-income populations. In addition, the FCC would be required to complete intercarrier compensation reform within 12 months. Broadband Internet access service and traffic or services using Internet protocol would be considered exclusively interstate in nature. In general, broadband Internet access service providers would ensure that the service is accessible to and usable by individuals with disabilities.
Adopting a principled approach to using the USF for broadband deployment strikes me as entirely reasonable, and is something that the FCC has been mulling for some time. This said, while carriers often argue that 'intercarrier compensation reform' will lead to lower overall broadband and phone rates for end-customers, this isn't always the case. A concern is that reform will serve to (further) advantage large broadband carriers and (further) disadvantage smaller carriers that often struggle with intercarriage rates. While it might be argued that smaller carriers simply have to swallow those rates as the cost of doing business, this translates into disadvantaging (often rural) consumers that may not have access to larger carriers' networks. Further, opening up the USF, combined with potentially higher carriage rates, could be leveraged by larger carriers to compete with some rural carriers by rolling out their own networks using USF funds and cutting prices, while simultaneously requiring that those same rural carriers pay out more money for carriage. Should this happen (and I stress that this is a hypothetical) I worry that rural customers would be put in an even worse situation than they often are now.
So, at the end of all of this, what do I think? As stated earlier, many of the principles seem relatively non-problematic and/or conservative in the context of North American telecommunications regulation. Others are deeply concerning. Below are brief summaries of the earlier arguments; they lose some of the nuance, but I think effectively capture my overall position on each principle.
Principle one, addressing consumer protections, doesn't strike me as 'dangerous' as some have suggested, when it's juxtaposed against existing FCC policies around lawful content and applications.
Principle two, speaking to non-discrimination, doesn't strike me as terribly problematic either. So long as regulatory authority is exercised over the decision to prioritize certain traffic, and that traffic is prioritized based on application- or traffic-type as opposed to particular applications (i.e. prioritize VoIP, not Verizon's VoIP service), then even the potential to prioritize particular classes of traffic isn't necessarily harmful.
Principle three, addressing the need for transparency, is entirely acceptable. Ideally the principle would hew to decisions in Canada, where there are rules for what information ISPs must provide and how it is provided, or further improve upon the Canadian requirements. Preferably, information on traffic management would be more prominent than it is on some Canadian ISPs' websites, but simply requiring that the information be available is a good step in the right direction.
Principle Four, on the topic of network management, is potentially problematic insofar as it limits FCC oversight. I have suggested that there be a division in what is and isn’t overseen by the FCC, a division reflective of some realities of Internet governance. In short, a two-track system would be established. The FCC would retain regulatory authority over non-technical issues such as privacy, economics, free speech, and so forth, and regulate novel instantiations of network management. It would ultimately harmonize technical management practices with standards established by international governance bodies such as the IETF.
Principle Five, concerning additional online services, has justifiably elicited a considerable degree of concern. I suggest that appliance-Internet services do not inherently endanger the generative-Internet, but that regulatory authority is required to ensure that carriers do not create contemporary netscapes of power. The FCC, as such, requires more than report-writing powers and thus Verizon-Google’s proposed ‘check’ to balance carrier power is insufficient as written by the corporate giants.
Principle Six maintains that the FCC should forbear from regulating the wireless environment. I note that similar language emerged in the Canadian network management proceedings, and that the CRTC shortly thereafter included wireless services in the management framework. As a result, the principle here doesn't strike me as 'scary', insofar as principles can be modified in the future, but I admit that I hold the following opinion: wireless regulation is critical given that the future of broadband is wireless, and the FCC will have to get involved at some point. Canada has decided that the time for regulation is now, and a proviso to revisit any American forbearance on wireless regulation is necessary should a decision be made not to regulate wireless immediately.
Principle Seven, case-by-case enforcement, needs to be significantly reworked. The FCC needs to retain rulemaking authority. This said, a ‘compromise’ might involve the measure noted under principle four, where there is a distinction between the ‘reasonable’ elements of network management and the technical elements of network management. The former would be exclusively under the jurisdiction of the FCC, and the latter would be largely drawn from international bodies’ proposed best practices and standards.
Principle Eight is designed to stop the third way; as I read it, the principle is an attempt to gut common carriage provisions for information services. Such a provision would be a massive setback for the FCC; this principle needs to be rejected out of hand.
Principle Nine is interesting; using the USF for broadband deployment in underserviced areas is relatively uncontroversial, but when combined with a renegotiation of intercarriage rates (which will likely increase rates for smaller ISPs) there is a risk that larger ISPs will draw on the USF to compete in regions exclusively serviced by smaller ISPs even as those smaller ISPs' carriage rates rise. When competition is combined with higher carriage rates, the smaller ISPs may be endangered, which could hurt rural consumers. The principle doesn't necessarily have to be rejected out of hand, but serious thought should go into the combined effects of opening the USF to broadband and (likely) higher intercarriage rates.
As a final note, I want to reiterate that while this is an area that I study, I learn more about it every day. What I've written are early, probationary thoughts. While I certainly hold the positions articulated in this post, those positions are subject to change with new information. If you disagree with me and/or think that I've misunderstood or misread things, please feel free to let me know; I'm actively interested in expanding my knowledge in this sphere of telecommunications policy. Given that this is an area of research I'll be developing over the next several months, all input is appreciated.
I should note that I'm incredibly uncomfortable with the term 'network neutrality' for various theoretical reasons. I hope to spell out this theory-based dislike of the term in the future. For the purposes of limiting the expansiveness of this post, I've avoided delving into these dislikes here, but such avoidance should not be taken as either agreement with the premises of the term itself or acceptance of any particular theory or framework of network neutrality.
 For a spectacular reveal of how copyright law is traditionally drafted in the US, see “The Art of Making Copyright Laws” and “Copyright and Compromise” in Litman’s Digital Copyright.
 For more, see Cowhey and Mueller. (2009). “Delegation, Networks, and Internet Governance” in Networked Politics: Agency, Power, and Governance (ed. Kahler). See also, Mueller. (2002). Ruling the Root and Bendrath and Mueller. (2010). “The End of the Net as We Know It? Deep Packet Inspection and Internet Governance” via SSRN.
 Winseck. (2003). “Netscapes of power: convergence, network design, walled gardens, and other strategies of control in the information age” in Surveillance as Social Sorting: Privacy, Risk and Digital Discrimination (ed Lyon).
For the full argument, see Zittrain. (2008). The Future of the Internet – And How to Stop It.
 I admit to being taken by Cooper’s (2010) position paper entitled “The Myth of Spectrum Scarcity: Why Shuffling Existing Spectrum Among Users Will Not Solve America’s Wireless Broadband Challenge“.
Christopher Parsons