For Content Regulation to Work, the Regulator Must Be Empowered and Expert

The UK Online Safety Bill properly assigns implementation and enforcement to a regulator.

October 9, 2023
(Photo illustration by Jonathan Raa/NurPhoto via REUTERS)

The United Kingdom’s new Online Safety Bill sets rules for regulating online content, applying to companies that provide services to a significant number of UK residents. In many ways the bill parallels the European Union’s Digital Services Act, adopted last October, in imposing procedural and process rules to bring online content disorder under control without infringing on free speech and privacy. Some elements of the new UK law might be usable in other jurisdictions, including Canada and the United States.

The major thing the new law gets right is assigning implementation and enforcement to a regulator, rather than attempting to write the details of policy into the text of the legislation. The law instructs the regulator to make specific determinations based on policies set in the law itself, but the details of the final balancing of sometimes-conflicting policy goals are left to the regulator.

That regulator is the Office of Communications, or Ofcom, the agency that has historically regulated broadcasting and telecom in the United Kingdom, much as the Federal Communications Commission does in the United States and the Canadian Radio-television and Telecommunications Commission does in Canada. In preparation, the agency has been hiring hundreds of staff to implement and enforce the law, including some very senior and experienced people.

In principle, I think the ideal digital regulator would be a new agency. In my forthcoming book, Regulating Digital Industries, I argue that the United States should establish a digital regulator empowered to set competition, privacy and content moderation rules for companies operating in core digital industries, which include search, e-commerce, social media, the mobile app infrastructure and advertising technology. Given the political realities in the United States, creating the new agency would likely be done in two stages, the first being to assign the key digital responsibilities to an existing economy-wide law enforcement agency, the Federal Trade Commission. Ultimately, I think Congress would have to move jurisdiction to a new sectoral regulator to provide the needed expertise and focus.

The Canadian government is seeking to put in place “a transparent and accountable regulatory framework for online safety in Canada.” To do so, however, it seeks to establish a new regulator, the Digital Safety Commissioner of Canada, separate from its communications regulator.

But whether it’s a new agency or an existing one, a key element in implementing and enforcing a new regime for online content regulation is to make sure that regulatory agency is empowered and expert.

Another thing the Online Safety Bill gets right is to require the regulated industry to pay the costs of its own regulation. The regulated industry covers search, user-to-user services and pornography sites. It does not include ordinary websites, e-commerce, email or traditional SMS messaging services based on telephone numbers. The covered companies would get a notice from Ofcom saying they fall within the bill’s scope, stating their obligations under the law and informing them of their regulatory fee. In a recent Atlantic Council podcast, Richard Allan, a member of the UK House of Lords, noted that even though it is not explicitly a licensing fee, it would feel like one to the regulated companies. This funding source ensures that the agency will have the budget it needs to hire expert staff and to conduct the investigations required to enforce the many new requirements in the bill.

A third thing the legislation gets right is strong enforcement, with provisions for fines of up to 10 percent of annual turnover for companies that have not satisfied their obligations under the law. Companies would not be able to treat a slap on the wrist for non-compliance as a cost of doing business.

However, the legislation remains controversial in several respects. Three such concerns are its provisions on legal but harmful content, age assurance and encryption. In each case, the bill takes a responsible approach by assigning a key role to the regulator to make expert and balanced policy decisions within the framework set by the law.

Harmful Material

The Online Safety Bill initially required companies to take steps in connection with government-defined harmful speech, including material relating to suicide; deliberate self-injury; eating disorders; or abuse or incitement of hatred toward people because of their race, religion, sex, sexual orientation, disability or gender.

This requirement was supposedly removed in late 2022, but it actually remained in two places: an absolute ban on harmful material for children, and a requirement to develop tools that let adult users prevent their own exposure to harmful material.

People who object to any government-defined category of harmful speech still do not like this approach, and it might not work in the United States with its hyper-libertarian free speech rules. Moreover, the approach is less than effective in dealing with the harms such speech causes. It allows disinformation, hate speech and racism to flourish online, except in the feeds of children and of adults who choose not to see it. As the events of January 6, 2021, showed, we won’t be able to protect ourselves from the harmful effects of online information disorder by putting our heads in the sand and pretending it’s not there.

Given this approach to harmful speech, however, the law assigns Ofcom the right role. The regulator is required to issue guidance on what material is harmful, guidance that is needed to make the legislative categories operational for companies in scope. Ofcom can also propose additional categories of harmful speech, with final approval resting with Parliament.

Age Assurance

The law’s special obligations to protect children create a need to know who the children are. Online services intended for adults might restrict their audience to 18-plus in an attempt to avoid these obligations. But even this restriction requires age assurance. Effectively, the law requires age assurance for everyone.

Many mechanisms could be used: biometric age estimation; reliance on trusted third parties who know a user’s age; or uploading a scan of a government-issued ID such as a passport. Ofcom is given the role of balancing a high level of age assurance with respect for privacy.
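To make the trusted-third-party option concrete, here is a minimal sketch of how such a flow could work, assuming a hypothetical verifier that already knows a user’s age and issues a signed “over 18” claim that a platform can check without ever learning the user’s date of birth. The names, the shared key and the signing scheme are my own illustration, not anything specified in the bill or in Ofcom guidance.

```python
import hashlib
import hmac
import json
import time

# Placeholder secret shared between the hypothetical verifier and the platform.
VERIFIER_KEY = b"demo-shared-secret"


def issue_age_claim(user_id: str, over_18: bool) -> dict:
    """Run by the hypothetical third-party verifier that already knows the user's age."""
    payload = json.dumps(
        {"user": user_id, "over_18": over_18, "issued": int(time.time())},
        sort_keys=True,
    )
    signature = hmac.new(VERIFIER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": signature}


def platform_accepts(claim: dict) -> bool:
    """Run by the platform: trust the claim only if the signature verifies."""
    expected = hmac.new(VERIFIER_KEY, claim["payload"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, claim["signature"]):
        return False
    return bool(json.loads(claim["payload"])["over_18"])


claim = issue_age_claim("user-123", over_18=True)
print(platform_accepts(claim))  # True: age is assured without sharing a date of birth
```

The design point is precisely the trade-off Ofcom must weigh: the platform learns only a yes-or-no age assertion, while the identity documents stay with the verifier rather than the platform.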

This is also an area where regulatory experience is needed to resolve a policy issue beyond the expertise of legislators. Ofcom would be well advised to consult with the Information Commissioner’s Office (ICO), the UK privacy regulator, in making decisions about age assurance. The UK Digital Regulation Cooperation Forum, which brings together Ofcom, the ICO and the Digital Markets Unit of the Competition and Markets Authority, is the ideal mechanism for this consultation. But the final decision rests with Ofcom, not the privacy regulator, which might not be ideal. In my book, I argue that a single agency with authority over both digital privacy and online content moderation would be in the best position to make a final judgment that balances the conflicting policy goals.

Encryption

The new law permits Ofcom to require messaging platforms to use “accredited technology” to identify illegal content such as terrorist material or child sexual abuse material. Ofcom would first have to “accredit” the technology in some fashion and also determine that such an order is “necessary and proportionate.” But it would be well within its authority under the law to do this.

This provision appears to be the legislative counterpart of a ministerial statement made to the House of Lords over the summer: if reasonable and technically feasible solutions do not exist, Ofcom cannot require companies to scan encrypted messages for this illegal content.

Under this provision, Ofcom could apparently mandate, for example, client-side scanning, where images are inspected on user devices before being encrypted and transmitted. Apple tried to introduce client-side scanning in 2021 but withdrew the plan after heavy privacy criticism. Still, it appears to be a technically feasible way to have platforms stop the transmission of illegal material.
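For readers unfamiliar with the mechanism, here is a minimal sketch of the basic client-side scanning idea, assuming a hypothetical hash list of known illegal images supplied by an accredited provider. Real deployments use perceptual hashes that survive resizing and re-encoding, along with many additional safeguards; nothing here reflects any technology Ofcom has accredited.

```python
import hashlib

# Hypothetical hash list of known illegal images, distributed to the client
# by an accredited technology provider. The entry below is a placeholder.
KNOWN_ILLEGAL_HASHES = {"0" * 64}


def image_hash(image_bytes: bytes) -> str:
    # Real systems use perceptual hashing; SHA-256 keeps this sketch self-contained.
    return hashlib.sha256(image_bytes).hexdigest()


def ok_to_encrypt_and_send(attachments: list[bytes]) -> bool:
    """Client-side check run before encryption: matches are blocked on the device."""
    return all(image_hash(a) not in KNOWN_ILLEGAL_HASHES for a in attachments)


if ok_to_encrypt_and_send([b"holiday photo bytes"]):
    print("encrypt and transmit")
else:
    print("blocked on the device before transmission")
```

The privacy objection is visible even in this toy version: the check runs on the user’s device before encryption, so the protection of end-to-end encryption no longer extends to the content being scanned.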

The controversy over encryption swirled over the summer and has not ended. Meredith Whittaker, president of Signal, an encrypted messaging service, threatened that Signal would leave the United Kingdom if the law stopped it from encrypting messages. After the law’s adoption, she said that even placing in Ofcom’s hands the authority to possibly mandate client-side scanning sets a “precedent for authoritarian regimes.” Whittaker has reiterated the threat to withdraw Signal from the United Kingdom if Ofcom mandates an end to encryption.

For now, these three concerns are still to be resolved. But they have been put in the right hands: an expert regulator able to assess the technical and policy issues and make an informed call that is well beyond the knowledge and capacity of legislators.

Much now depends on how well Ofcom does its job. If it takes advantage of its mandate to make detailed decisions within parameters set by the legislature and uses its industry-funding mechanism to hire enough capable and expert staff, it might show other jurisdictions how to move the online world away from information disorder while still remaining true to the principles of free expression and privacy.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Mark MacCarthy is an adjunct faculty member at Georgetown University. He is the author of Regulating Digital Industries: How Public Oversight Can Encourage Competition, Protect Privacy and Ensure Free Speech (Brookings, 2023).