Existing legislative and regulatory systems are failing to prevent the dissemination of harmful content online. One possible solution is to create internet super regulators. Here we explore the challenges they face, the compromises they invariably have to make, and an alternative approach.
The problem
Lawmakers are realising that existing regulatory systems do not provide the tools required to deal with the challenges of harmful content online. Efforts to patch old systems have resulted in content safety ecosystems that are complex to navigate and/or full of gaps.
Enter the internet super regulators
A wide range of organisations deliver content via the internet, and there is a wide range of potential harms online. The one thing they all have in common is the use of the internet. One possible option is to cast a big regulatory net over all those organisations and all those issues. This is the role of internet super regulators.
Effective regulators
An effective regulator has to have the right regulatory tools – and clear parameters about how and what it regulates. It needs to have the right people with the right skills to engage with the sector it regulates. It builds relationships with the organisations it oversees and can use a range of tools to ensure compliance and consumer confidence in the sector.
But the internet is not a sector
The internet is the global system of interconnected computer networks that uses the internet protocol suite. It is a part of almost everything we do. As a defining characteristic, "uses the internet" describes a category with too many participants and too much complexity to form a manageable sector.
Compromises
The breadth and volume of content included in a super regulator's remit will invariably lead to substantial compromises that reduce its ability to improve safety outcomes. One option is to limit regulatory efforts to the bigger platforms, where higher volumes of content are delivered. But larger platforms are typically already the most engaged in the safety ecosystem, while the smaller ones are left to continue causing havoc.
Regulators may instead choose to focus on the areas of highest risk (such as child sexual abuse material or violent extremism). Codes of practice provide a logical tool for controlling large groups of organisations – but they tend to set minimum standards rather than pushing organisations to improve their safety systems.
A healthy safety ecosystem
For illegal content (such as CSAM), the decision on whether content should be removed or allowed to remain online is simple. Beyond that, any decision to remove content must be balanced against freedom of speech and other protected rights. Even within the safety field there are competing forces such as privacy, transparency, and security. Recent debates around encryption highlight the tension between two facets of online safety: law enforcement's ability to protect children online and user privacy.
A healthy online safety ecosystem requires those different interests to be continuously represented. As technology evolves, a process of debate and negotiation ensures those various interests remain represented and appropriately balanced. By contrast, a super regulator becomes the sole arbiter, and is less likely to evolve and adjust those balances.
News media carve out
Even reputable news media will sometimes publish content online that causes harm, so it is tempting to place them within the remit of the internet super regulator. However, a healthy news media is important to online safety. It is critical in countering mis- and disinformation and acts as a counterbalance to poor regulatory and legislative decision making – including regulation affecting safety online. News media therefore really has to be regulated separately – and a huge swathe of content is removed from the internet super regulator's remit.
Clumping
Millions of individual people produce harmful content every day. Regulators cannot regulate at that scale, so they target the platforms that facilitate the delivery of that harmful content. Modern regulatory regimes try to match liability to the degree to which those platforms influence what users see, but they tend to clump organisations together into a handful of categories. Within each category are significantly different platforms running systems with equally varying levels of influence over how and what content is delivered to users. To cover such a wide range of platforms, regulation naturally has to accept the lowest common denominator – limiting the pressure it places on the majority of platforms to improve.
Complex systems
A digital society is a complex system. Complex systems are characterised by intricate interdependencies and multiple variables – and require a comprehensive problem-solving strategy that breaks the complexity down into manageable components.
The Super Coordinator
The first thing most online safety ecosystems really need is better coordination. Rather than jumping straight to concentrating regulatory power in one place, I'd recommend concentrating online safety expertise first. Rather than trying to remove the complexity of a complex system, accept it and set about breaking that complexity down into manageable components. Rather than focusing on playing a better game of whack-a-mole with harmful content, focus energy on looking for ways to change the game.
A super coordinator agency would do whatever is required to advance trust and safety in the digital society. Because it is not a regulator, it could work collaboratively with all sectors and stakeholders. It could undertake research, develop codes, recommend legislation, and build online safety infrastructure. It could ensure that all the tools in the online safety toolbox are utilised.