Author: Montagnani, Maria Lilla

Within the recent European policies and actions on illegal content, a trend towards the algorithmic enforcement of content regulation has emerged. Hard and soft law provisions more or less explicitly require online platforms to resort to technological systems in order to comply with the law. The use of technology to enforce the law is certainly not new, especially in the realm of copyright law. The latest step in this process is the employment of algorithmic systems to filter content uploaded by third parties and the use of autonomous decision-making to select the content that can appear online. This controversial legislative move raises concerns not only as to its consistency with the current legal framework but also as to its impact on individual rights and societal development in general. This paper proposes a regulatory toolkit for a more balanced algorithmic copyright enforcement that could, hopefully, also provide insights for a better algorithmic society overall.

CONTENTS
I. Introduction
II. Tackling Illegal Content Online in the DSM Strategy
    A. An Enhanced Liability Regime for Online Platforms
    B. A Conditional Liability Regime for Online Intermediaries
III. Algorithmic Enforcement in the DSM Strategy
    A. Misleading Content in the DSM Strategy
    B. IPRs Infringing Content in the DSM Strategy
    C. Harmful Content in the DSM Strategy
    D. Online Terrorist Content in the DSM Strategy
IV. Algorithmic Copyright Enforcement in the DSM
    A. The Early Phase of Algorithmic Copyright Enforcement: The Robo-notice Regime
    B. The Advanced Phase of Algorithmic Copyright Enforcement: The Voluntary Filtering Regime
    C. The Current Phase of Algorithmic Copyright Enforcement: Article 17 of the Directive on Copyright in the DSM
V. The Beautiful and the Ugly of Algorithmic Copyright Enforcement
    A. Algorithmic Copyright Enforcement and its Shortcomings
    B. Proposals to Overcome the Shortcomings of Algorithmic Enforcement
VI. Towards a Balanced Algorithmic Copyright Enforcement
    A. The Principle of a Balanced Algorithmic Copyright Enforcement
    B. Algorithmic Explainability: From Open Record Policies to a Right to Explanation
    C. A Rights-Based Impact Assessment and a Right to Audit: Safeguarding the Safeguards
VII. Conclusion

I. INTRODUCTION

The European Digital Single Market Strategy ('DSM Strategy') (1) introduced a strong trend towards algorithmic enforcement of the rules aiming at preventing illegal content online. (2) This is part of a broader movement towards algorithmic content regulation. (3) The trend surfaces from the analysis of the various hard and soft law provisions touching upon online intermediaries' liability that have been adopted by the European institutions to fight the upload and spread of illegal content online. (4) Although the safe harbor regime for online intermediaries set by Directive 2000/31/EC ('e-Commerce Directive') remains untouched, the several instruments adopted point toward a system in which intermediaries are required to implement technological measures not only to take down but also to prevent the (re)appearance of allegedly illegal content online. (5) This marks a shift from a regime in which the law is enforced after a violation has taken place (ex-post) to a system where technology ensures that violations do not occur in the first place (ex-ante). In this way, the technologies implemented to comply with the law move ahead of the threshold of protection and, in doing so, change the law itself.

The case of copyright law is probably the most explicit example of algorithmic enforcement and, more specifically, of the shift from an ex-post to an ex-ante system of technological enforcement of the law. Forms of algorithmic enforcement of copyright law were present well before the adoption of the controversial directive on copyright in the DSM ('DSM Directive'). (6) This instrument represents only the latest step in a process targeting online piracy that started several years ago. The shortcomings of this piece of legislation are widely known and have been discussed by many scholars, who highlight how a lack of accountability challenges legal principles such as freedom of expression, due process, and the right to self-determination.

In this article, I start from the proposition that the algorithmic society is here to stay and formulate a proposal for balanced algorithmic copyright enforcement. Given the technological developments we are witnessing in terms of autonomous systems and self-learning algorithms, technology will likely continue to provide an increasingly sophisticated compliance tool. If this is the case, we need to be well-equipped to govern the developments to come.

The European Union has already offered a variety of hard and soft law provisions that can add more balance to algorithmic enforcement. However, the many principles, recommendations, suggestions, and tools are not coherently organized--rather, they confirm the piecemeal approach that European institutions often adopt.

This article contributes to the existing debate on algorithmic copyright enforcement by gathering the acquis communautaire, organizing it in the most consistent manner possible, and trying to fill the emerging gaps. As there is no need to reinvent the wheel, this article considers the rules already available, adapts them to the current scenario, and adds the missing links. As a result, a regulatory toolkit is proposed to introduce more balance to algorithmic copyright enforcement. Ultimately, the goal of this toolkit is to provide useful insights for a better algorithmic society.

In the following pages, Section II illustrates the development of European policies targeting illegal content online and depicts how this interacts with the current liability regime for online platforms. Section III turns to the hard and soft law provisions that emerged from these EU policies, which drive towards the adoption of technology as a main tool of compliance. Section IV focuses on copyright law, as the recent debate surrounding Article 17 of the DSM Directive provides the clearest example of the shift from ex-post to ex-ante algorithmic enforcement. Section V summarizes the concerns raised by algorithmic copyright enforcement--and, more generally, by technologies of compliance--and illustrates the solutions so far envisaged. Section VI introduces the main features of the regulatory toolkit for balanced algorithmic copyright enforcement: since algorithms are here to stay, there is a pressing need to make them comply with the legal framework. Finally, Section VII concludes by stressing the need to continue the conversation on the unintended consequences of algorithmic regulation and enforcement.


II. TACKLING ILLEGAL CONTENT ONLINE IN THE DSM STRATEGY

    Tackling illegal content is one of the priorities of the DSM Strategy, as European institutions intend to ensure that what is illegal offline is also illegal online. This represents one of the many regulatory challenges in the process of creating the Digital Single Market. (7) An analysis of the documents articulating the DSM Strategy reveals that tackling illegal content requires an assessment of online intermediaries' role and activity.

    The DSM Strategy places special emphasis on three key pillars: (i) access for consumers and businesses to online goods and services across Europe; (ii) creating the right conditions for digital networks and services to flourish; and (iii) maximizing the growth potential of the digital economy. (8) The first two pillars touch upon the issue of illegal content and, more precisely, intermediaries' liability for illegal content. In this respect, the DSM Strategy clearly states that the "rules on the activities of intermediaries in relation to copyright-protected content" are to be clarified. (9) Similarly, in the pursuit of the second pillar's goal to optimize digital networks and services, the Commission stresses that online platforms' market power can potentially impact other participants in the marketplace. (10) While ensuring a level playing field is one concern, the need to guarantee that minors are protected from harmful content and that internet users are protected from hate speech and misleading content is an equally relevant consideration. (11)

    This section illustrates the European institutions' policy on tackling illegal content and its impact on platforms' liability. The main argument is that this policy and the enhanced role it assigns to platforms fit poorly within the conditional liability regime set out by the e-Commerce Directive.

    A. An Enhanced Liability Regime for Online Platforms

      The proposition that tackling illegal content requires a more active role for online intermediaries emerges very clearly in the EU Commission's 2017 Communication on tackling illegal content online. (12) Here, the Commission expressly calls for an enhanced "liability regime" for intermediaries due to the strategic role they perform in mediating internet users' access to content. (13) Moreover, whatever content platforms tackle, from incitement to terrorism, hate speech, and child sexual abuse material to infringements of IPRs, the Commission underlines that online platforms should "adopt effective proactive measures to detect and remove illegal online content." (14) While encouraging the use and development of automatic technologies to prevent the re-appearance of illegal content online, (15) the Communication also recalls the need to comply with fundamental rights and to implement safeguards limiting the risk of removing legal online content. (16)

      As a follow-up to the 2017 Communication, in March 2018 the Commission issued a recommendation which, this time, encompasses more concrete measures to effectively tackle illegal content online. (17) Even though in this document the Commission has elaborated in some more detail the...

