
European governments approve controversial new copyright law

Copyright overhaul could effectively mandate automated content filtering.


A controversial overhaul of Europe's copyright laws overcame a key hurdle on Wednesday as a majority of European governments signaled support for the deal. That sets the stage for a pivotal vote by the European Parliament that's expected to occur in March or April.

Supporters of the legislation portray it as a benign overhaul of copyright that will strengthen anti-piracy efforts. Opponents, on the other hand, warn that its most controversial provision, known as Article 13, could force Internet platforms to adopt draconian filtering technologies. The cost to develop filtering technology could be particularly burdensome for smaller companies, critics say.

Online service providers have struggled to balance free speech against copyright enforcement for close to two decades. Faced with this difficult tradeoff, the authors of Article 13 have taken a rainbows-and-unicorns approach, promising stricter copyright enforcement, no wrongful takedowns of legitimate content, and minimal burdens on smaller technology platforms.

But it seems unlikely that any law can achieve all of these objectives simultaneously. And digital-rights groups suspect that users will wind up getting burned—both due to wrongful takedowns of legitimate content and because the burdens of mandatory filtering will make it harder to start a new online hosting service.

The law could hurt smaller online content platforms

For almost two decades, copyright law in both the United States and Europe has maintained an uneasy standoff between rights holders and major technology platforms. Online platforms were shielded from liability for infringing content uploaded without their knowledge, provided that they promptly removed infringing content once they became aware of it.

Neither side of the copyright debate has been completely happy with this compromise. On the one hand, digital-rights groups have complained that the rules give platforms an incentive to take down content first and ask questions later. This gives copyright holders broad power to censor other people's content.

At the same time, copyright holders complain that the system makes it too difficult to police platforms for infringing content. Platforms have no obligation to proactively filter content submitted by users, and copyright holders say they're forced to play an endless game of whack-a-mole against infringing content.

Article 13 is designed to shift the balance of copyright law more toward rights holders. While the exact text of the current proposal hasn't been published, it's likely similar to a draft that was leaked last week by Pirate Party MEP Julia Reda. That version states that platforms will be liable for user-uploaded content unless they can demonstrate that they "made best efforts" to obtain authorization from copyright holders and have made "best efforts to ensure the unavailability of specific works and other subject matter for which the rightholders have provided the service providers with the relevant and necessary information."

What this would mean in practice is far from clear, especially since this language would need to be "transposed" into the national laws of more than two dozen EU member countries. But that last requirement seems to mandate that platforms hosting user-generated content adopt filtering technology akin to YouTube's ContentID system. At a minimum, it would give copyright holders more leverage as they pressure platforms to more actively police the content they host.

An obvious issue here is that Google says it has spent over $100 million developing the ContentID system. Google can afford to spend that kind of money, but smaller companies probably can't.
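To make the filtering idea concrete, here is a minimal, purely hypothetical sketch of the kind of matching an upload filter performs, assuming rights holders supply reference copies of their works. The names (register_work, check_upload) are illustrative, not any real API, and real systems such as ContentID rely on perceptual audio and video fingerprints that survive re-encoding, cropping, and clipping at enormous scale, which is where much of that cost comes from.

import hashlib

# Hypothetical reference database: fingerprints supplied by rights holders,
# mapped to the claimed work (the "relevant and necessary information").
reference_fingerprints: dict[str, str] = {}

def fingerprint(data: bytes) -> str:
    # Toy fingerprint: an exact SHA-256 digest of the raw bytes. A real filter
    # would need a perceptual fingerprint that tolerates transcoding and edits;
    # an exact hash is trivially evaded by changing a single byte.
    return hashlib.sha256(data).hexdigest()

def register_work(title: str, reference_copy: bytes) -> None:
    # A rights holder registers a reference copy of a protected work.
    reference_fingerprints[fingerprint(reference_copy)] = title

def check_upload(upload: bytes) -> str | None:
    # Returns the matched work's title, or None if nothing matches. Note that
    # even a perfect match cannot tell whether the use is licensed, quotation,
    # criticism, or parody; those are the judgment calls the law leaves to platforms.
    return reference_fingerprints.get(fingerprint(upload))

if __name__ == "__main__":
    register_work("Example Song (Example Label)", b"reference audio bytes")
    match = check_upload(b"reference audio bytes")
    if match:
        print(f"Possible match with {match}: block, monetize, or send to review")
    else:
        print("No match found: allow the upload")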

The latest drafts of Article 13 aim to address this objection in a couple of ways. First, they allow courts to take a number of factors, including the size of a company and its audience, into account when deciding whether a technology company is doing enough to battle piracy. Courts will also be able to consider "the availability of suitable and effective means and their cost for service providers." In other words, if a small technology company can show that it can't afford to build or acquire a system like ContentID, it might not get in trouble for not having one.

The proposal also includes a carveout for companies with less than €10 million in annual turnover. However, that exemption is of little practical use because it only applies for the first three years a company is in business.

The law could mean more bogus takedowns

The law also states, rather optimistically, that "cooperation between online content service providers and rightholders shall not result" in the removal of non-infringing works—including those that are covered by the European equivalents of fair use. Theoretically, users would retain the right to use works for quotation, criticism, review, and parody.

Of course, that's easier said than done. As long-time readers of Ars know, YouTube has received many bogus takedown requests targeting content that should be protected by users' fair use rights. While YouTube seems to have gotten better at filtering through these over time, incidents keep happening because it's genuinely difficult to tell which uses are fair and which aren't, especially when operating at YouTube's scale. The authors of Article 13 haven't discovered a new way to resolve this tension; they're just demanding that platform owners try harder.

The practical implications of Article 13 depend heavily on how it's implemented. If the measure becomes law, its vague text will need to be transposed into detailed regulations in every member country. Then those regulations will need to be interpreted by judges.

If the laws are implemented and interpreted by technology-friendly officials, Article 13 might do little more than codify the modest anti-piracy efforts most large platforms already undertake. Smaller companies might be able to point to that language about the "cost for service providers" and argue that proactive filtering simply isn't affordable for them. In this case, the impact of Article 13 might be fairly restrained.

On the other hand, a harsher interpretation of the law could have a big impact. Larger companies might be forced to adopt more intrusive content-filtering systems. Smaller companies could be forced to waste precious cash building (or licensing) complex and expensive filtering systems. Ironically, this could wind up entrenching the power of existing large platforms—which are mostly based in the United States.
