And so it has come to this, the great copyright battle of our time. After a troubled process that has spanned various stages of development, we finally have a proposed compromise text for the Digital Single Market Directive, the latest step in the effort to overhaul copyright for the digital age. After the text was approved by the Parliament last year in a disappointing vote, a period of negotiation between the Council, the Commission and the Parliament followed to try to come up with a final text to put to a vote.
The negotiation was difficult. As we have discussed several times in this blog, some articles have been controversial because they set out obligations that could have nefarious effects on the Internet. Some countries were reluctant to sign on to such proposals, and opposition started to grow in various circles during the process. The impasse was broken when France and Germany agreed on a common position, and once that had been achieved, the rest fell into line.
There are many concerns about the compromise text, but for now we will concentrate on the dreaded Article 13, which is set to create an upload filtering requirement for intermediaries that will result in a very different Internet experience in Europe. The proposed text of Article 13 has proven to be extremely controversial because it is like trying to kill an ant with a bazooka. While there is a problem with some users infringing copyright online, the proposed solution is to impose restrictions that could have lasting effects on how we experience online content.
The text begins by defining the “online content sharing service provider”, which will be the subject of the regulation. These services are defined as follows:
“‘online content sharing service provider’ means a provider of an information society service whose main or one of the main purposes is to store and give the public access to a large amount of copyright protected works or other protected subject-matter uploaded by its users which it organises and promotes for profit-making purposes. Providers of services such as not-for profit online encyclopedias, not-for profit educational and scientific repositories, open source software developing and sharing platforms, electronic communication service providers as defined in Directive 2018/1972 establishing the European Communications Code, online marketplaces and business-to business cloud services and cloud services which allow users to upload content for their own use shall not be considered online content sharing service providers within the meaning of this Directive.”
So Art 13 is supposed to cover large online services that allow users to upload content, such as Facebook, Google, Twitter, etc. Some services are expressly excluded, such as online marketplaces (eBay), online encyclopaedias (Wikipedia), educational repositories, cloud services (Dropbox), and open source repositories (GitHub). So far so good, it won’t affect small and medium enterprises, right? Not really: the size of the company is not mentioned, only the fact that the service gives the public access to a “large” amount of copyright-protected works for profit. Most online companies start small: Instagram, Twitch, WhatsApp, etc. As long as your startup has an upload capability and a potentially large audience, it could be covered by this definition. I am worried that this will affect small sharing websites (think of platforms such as Imgur), but most importantly, it will stifle newcomers in Europe.
The article then proposes a substantial change to intermediary liability:
“Member States shall provide that an online content sharing service provider performs an act of communication to the public or an act of making available to the public for the purposes of this directive when it gives the public access to copyright protected works or other protected subject matter uploaded by its users. An online content sharing service provider shall therefore obtain an authorisation from the rightholders referred to in Article 3(1) and (2) of Directive 2001/29/EC, for instance by concluding a licencing agreement, in order to communicate or make available to the public works or other subject matter.”
This is momentous. For those unfamiliar with the intermediary liability regime, the current system grants immunity from liability for content uploaded by users as long as the service provider is not aware that the content may be infringing, which explains the various notice-and-takedown mechanisms in existence. The above completely changes the way in which this regime operates, and it makes service providers directly liable for the content uploaded by their users. The article suggests that service providers should enter into licensing agreements with content owners, which sounds feasible until one realises that this would mean any content owner: there are not just a few creators out there, we are talking TV, film, music, photography, anything.
The objective of Art 13 is clear. The copyright industry wants to get a slice of the online revenue pie, and it wants tech giants to pay licensing fees, even when content has been shared by users who “are not acting on a commercial basis or their activity does not generate significant revenues”. If the service provider doesn’t want to pay licensing fees, then it will be liable unless it puts steps in place to stop the potential infringement.
The directive doesn’t mention filters, but it clearly means filters. For example, to be exempt from direct liability, intermediaries have to demonstrate that they have:
“(a) made best efforts to obtain an authorisation, and
(b) made, in accordance with high industry standards of professional diligence, best efforts to ensure the unavailability of specific works and other subject matter for which the rightholders have provided the service providers with the relevant and necessary information, and in any event
(c) acted expeditiously, upon receiving a sufficiently substantiated notice by the rightholders, to remove from their websites or to disable access to the notified works and subject matters, and made best efforts to prevent their future uploads in accordance with paragraph (b).”
So service providers have to get a licence, or clear authorisation, which is onerous and expensive; failing that, they have to ensure that the works are unavailable (again, this means filters, or intrusive and extensive content moderation); and they must have a mechanism for removing content, which is essentially the notice-and-takedown system in existence now.
What happens if you upload something that is legitimate and it is removed, for example because your work is a parody or it is being used for educational purposes? Art 13 mentions that these uses shall not be affected, but this is just paying lip service to criticisms that the directive will erode exceptions and limitations. It is very difficult to code a system that filters out content AND also protects exceptions. The system is supposed to include redress for content that has been removed illegitimately, but it is difficult to see how intermediaries will be able to comply with both requirements easily.
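To see why “ensuring the unavailability” of works in practice means automated filtering, and why exceptions are so hard to honour automatically, it helps to sketch what such a filter minimally has to do. The sketch below is entirely hypothetical: the fingerprinting function, the rightsholder database and the threshold are stand-ins for whatever a real fingerprint-matching system would use, and the crude hash comparison is nothing like how a real matcher works. The point is only that a filter sees a similarity score, not the uploader’s purpose.

```python
# A minimal, hypothetical sketch of the kind of upload filter Article 13
# effectively requires. The names here (fingerprint, RIGHTSHOLDER_DB,
# MATCH_THRESHOLD) are illustrative stand-ins, not any real system's API.

import hashlib

# Reference fingerprints supplied by rightsholders (the "relevant and
# necessary information" mentioned in point (b) above).
RIGHTSHOLDER_DB = {
    "fingerprint-of-example-song": {"work": "Example Song", "owner": "Example Records"},
}

MATCH_THRESHOLD = 0.9  # arbitrary similarity cut-off


def fingerprint(upload: bytes) -> str:
    """Stand-in for a perceptual fingerprint; a real matcher would be far
    more sophisticated than a plain cryptographic hash."""
    return hashlib.sha256(upload).hexdigest()


def similarity(fp_a: str, fp_b: str) -> float:
    """Stand-in similarity measure between two fingerprints."""
    return 1.0 if fp_a == fp_b else 0.0


def filter_upload(upload: bytes) -> bool:
    """Return True if the upload may be published, False if it is blocked.

    Note what is missing: nothing here can tell whether the upload is a
    parody, a quotation or an educational use. The filter only sees how
    closely the bytes match a protected work, not why they were uploaded.
    """
    fp = fingerprint(upload)
    for reference_fp in RIGHTSHOLDER_DB:
        if similarity(fp, reference_fp) >= MATCH_THRESHOLD:
            return False  # blocked: matches a protected work
    return True  # published: no match found
```

Exceptions such as parody or quotation turn on context and intent, neither of which appears anywhere in that function, which is why any redress can only happen after the block, through some form of human review.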
Finally, while Art 13 states that it will not impose a monitoring requirement, it is evident that all of the above will require invasive and pervasive monitoring systems to be put in place.
So what is likely to happen?
If Article 13 is adopted as drafted, it’s almost certain that it will have an immediate effect on how the Internet operates. One only needs to look at what happened after the adoption of the GDPR to see that we may be faced with further balkanisation of the Internet. For many months after the new data protection regulation took effect, a sizeable number of websites restricted access to their content for European users. If the DSM Directive becomes law, it is easy to see something similar happening. Internet intermediaries will be faced with these choices:
- Enter into licensing agreements. This will be expensive and time-consuming, and it does not guarantee immunity from liability, as content could be uploaded that belongs to a copyright owner with which there is no agreement yet. It also implies a cost that will have to be passed on to the consumer, and it could be particularly punishing for smaller firms.
- Create a filtering and monitoring mechanism. This would be inevitable, even if the text never mentions such a system, as it appears to be the only way service providers could continue to operate while making “best efforts” to keep uploaded protected works unavailable. Only one large service provider already has such a system in place: Google, with YouTube’s Content ID. Everyone else will have to pay for expensive filtering capabilities.
- Give up. Faced with the two expensive and resource-intensive options above, quite a few services will almost certainly just leave European customers to their own devices and start making their services unavailable in Europe, as sketched below. This would appear to be the most logical solution for smaller providers that do not want to worry about the convoluted and complex system that is being proposed, and honestly, who could blame them?
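As for that third option, the post-GDPR precedent gives a sense of what “making their services unavailable in Europe” looks like in practice: a blunt geographical block at the edge of the service. The sketch below is purely illustrative; lookup_country() is a hypothetical stand-in for a GeoIP lookup, the country list is abbreviated, and HTTP status 451 (“Unavailable For Legal Reasons”) is the code a number of sites served to European visitors after the GDPR took effect.

```python
# A purely illustrative sketch of the "give up" option: blocking European
# visitors at the application edge. lookup_country() is a hypothetical
# stand-in for a real GeoIP lookup, and EU_COUNTRIES is abbreviated.

EU_COUNTRIES = {"AT", "BE", "DE", "ES", "FR", "IE", "IT", "NL", "PL", "SE"}


def lookup_country(ip_address: str) -> str:
    """Hypothetical GeoIP lookup returning an ISO 3166-1 alpha-2 code.

    A real service would query a GeoIP database here; this stub always
    answers "FR" so the sketch can be run end to end.
    """
    return "FR"


def handle_request(ip_address: str) -> tuple[int, str]:
    """Return an (HTTP status, body) pair for an incoming request."""
    if lookup_country(ip_address) in EU_COUNTRIES:
        # 451 Unavailable For Legal Reasons
        return 451, "This service is not available in your region."
    return 200, "Welcome!"


# With the stub above, every request appears to come from France and is blocked:
status, body = handle_request("203.0.113.7")  # -> (451, "This service is not ...")
```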
The next step is that the Directive will be discussed at committee level in the European Parliament, and then it will be put to a vote. Hopefully sanity will prevail and Article 13 will be defeated and cast back to the fiery chasm from whence it came.
Comments
Mike Palmer · February 25, 2019 at 12:20 pm
Upload filters won’t just be used for copyright infringement. Apparently there’s anti-terror legislation called TERREG on the horizon with similar conditions attached.