EC Tells Platforms to Fight Illegal Content or Face Regulation, Drawing Some Jeers, CCIA Praise
The EU will give online platforms the chance to tackle illegal online content voluntarily but will legislate if necessary, the European Commission said Thursday at a news conference. Its communication presses platforms to shoulder more responsibility for removing terrorist propaganda, racist content and hate speech, using automated technology to detect and permanently remove such material. The statement drew criticism from members of the European Parliament, ISPs and digital rights activists. The Computer & Communications Industry Association praised the "welcome initiative."
The EC said it will monitor progress by platforms in coming months and assess by May whether additional measures are needed. The commission is prepared to consider legislation if voluntary compliance fails, Security Union Commissioner Julian King told reporters. "What is illegal offline is also illegal online," the EC statement said. The general legal framework for illegal content removal is the e-commerce directive, which harmonizes the rules under which certain platforms are held to be exempt from liability for hosting such content, it said. The EC is committed to retaining that liability regime, it said. Content illegality will be determined by EU and national law, the EC said.
The statement recommends platforms work closely with law enforcement and other agencies to ensure they can be contacted quickly for removal requests. Platforms should have "trusted flaggers," specialized entities with expertise in identifying illegal content, such as Europol's Internet Referral Unit for terrorist content. Criteria based largely on respect for fundamental rights and democratic values could be agreed on at industry level, the EC said. Platforms also should give ordinary users a way to signal illegal content, it said. Google, Facebook and Microsoft didn't immediately respond to requests for comment.
The guidance calls for platforms to take proactive steps to identify and take down illegal content. Taking such steps doesn't automatically cost a service provider its liability exemption, it said: That safeguard is reserved for hosting providers that store information for third parties without playing an active role in controlling, or having knowledge of, the information, and it applies only to providers that lack actual knowledge of the illegal activity and act quickly to remove it when they become aware of it. CCIA said the "'Good Samaritan' provision ... is a promising step in the effort to tackle infringing content online. Such clarification will help further strengthen the digital sector's longstanding engagement in this fight."
The EC wants companies to use automated detection and filtering technologies. They should be transparent about their content policies and about how they handle and act on notices of illegal activity, it said. To prevent over-removal of information, "it is important to ensure that sufficient safeguards are available so that content which was erroneously removed can be reinstated," it said. Platforms also should ensure that illegal content that has been taken down doesn't reappear.
The guidance brought jeers from stakeholders.
"Automatic filters are not the solution," said MEP Jan Philipp Albrecht, of Germany and the Greens/European Free Alliance. He called for a "coherent EU-wide approach" to how platforms deal with criminal content, saying that upholding 28 national legal systems "is not only absurd" but also gives platforms wiggle room to create their own principles and definitions. MEP Julia Reda, also of Germany and the Greens/ EFA, said the idea that complex questions of internet policy can be solved with automated tools is "misguided and dangerous" and will lead to over-blocking.
The guidelines endorse the trend toward forcing online intermediaries "to play judge, jury and executioner," the European Internet Services Providers Organization said. Any move toward a notice-and-stay-down regime, a possibility the guidelines leave open, would frustrate fundamental rights, it said. Because standards of illegality are defined country by country, ISPs are "simply unable to properly assess the context-dependent legality of content," it said.
The guidance focuses on monitoring to remove content internet companies might decide is illegal, but it "presents few safeguards for free speech, and little concern for dealing with content that is actually criminal," said European Digital Rights. The intention is to push companies into adopting the measures "voluntarily," with the threat of imposing them by law, said Ross Anderson, security engineering professor at the University of Cambridge Computer Lab. Most of the measures couldn't be implemented by law because they're illegal, and companies prefer this route because they don't like law, he said. The idea of putting businesses like Google and Facebook in charge of online censorship worries many nongovernmental organizations, he added. CCIA, in contrast, called the communication a "welcome initiative for a more aligned approach on the removal of infringing content" across the EU.