The European Commission wants to oblige internet companies such as Google, Facebook and Apple to detect and remove images of child sexual abuse. According to Brussels, they still do too little of their own accord. The new rules are meant to prevent such images from spreading and to help track down perpetrators.
The number of reports of photos and videos of child sexual abuse is increasing sharply every year. In 2021, the National Center for Missing and Exploited Children (NCMEC) in the United States collected 29.4 million reports from US providers, representing 85 million photos and videos. That is a 35 percent increase from 2020. Even that may be only the tip of the iceberg: online companies are not required to report such content everywhere, and a lot of material stays under the radar.
The European Commission now wants to do something about it. It proposes to oblige providers to detect, report and remove images of sexual abuse. A duty to report child pornography already exists in the US, but the EU would set a precedent by also requiring those images to be removed from the net.
“The amount of material relating to child sexual abuse circulating online is mind-boggling,” said European Commission Vice-President Margaritis Schinas. “The rules we are proposing provide clear, targeted and proportionate obligations for service providers to detect and remove illegal child sexual abuse content.”
Hilde Vautmans (Open VLD), who focuses on children’s rights in the European Parliament, is pleased that the Commission is “finally coming up with bold legislation”. She points out that Europe is the largest host of child sexual abuse material online. “In 2021, no fewer than 62 percent of all illegal images were on European data servers. Paedophiles today have free rein in our unregulated digital Wild West.”
In 2019 and 2020, 95 percent of all reports to the American NCMEC came from Facebook. Last year, that amounted to more than 22 million reports in absolute terms. Instagram and WhatsApp also generated millions of notifications. A European expertise centre for preventing and combating abuse will be established in The Hague to support these companies in their new task. It will also be in close contact with the national police forces and Europol.
The tech companies will also need to assess the risk of their services being used as a grooming platform, where perpetrators approach children and young people in order to abuse them. The national authorities of the EU Member States will monitor whether the companies fulfil this obligation.
The Commission proposal now moves to the Member States and the European Parliament. The Commission is convinced that it has built in sufficient safeguards to ensure that only information strictly necessary to detect abuse will be used, so that the fundamental rights of internet users remain protected.
The negotiators will also have to consider a deadline for the entry into force of the regulation. It is intended to replace the temporary derogation from the European ePrivacy Directive, which expires in 2024. When the scope of that Directive was extended two years ago, providers could only apply their detection technology with users’ consent. Because the legal basis for tackling child pornography was lost, the number of reports dropped almost immediately. European legislators remedied that shortcoming with a temporary exception. That derogation should not be allowed to expire before the new regulation enters into force.
On Wednesday, the European Commission also proposed a strategy to make the internet safer for children. It wants to do this by providing them with the skills and tools to cope better with risks in the digital environment, such as cyberbullying. Concrete policy, however, remains the responsibility of the Member States.