Facebook has announced that it will stop running its child abuse detection tools on content sent via the company's messaging app in the EU, citing new EU legislation that casts doubt on the legality of such tools.

A new amendment to the European Electronic Communications Code redefines the term "electronic communications services" to include messaging services such as Facebook Messenger. The EU e-Privacy Directive, which relies on the Code's definition of "electronic communications services," forbids operators of such services from processing users' content or traffic data, on account of the confidentiality owed to telecom subscribers, even when the purpose of the processing is detecting child sexual abuse.

Although the EU Council was aware of the lacuna the new amendment would create, it failed to adopt a temporary exception permitting the blocking of child abuse content before the amendment came into force on December 21, 2020. The Council has instead announced that it will propose legislation to tackle online sexual abuse of children by the second quarter of 2021.

Other companies, such as Microsoft, Google, and LinkedIn, have announced that they will not cease their voluntary efforts to detect and remove child abuse content despite the amendment, viewing this as the responsible approach until the EU legislature resolves the issue.

See the EU Council's press release regarding the amendment.

See Microsoft's announcement.
