European regulators are inquiring into child safety concerns at Meta

Meta apps on iPhone

European Union regulators have issued Meta another formal request for information, asking the company to provide further details about its measures to address child safety concerns on Instagram. This includes specifics on how Meta handles the risks posed by the sharing of self-generated child sexual abuse material (SG-CSAM) on the platform.

The request was made under the Digital Services Act (DSA), the European Union’s recently revamped online regulatory framework, which began applying to larger platforms, including Instagram, in late August. The DSA places obligations on major tech companies to tackle illegal content, requiring them to implement measures and safeguards that prevent the misuse of their services. Given the regulation’s strong emphasis on the protection of minors, it is unsurprising that several of the European Commission’s initial Requests for Information (RFIs) have centred on child safety.

The Commission’s latest request to Meta follows a report by The Wall Street Journal (WSJ) indicating that Instagram is struggling to resolve a child sexual abuse material (CSAM) problem first exposed earlier this summer. That report revealed that Instagram’s algorithms were connecting a network of accounts involved in the creation, purchase, and trade of underage-sex content.

Several months after that initial revelation, a follow-up WSJ report alleges that Meta has failed to fix the issues identified. Despite setting up a child safety task force intended to stop “its own systems from enabling and even promoting a vast network of pedophile accounts,” as the newspaper put it, the reported problems persist.

Meta’s inconsistent record on tackling the sharing of CSAM and SG-CSAM, coupled with ineffective action on the associated child safety risks, could carry significant financial consequences for the company in the European Union. Under the DSA, the Commission can impose fines of up to 6% of a company’s global annual turnover if it finds the regulation’s rules have been breached.

This marks the third RFI Meta has received since the DSA’s rules began applying to it, and the second focused specifically on child safety concerns on Instagram. The EU has also asked Meta about its handling of content risks related to the Israel-Hamas war and its efforts to safeguard election security.

So far, the EU has not opened any formal investigation proceedings under the DSA.
