Meta: EU asks Meta to clarify measures against child sexual abuse by December 22

For the last few months, the European Union’s (EU) tech regulators have been asking social media giant Meta for more details on the measures it has adopted to tackle ‘unwanted’ content on its popular photo and video-sharing app Instagram. In October, the European Commission sent its first request to the Mark Zuckerberg-led company, seeking information on the measures taken to curb the spread of terrorist and violent content on its social media platform. Last month, the regulator sent another request asking for details about the measures Meta has adopted to protect minors.
Now, the EU watchdog has given the company a December 22 deadline to provide more details on the measures it has taken to tackle child sexual abuse material. EU regulators also noted that if Meta fails to comply with the request, it will risk a formal investigation under the new online content regulation rules.
What EU regulators said
In a statement to the news agency Reuters, the European Commission said: “Information is also requested about Instagram’s recommender system and amplification of potentially harmful content.”
The EU’s Digital Services Act (DSA) allows its regulators to request such information from tech companies. Under the new law, the EU can require major tech companies to take stricter action against illegal and harmful content on their platforms. Failure to comply with such requests can lead to a formal probe and even fines.

Chinese social media giant ByteDance’s TikTok and the Elon Musk-owned micro-blogging site X have also received similar requests for information from the EU.

In June, a report claimed that Instagram’s recommendation algorithms promote networks of paedophiles, who were accused of commissioning and selling child sexual abuse content on the popular photo-sharing platform.
If Meta fails to comply with the DSA and is unable to check the spread of child sexual abuse material (CSAM), the company may have to pay a penalty of up to 6% of its global annual turnover.
