Ireland’s media regulator, which oversees a raft of tech giants’ compliance with the general rules of the EU’s Digital Services Act (DSA), said it is reviewing how major platforms let users report illegal content, following a high number of complaints.
On Thursday, the Coimisiún na Meán (CNM) said one in three DSA complaints it has received since the general rules started to apply in February related to difficulties in reporting illegal content online.
The review is looking at the tools and processes offered by Dropbox, Etsy, LinkedIn, Meta (Facebook and Instagram), Pinterest, Shein, Temu, TikTok, Tumblr, YouTube and X. Another less familiar service, called Hostelworld, is included in the sweep.
The DSA is the European Union’s new online governance and content moderation framework. It seeks to ensure digital services and platforms have effective tools and processes in place to enforce their own rules and act on reports of illegal content, such as terrorist content and child sexual abuse material (CSAM).
Penalties for breaches of the regime can reach 6% of global annual turnover, so any violation of the rules on reporting illegal content could prove costly for Big Tech firms.
Content reporting tools and points of contact
The DSA stipulates that platforms must have easy-to-access and user-friendly mechanisms for reporting illegal content (Article 16). They must also provide a clear and accessible point of contact (Article 12) that users can turn to with issues.
Some platforms have already faced issues in both of these areas. X’s compliance with the DSA’s Article 16 has, for example, been under investigation by the European Commission since December. And Meta is currently being investigated by the EU over illegal content reporting. In July, X lost a lawsuit brought by a Dutch citizen that included a complaint about the platform breaching the DSA’s Article 12.
The CNM’s review is significant, as it could lead to wider enforcement action against X if the regulator ends up confirming that the platform has broken the rules.
The Irish regulator’s review covers all the named platforms’ compliance with Article 12. It will also look at Article 16 compliance for all the named platforms except Meta, Shein, and X, since the European Commission has already opened DSA investigations or reviews into those three.
While the CNM said its review is still at the “information-gathering phase,” the move looks significant given how many major services are involved. It’s also the first wide-ranging DSA action by the regulator.
“An Coimisiún is now initiating a formal review of online platforms’ systems, to ensure that the platforms are complying with their obligations under the EU Digital Services Act (DSA),” it wrote in a press release, adding that the review could lead to “possible formal enforcement action and investigation.”
The Irish media watchdog plays an outsized role in monitoring major tech platforms on account of how many have opted to locate their EU headquarters in the country.
“Once the information gathering phase is complete, Coimisiún na Meán will engage with the platforms to ensure that their reporting mechanisms and points of contact comply with the requirements of the DSA,” it wrote, adding that it could issue a “compliance notice” directing platforms to address any shortcomings identified.
“If this does not lead to changes and improvements, An Coimisiún can open a formal investigation. If the investigation leads to a finding of non-compliance, Coimisiún na Meán can impose sanctions, such as a fine,” it said.
Niamh Hodnett, online safety commissioner at Coimisiún na Meán, said in the statement: “We are committed to using the full range of powers available under our Online Safety Framework to hold platforms to account for keeping people safe online.
“Through the DSA, our upcoming Online Safety Code and the EU Terrorist Content Online Regulation, we are working towards a digital landscape where adults and children can go online without fear of being harmed by the content or behaviour they come across. When people see illegal content, they should report it to the platform where they saw it and if they aren’t satisfied with the platform’s response, or if they can’t find an easy way to report the content, they should get in touch with us.”