EU Code of Conduct against online hate speech: latest evaluation shows slowdown in progress

The European Commission has released the results of its seventh evaluation of the Code of Conduct on countering illegal hate speech online. This year’s results unfortunately show a decline in companies’ notice-and-action performance: the share of notifications reviewed by the companies within 24 hours dropped compared with the last two monitoring exercises, from 90.4% in 2020 to 81% in 2021 and 64.4% in 2022. TikTok was the only company to improve its assessment time. The removal rate, at 63.6%, is also considerably lower than at its peak in 2020 (71%); only YouTube performed better on this parameter than in the previous two years. There is, however, a positive development in the frequency and quality of companies’ feedback to users, something the Commission had called on companies to improve in its 2021 report.

The Commission will continue to monitor the implementation of the Code of Conduct and will support IT companies and trusted flagger organisations in implementing the action framework agreed in the context of the Code. The Digital Services Act (DSA), which entered into force on 16 November 2022, provides comprehensive rules on platforms’ responsibilities and will further support co-regulatory frameworks. The Commission will discuss with the IT companies how to ensure that implementation of the Code supports compliance with the DSA and adds value in the specific areas of tackling hate speech and protecting freedom of expression online. This process may lead to a revision of the Code of Conduct in the course of 2023.

The Digital Services Act lays down rules for online intermediary services, which millions of Europeans use every day. The obligations placed on different online players match their role, size and impact in the online ecosystem. Building on the experience from the Code and its monitoring exercises, obligations on clear notice-and-action systems, priority treatment of notices from trusted flaggers, feedback to users on their notices, and extensive transparency seek to address the identified shortcomings. Specific rules apply to very large online platforms reaching more than 45 million users in Europe. These platforms, given their systemic role, must assess the risks their systems pose and take mitigating measures to curb the dissemination of illegal content and address other societal risks, such as negative effects on fundamental rights or the spread of disinformation. Performance under the Code sets a benchmark for how such platforms tackle illegal hate speech.

Source: https://ec.europa.eu/commission/presscorner/detail/en/ip_22_7109