It included a series of commitments by Facebook, Twitter, YouTube and Microsoft to combat the spread of such content in Europe. An evaluation carried out by NGOs and public bodies in 24 Member States, released on the first anniversary of the Code of Conduct, shows that the companies have made significant progress in following up on their commitments.
Andrus Ansip, European Commission Vice President for the Digital Single Market, welcomed progress: "Working closely with the private sector and civil society to fight illegal hate speech brings results, and we will redouble our joint efforts. We are now working to ensure closer coordination between the different initiatives and forums that we have launched with online platforms. We will also bring more clarity to notice and action procedures to remove illegal content in an efficient way – while preserving freedom of speech, which is essential."
Věra Jourová, EU Commissioner for Justice, Consumers and Gender Equality, said, "The results of our second evaluation of the Code of Conduct are encouraging. The companies are now removing twice as many cases of illegal hate speech and at a faster rate when compared to six months ago. This is an important step in the right direction and shows that a self-regulatory approach can work, if all actors do their part. At the same time, companies carry a great responsibility and need to make further progress to deliver on all the commitments. For me, it is also important that the IT companies provide better feedback to those who notified cases of illegal hate speech content."
The European Union is founded on the values of respect for human dignity, freedom, democracy, equality, the rule of law and fundamental rights. The EU and its Member States, together with social media companies and other platforms, have a responsibility to act so that the internet does not become a free haven for illegal hate speech and violence.
By signing the Code of Conduct, the IT companies committed in particular to reviewing the majority of valid notifications of illegal hate speech in less than 24 hours and to removing or disabling access to such content, if necessary, on the basis of national laws transposing European law. The Code also underlined the need to further discuss how to promote transparency and encourage counter and alternative narratives.
One year after its adoption, the Code of Conduct on countering illegal hate speech online has delivered some important progress, while some challenges remain:
- On average, in 59% of the cases, the IT companies responded to notifications concerning illegal hate speech by removing the content. This is more than twice the level of 28% that was recorded six months earlier.
- The proportion of notifications reviewed within 24 hours improved from 40% to 51% over the same six-month period. Facebook is, however, the only company that fully achieves the target of reviewing the majority of notifications within the day.
- Compared with the situation six months ago, the IT companies have become better at treating notifications coming from citizens in the same way as those coming from organisations that use trusted reporter channels. Still, some differences persist, and the overall removal rates remain lower when a notification originates from the public.
- Finally, the monitoring showed that while Facebook sends systematic feedback to users on how their notifications have been assessed, practices differed considerably among the IT companies. Quality of feedback motivating the decision is an area where further progress can be made.
Improvements in the handling of complaints from users and cooperation with civil society
Within the last year, the IT companies have strengthened their reporting systems and made it easier to report hate speech. They have trained their staff and they have increased their cooperation with civil society. The implementation of the Code of Conduct has strengthened and enlarged the IT companies’ network of trusted flaggers throughout Europe.
The increased cooperation with civil society organisations has led to a higher quality of notifications, more effective handling times and better results in terms of reactions to the notifications.
The Commission will continue to monitor the implementation of the Code of Conduct with the help of civil society organisations. Improvements are expected from the IT companies in particular on the transparency of the criteria for analysing flagged content and on feedback to users.
The Commission will take the results of this evaluation into account as part of the work announced in its mid-term review on the implementation of the Digital Single Market Strategy. The Commission will also continue its work to promote more efficient cooperation between the IT companies and national authorities.
The Framework Decision on combating racism and xenophobia criminalises the public incitement to violence or hatred directed against a group of persons, or a member of such a group, defined by reference to race, colour, religion, descent or national or ethnic origin. Hate speech as defined in this Framework Decision is a criminal offence also when it occurs in the online world.
A recent European survey showed that 75% of those following or participating in debates online had come across episodes of abuse, threats or hate speech. Almost half of these respondents said that this deterred them from engaging in online discussions.
The EU, its Member States, together with social media companies and other platforms, all share a collective responsibility to promote and facilitate freedom of expression throughout the online world. At the same time, all of these actors have a responsibility to ensure that the internet does not become a free haven for violence and hatred.
To respond to the growing problem of illegal hate speech in the online world, the European Commission and four major IT companies (Facebook, Microsoft, Twitter and YouTube) presented the Code of Conduct on countering illegal hate speech online on 31 May 2016. On 7 December 2016, the Commission presented the results of a first monitoring exercise to evaluate the implementation of this Code of Conduct.
The mid-term review on the implementation of the Digital Single Market Strategy, issued on 10 May 2017, confirmed the need to continue working towards minimum procedural requirements for the 'notice and action' procedures of online intermediaries, including as concerns quality criteria for notices, counter-notice procedures, reporting obligations, third-party consultation mechanisms and dispute resolution systems. In the same vein, the Commission's proposal for a revision of the Audiovisual Media Services Directive contains strong provisions to oblige platforms to put in place a flagging system for audiovisual material containing hate speech online.
The Commission has set up several dialogues with online platforms within the Digital Single Market (e.g. EU Internet Forum, Code of Conduct on illegal online hate speech, and Memorandum of Understanding on the Sale of Counterfeit Goods over the Internet) and plans to coordinate these in a more efficient way to ensure the best possible results.
These IT companies are also members of the "Alliance to better protect minors online", a multi-stakeholder platform facilitated by the European Commission to provide a better and safer digital environment and to tackle harmful content and behaviour.
These efforts, initiated by the Commission, also contribute to the action of G7 leaders, who have recently committed to supporting industry efforts and increasing engagement with civil society to combat online extremism.