By Julia Fioretti and Lily Cusack
(Reuters) – Internet groups such as Facebook <FB.O>, Google’s YouTube <GOOGL.O> and Twitter <TWTR.N> need to do more to stem the proliferation of extremist content on their platforms, the European Commission said after a meeting on Wednesday.
Social media companies have significantly boosted their resources to take down violent and extremist content as soon as possible in response to growing political pressure from European governments, particularly those hit by militant attacks in recent years.
But Julian King, EU security commissioner, said that while a lot of progress had been made, additional efforts were needed.
“We are not there yet. We are two years down the road of this journey: to reach our final destination we now need to speed up our work,” King said in his closing speech at the third meeting of the EU Internet Forum, which brings together the Commission, EU member states, law enforcement and technology companies.
The EU has said it will come forward with legislation next year if it is not satisfied with progress made by tech companies in removing extremist content, while a German online hate speech law comes into effect on Jan. 1.
The Commission is keen to avoid a patchwork of national laws on the issue, and favors a self-regulatory approach.
Over the summer, Microsoft <MSFT.O>, Facebook, Twitter and YouTube formed a global working group to combine their efforts in removing extremist content from their platforms, and last year created a database of known “terrorist” images and videos which now contains more than 40,000 hashes, or digital signatures.
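In broad terms, such a shared database lets each platform check new uploads against digital signatures of content already flagged elsewhere. The sketch below is purely illustrative and assumes simple cryptographic hashes and hypothetical data; the companies' actual system is not public and reportedly uses more robust fingerprinting that can match altered copies.

```python
import hashlib

# Hypothetical shared database of hashes ("digital signatures") of
# previously flagged images and videos, contributed by member platforms.
known_hashes = {
    hashlib.sha256(b"previously flagged video bytes").hexdigest(),
}

def is_known_content(upload: bytes) -> bool:
    """Return True if an upload's hash matches the shared database."""
    return hashlib.sha256(upload).hexdigest() in known_hashes

# An exact copy of flagged content matches; novel content does not.
print(is_known_content(b"previously flagged video bytes"))  # True
print(is_known_content(b"a new, unseen upload"))            # False
```

A cryptographic hash like SHA-256 only catches byte-for-byte copies, which is why production systems favor perceptual hashing that tolerates re-encoding and cropping.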
On Wednesday, the members of the Internet Forum presented a report with updates on progress made by companies and Europol in detecting and removing extremist content online.
The report, reviewed by Reuters, encourages the companies to remove “new and historical terrorist content, and to develop solutions to identify and remove terrorist content within 1-2 hours of upload, to the extent it is technically feasible, without compromising human rights and fundamental freedoms.”
The Commission wants companies to use automatic detection technologies more and act faster on referrals from member states and Europol.
“It is feasible to reduce the time it takes to remove content to a few hours,” said Dimitris Avramopoulos, EU home affairs commissioner.
“There is a lot of room for improvement, for this cooperation to produce even better results, starting with the reporting from the companies, which must become more regular and more transparent.”
Facebook last week reported progress in removing extremist content from its platform, saying it was removing 99 percent of content related to the militant groups Islamic State and al Qaeda before it was flagged by users, although it did not say how many items it was removing.
The Counter Extremism Project (CEP), a non-profit organization, said efforts by the companies were encouraging but did not make a strong case for continued self-regulation.
“Instead, what the EU Internet Forum should aim to deliver are concrete, industry-wide policies on blocking or rapidly removing illegal content and consistent enforcement of those policies,” said David Ibsen, Executive Director of the CEP.
(Reporting by Julia Fioretti; Editing by Mark Potter)