By Open Rights Group: Theresa May and Emmanuel Macron’s plans to make Internet companies liable for ‘extremist’ content on their platforms are fraught with challenges. They entail automated censorship, risking the removal of unobjectionable content and harming everyone’s right to free expression.
The Government quietly announced on Tuesday 13 June that Theresa May and the French President, Emmanuel Macron, will talk today about making tech companies legally liable if they “fail to remove unacceptable content”. The UK and France would work with tech companies “to develop tools to identify and remove harmful material automatically”.
No one would deny that extremists use mainstream Internet platforms to share content that incites people to hate others and, in some cases, to commit violent acts. Tech companies may well have a role in helping the authorities challenge such propaganda, but attempting to close it down is not as straightforward or consequence-free as politicians would like us to believe.
First things first: how would this work? It almost certainly entails using algorithms and machine learning to censor content. In an automated takedown process of this sort, the companies instruct the algorithms how to behave and set the thresholds at which they act. Given the economic and reputational incentives for companies to avoid fines, it seems highly likely that they will opt for hair-trigger, error-prone algorithms that end up removing unobjectionable content.
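To make that trade-off concrete, here is a minimal sketch of how a threshold-based takedown filter behaves. The classifier scores, posts and thresholds are all invented for illustration, not drawn from any real system; but the underlying dilemma is the same one the companies would face. The lower a company sets its threshold to avoid fines, the more legitimate speech gets swept up alongside the propaganda.

```python
# Hypothetical sketch of a threshold-based takedown filter.
# Scores and posts are invented; real systems are far more
# complex but face the same trade-off.

# (score, description): a model's estimated probability that
# a post is "extremist".
posts = [
    (0.95, "recruitment video reposted by a propagandist"),
    (0.70, "news report embedding the same video"),
    (0.55, "academic analysis quoting extremist slogans"),
    (0.10, "holiday photos"),
]

def takedowns(threshold):
    """Return the posts an automated filter would remove."""
    return [desc for score, desc in posts if score >= threshold]

# A cautious threshold removes only the clearest case...
print(takedowns(0.9))  # ['recruitment video reposted by a propagandist']

# ...but a fine-avoiding, hair-trigger threshold also removes the
# journalism and the research along with the propaganda.
print(takedowns(0.5))  # first three posts, including lawful speech
```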
May and Macron’s proposal is to identify and remove new extremist content, but it is unclear whose rules they want Internet companies to enforce. The Facebook Files showed that Facebook’s own policies delete a lot of legal but potentially objectionable content, often in a seemingly arbitrary way. Alternatively, if the companies are to enforce UK and French laws on hate speech and so on, that will probably prove a lot less censorious than May and Macron are hoping for.
The history of automated content takedown suggests that removing extremist content without also removing harmless content will be an enormous challenge. The mistakes made by YouTube’s Content ID system, which automates takedowns of allegedly copyright-infringing content, are well documented.
Context is king when it comes to judging content. Will these automated systems really be able to tell the difference between posts that criticise terrorism while using video of terrorists and posts promoting terrorism that use the same video?
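The big platforms already automate part of this job by fingerprinting known extremist media and flagging re-uploads that match. A minimal sketch along those lines (the videos, captions and blocklist here are hypothetical, and a plain SHA-256 stands in for the more robust perceptual hashes real systems use) shows why matching alone is blind to context: the condemnation and the glorification carry the same file, so both are flagged.

```python
import hashlib

# Hypothetical sketch of hash-based matching. Real systems use
# perceptual hashes robust to re-encoding; plain SHA-256 keeps
# the illustration simple.

def fingerprint(video_bytes: bytes) -> str:
    return hashlib.sha256(video_bytes).hexdigest()

# Invented stand-in for a known recruitment video's raw bytes.
terrorist_video = b"...raw bytes of a known recruitment video..."
blocklist = {fingerprint(terrorist_video)}

uploads = [
    ("propagandist glorifying the attack", terrorist_video),
    ("broadcaster's report condemning the attack", terrorist_video),
]

for caption, video in uploads:
    if fingerprint(video) in blocklist:
        # The match is on the file alone; the caption, i.e. the
        # context, plays no part in the decision.
        print(f"REMOVED: {caption}")
```

Both uploads are removed, because the system sees only the file, not what is being said about it.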
Some will say this is a small price to pay if it stops the spread of extremist propaganda, but it will create a framework for censorship that can be turned against anything perceived as harmful. It might also simply push extremists onto other platforms to promote their material. But will they actually be any less able to communicate?
Questions abound. What incentives will the companies have to get it right? Will there be any safeguards? If so, how transparent will those safeguards be? Will the companies be fined for censoring legal content as well as failing to censor illegal content?
And what about the global picture? Internet companies like Facebook, Twitter and YouTube have a global reach. Will they be expected to create a system that can be used by any national government, even those with poor human rights records? It’s unclear whether May and Macron have thought through whether they are happy for Internet platforms to become an arm of every state they operate in.
All of this, of course, comes as Theresa May enters a new Parliament with a very fragile majority. She will be careful to bring forward only legislation she is confident of getting through, and opposition in Parliament to these plans is far from guaranteed: in April, the Labour MP Yvette Cooper recommended fines for tech companies in a report from the Home Affairs Select Committee, which she chaired.
ORG will challenge these proposals both inside and outside Parliament. If you’d like to support our work you can do so by joining ORG. It’s £6 a month and we’ll send you a copy of our fantastic new book when you join.
Do we really want private companies determining what we can and can’t see?