Facebook sets up election operations centre to monitor platform abuse
Social media giant Facebook has set up an election operations centre at its EU headquarters in Dublin with the aim of keeping a “vigilant watch” for platform abuse.
A forty-strong team of data scientists, engineers and cybersecurity officers will staff the centre and monitor disinformation over the course of the 23-26 May European elections.
All 24 of the European Union’s official languages are represented and staff will be analysing sources around the clock. Facebook has also partnered with 21 fact-checking organisations.
The centre aims to tackle misinformation, foreign interference and coordinated “inauthentic behaviour.”
Similar centres have previously been set up in the United States, Brazil and India in an effort to prevent wide-scale election-influencing campaigns.
Facebook has taken the action amid warnings that foreign groups will use social media to manipulate public opinion.
According to a report by SafeGuard Cyber, a US-based cybersecurity firm, more than half of Europeans may have seen some form of disinformation on social networks ahead of the EU-wide election, with some of the messages linked to Russia or Russian interests.
Tech companies such as Google and Facebook have been told they are not doing enough to stop false reports and that voters may be subject to manipulation.
In the face of growing concerns, Facebook says it is now doing more to regulate content and recently banned the conspiracy theorist Alex Jones and several other divisive figures from its platforms.
The team in Dublin is alerted to questionable content either by an automated detection system or when users flag a piece of content in large numbers.
With specialists from all 28 Member States, the centre consists of teams concentrated in a range of areas including: threat intelligence, data science, software engineering, research, operations, legal, policy and communications.
They will provide language support and country experts will consult on colloquialisms and culture.
The teams are supported by an additional 500 people who work for Facebook full-time on election issues and a further 30,000 who deal with safety and security issues.
Elsewhere, a report by the European Economic and Social Committee (EESC) says “lack of critical media literacy and the spread of mis/disinformation about the EU represent a threat to the trust placed by citizens in the European project and its institutions.”
"Growing populism, radicalism, disinformation and fake news - all of them are aimed at weakening the foundations of the EU. If we really want to see how to solve these problems, we need to go to the roots: to education," said Tatjana Babrauskienė, rapporteur for the opinion.
According to a Eurobarometer poll on fake news and online disinformation, more than a third of respondents (37 percent) say they come across fake news every day or almost every day, and a further 31 percent say that this happens at least once a week.
In every Member State, at least half of respondents say they come across fake news at least once a week.
When asked about the consequences of mis/disinformation, 85 percent of respondents perceive fake news as a problem in their country and 83 percent perceive it as a problem for democracy in general.
Meanwhile, UK ministers will this week announce plans for a crackdown on ‘dark ads’ on social media to prevent interference in elections.
It is thought that all online political adverts will be required to explicitly show who made them.
Commenting on the move, former Scottish MEP Catherine Stihler, now the chief executive of the Open Knowledge Foundation, said, “These reforms must be robust to rebuild voters’ trust in social media and they must deliver full transparency.”
“While the digital world has transformed our lives in a positive way, it has also allowed the spread of disinformation and the dismissal of basic facts. It is imperative that we do not allow fake news to blight this month’s European Parliamentary elections or any future elections.”
“Digital advertising is a key influence on our political world and countries across Europe and the world must recognise the vital need for greater transparency,” Stihler added.