Facebook has built a War Room to fight election interference
Facebook said that it was planning to set up a task force comprising "hundreds of people" ahead of the 2019 general elections in India.
In line with its efforts to prevent the misuse of its platform during elections, Facebook has set up a War Room to reduce the spread of potentially harmful content.
Facebook faced flak for not doing enough to prevent the spread of misinformation by Russia-linked accounts during the 2016 US presidential election. Since then, the social networking giant has rolled out several initiatives to fight fake news and bring more transparency and accountability to its advertising.
The launch of the first War Room at its headquarters in Menlo Park, California, is part of the social network's new initiatives to fight election interference on its platform.
Although Facebook opened the doors of the War Room ahead of the general elections in Brazil and the midterm elections in the US, it revealed the details only this week.
The goal behind setting up the War Room was to bring the right subject-matter experts from across the company into one place so they could address potential problems identified by its technology in real time and respond quickly.
"The War Room has over two dozen experts from across the company - including from our threat intelligence, data science, software engineering, research, community operations and legal teams," Samidh Chakrabarti, Facebook's Director of Product Management, Civic Engagement, said in a statement on Thursday.
"These employees represent and are supported by the more than 20,000 people working on safety and security across Facebook," Chakrabarti added.
Facebook said its dashboards offer real-time monitoring of key election issues, such as efforts to prevent people from voting, increases in spam, potential foreign interference, or reports of content that violates its policies.
The War Room team also monitors news coverage and election-related activity across other social networks and traditional media in order to identify what type of content may go viral.
These preparations proved valuable during the first round of Brazil's presidential election, Facebook claimed.
The social networking giant said its technology detected a false post claiming that Brazil's Election Day had been moved from October 7 to October 8 due to national protests.
Although untrue, the message began to go viral. The team quickly detected the problem, determined that the post violated Facebook's policies, and removed it in under an hour.
"And within two hours, we'd removed other versions of the same fake news post," Chakrabarti said.
The War Room team, Facebook said, also helped quickly remove hate speech posts designed to whip up violence against people from northeast Brazil after the results of the first round were called.
"The work we are doing in the War Room builds on almost two years of hard work and significant investments, in both people and technology, to improve security on Facebook, including during elections," Chakrabarti said.
Earlier this month, Facebook said it was planning to set up a task force comprising "hundreds of people" ahead of the 2019 general elections in India.
"With the 2019 elections coming, we are pulling together a group of specialists to work together with political parties," Richard Allan, Facebook's Vice President for Global Policy Solutions, told the media in New Delhi.
Facebook has also set a goal of bringing its political ad transparency feature, now available in the US and Brazil, to India by March next year, Allan said.
With the new ad architecture in place, people will be able to see who paid for a particular political ad.