As Covid-19 ravages countries like India and Brazil, Facebook said it has removed more than 18 million pieces of content from its main platform and Instagram globally for violating its policies on Covid-related misinformation and harm.
The social network removed this content between the start of the pandemic and April this year. "We're also working to increase vaccine acceptance and combat vaccine misinformation," the company said in its Q1 Community Standards Enforcement Report.
On Wednesday, Facebook also expanded its Covid-19 Announcement tool -- which lets the health departments of Indian states and union territories share essential Coronavirus-related updates -- across India.
According to Guy Rosen, VP of Integrity at Facebook, prevalence is one of the most useful metrics for understanding how often people see harmful content on its platforms. He noted that the prevalence of hate speech on Facebook continues to decrease.
"In Q1, it was 0.05-0.06 per cent, or 5 to 6 views per 10,000 views. We evaluate the effectiveness of our enforcement by trying to keep the prevalence of hate speech on our platform as low as possible, while minimising mistakes in the content that we remove," Rosen said.
In Q1, Facebook took action on 8.8 million pieces of bullying and harassment content, up from 6.3 million in Q4 2020. It also took action on 9.8 million pieces of organised hate content, up from 6.4 million in Q4 2020, and on 25.2 million pieces of hate speech content, down from 26.9 million in Q4 2020.
On Instagram in Q1, the company took action on 324,500 pieces of organised hate content, up from 308,000 in Q4 2020, and on 6.3 million pieces of hate speech content, down from 6.6 million in Q4 2020.