Popular short-form video platform TikTok has removed more than 340,000 videos in the US for breaking its rules against misinformation about the 2020 presidential election, alongside tens of thousands of further removals for misinformation about the Covid-19 pandemic.
According to a report in Engadget on Wednesday, details of the takedowns were released as part of the company's latest transparency report. The company removed 347,225 videos for sharing election misinformation or manipulated media. An additional 441,000 clips were removed from the app's recommendation feed because their content was "unsubstantiated," the report said.
During the same period, TikTok took down 51,505 videos for sharing misinformation about Covid-19. In its report, TikTok noted that 87 per cent of these clips were removed within 24 hours of being posted and that 71 per cent had "zero views" at the time they were removed.
The new stats come after TikTok tightened its policies around misinformation ahead of the election.
In the lead-up to the 2020 election, the company introduced new rules barring deepfakes and expanded its work with fact-checking organisations to debunk false claims.
The app also added in-app notices to direct users to credible information. In its report, TikTok said it was well prepared for the election, and that much of the misinformation came from sources within the US.
The company also noted that misinformation and disinformation represent only a fraction of the total content TikTok removes.