The video-sharing app TikTok is taking steps to curb misinformation on its platform ahead of the 2020 US election in November, according to a statement released on Wednesday.
This announcement comes nearly a week after President Trump told reporters on Air Force One that he is looking to ban the app in the US.
As part of its new initiative to combat misinformation, TikTok is updating its policies to curb the spread of misleading content and will now prohibit doctored videos and deepfakes on the platform.
The app will also be expanding its partnerships with fact-checking websites like Lead Stories and PolitiFact to screen misinformation related to the upcoming election.
TikTok has previously worked with these fact-checking partners to debunk misinformation about climate change and COVID-19.
Furthermore, the content-sharing app has announced plans to work with the Department of Homeland Security’s Countering Foreign Influence Task Force to help guard against foreign influence and election interference.
In addition to moderating any new content with these updated policies in mind, TikTok will be removing any previously existing videos that violate these rules.
“Misinformation, disinformation, and threats to civic engagement are challenges no platform can ignore,” Vanessa Pappas, the general manager of TikTok’s US operations, said in the statement on Wednesday.
“By working together as an industry with experts and civil society organizations, we can better protect the civic processes that are so essential to our users.”
These new policy changes come just days after Microsoft confirmed it is in talks to purchase TikTok’s US operations from parent company ByteDance, with a deadline of September 15. The deal could be worth as much as $30 billion.