TikTok will soon start displaying warnings on videos bearing unverified information. The app will also issue a warning to users when they are about to re-share the videos.
The warning will let users know that the videos contain information that could not be verified by fact-checkers.
TikTok will display a warning on the videos in question saying, “Caution: Video flagged for unverified content.” This means that the video was reviewed by fact-checkers, but they were unable to determine whether the information was true or false.
A similar warning will be extended to viewers who want to share the flagged videos. A pop-up meant to dissuade viewers from sharing these videos will ask users if they are sure they want to share the video. This will, however, not stop users from sharing the videos to external apps.
This is the first time TikTok is publicly flagging content on its app; previously, it only quietly reduced the distribution of unverified videos. Creators will also be alerted when a warning label is added to their videos, letting them know that the video will not be widely distributed.
TikTok has not shared details regarding its fact-checking process. However, a spokesperson for the company said that the most frequently flagged topics include elections, vaccines and climate change. Videos do not need to go viral before they are flagged.
The spokesperson said that videos that violate the app’s misinformation policy are automatically removed.
TikTok joins a number of social media apps that have stepped up their fight against the spread of misinformation. Facebook warns users before they share old stories or Covid-19-related information, while Twitter warns users before they retweet potential misinformation.