YouTube will no longer remove videos falsely claiming the 2020 U.S. presidential election was stolen, reversing a policy put in place in the contentious weeks following the 2020 vote.
The Google-owned video platform said in a blog post that it has taken down "tens of thousands" of videos questioning the integrity of past U.S. presidential elections since it created the policy in December 2020.
But two and a half years later, the company said it "will stop removing content that advances false claims that widespread fraud, errors, or glitches occurred in the 2020 and other past U.S. Presidential elections." It described the decision as "carefully deliberated."
"In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm," YouTube said.
The platform will continue to ban videos misleading voters about when, where, and how to vote, claims that discourage voting, and "content that encourages others to interfere with democratic processes."
YouTube's reversal of its prohibition on false claims about U.S. elections comes as the 2024 campaign is already underway, and former president and current Republican candidate Donald Trump continues to claim, without evidence, that he lost to Joe Biden in 2020 because of widespread fraud.
"YouTube was one of the last major social media platforms to keep in place a policy attempting to curb 2020 election misinformation. Now, it's decided to take the easy way out by giving people like Donald Trump and his enablers free rein to continue to lie without consequence about the 2020 elections," said Julie Millican, vice president of liberal watchdog Media Matters for America. "YouTube and the other platforms that preceded it in weakening their election misinformation policies, like Facebook, have made it clear that one attempted insurrection wasn't enough. They're setting the stage for an encore."
YouTube's policy went further than those of Facebook and Twitter, which said they would label but not take down false election claims.
Twitter stopped labeling false claims about the 2020 election early last year, saying it had been more than a year since the election was certified and Biden took office.
Facebook has pulled back on its use of labeling, according to a 2022 Washington Post analysis of unfounded election fraud claims on the platform.
Copyright 2023 NPR. To see more, visit https://www.npr.org.