YouTube waited until a month after the election to act on election fraud claims

Google-owned YouTube announced on Wednesday that it would begin removing content alleging that widespread fraud or errors changed the outcome of the 2020 US presidential election, citing Tuesday's safe harbor deadline, the date after which state election results can no longer be effectively challenged. YouTube said enough states have certified their election results to determine a president-elect. News outlets around the world have projected that Joe Biden will be the next president.

As an example of banned content, YouTube cited videos claiming that a presidential candidate won the election because of widespread software glitches or errors in counting votes. It said it would begin enforcing the policy on Wednesday and would "ramp up" those efforts in the coming weeks. Videos featuring news coverage or commentary may be allowed to stay up if they provide sufficient context. Videos uploaded before Wednesday will remain on the platform even if they now violate YouTube's rules; instead, they will display an information panel noting that election results have been certified.

When asked why YouTube did not enact these policies before or in the immediate aftermath of the election, a YouTube spokesperson on Tuesday pointed to the safe harbor deadline as its rationale. One widely viewed video, for example, claimed that President Trump had won another four years in office and accused Democrats of "throwing away Republican votes, harvesting fake ballots and delaying the results to create chaos." At the time, YouTube said the video did not violate its rules and would not be removed.

YouTube's election-related policies have long banned content that misleads people about where and how to vote. The company surfaced information panels above search results related to the election and below videos discussing it. The panels linked to Google's election results feature and to the "Rumor Control" page of the Cybersecurity and Infrastructure Security Agency, which debunks misinformation about election integrity. YouTube also acted to promote content from authoritative news sources in search results.

During and after the election, Twitter (TWTR) labeled and restricted the sharing of tweets containing misinformation, including many of President Trump's. Facebook (FB) initially only labeled posts, but later added temporary measures to limit the reach of inaccurate posts and added friction to sharing them.

YouTube said that since September it has removed more than 8,000 channels and "thousands" of harmful and misleading videos about the election for violating its rules. More than 77% of those videos were removed before reaching 100 views.

Separately, Google plans to lift its election-related ban on political advertising on December 10, the company told advertisers on Wednesday. In an email to advertisers obtained by CNN, the company said it would lift its "sensitive event" designation for the US election and restore its usual policies around election advertising. The ban, announced before the election, was expected to last at least a week after Election Day but ended up lasting about a month.

CNN Business' Brian Fung contributed reporting.
