Defying the rules, anti-vaccine accounts thrive on social media

With vaccination against COVID-19 in full swing, social platforms such as Facebook, Instagram and Twitter say they have stepped up their fight against misinformation that aims to undermine trust in vaccines. But problems abound.

For years, the same platforms have allowed anti-vaccination propaganda to flourish, making it difficult to stamp out such sentiments now. And their efforts to weed out other types of COVID-19 misinformation, often relying on fact-checks, informational labels and other restrained measures, have been woefully slow.

Twitter, for example, announced this month that it will remove dangerous falsehoods about vaccines, much as it does with other COVID-related conspiracy theories and misinformation. But since April 2020 it has removed a grand total of 8,400 tweets spreading COVID-related misinformation, a tiny fraction of the avalanche of pandemic-related falsehoods posted daily by popular users with millions of followers, critics say.

“Every day they fail to act, lives are lost,” said Imran Ahmed, chief executive of the Center for Countering Digital Hate, a watchdog group. In December, the nonprofit found that 59 million accounts across all social platforms follow anti-vaccine propagandists, many of whom are hugely popular superspreaders of disinformation.

Yet efforts to crack down on vaccine misinformation draw cries of censorship and prompt some posters to adopt cunning tactics to avoid the ax.

“It’s a difficult situation because we’ve let it go for so long,” said Jeanine Guidry, an assistant professor at Virginia Commonwealth University who studies social media and health information. “People who use social media have been able to share what they want for almost a decade.”

The Associated Press identified more than a dozen Facebook pages and Instagram accounts, collectively boasting millions of followers, that have made false claims about the COVID-19 vaccine or discouraged people from getting it. Some of those pages have been around for years.

Of the more than 15 pages identified by NewsGuard, a technology company that analyzes the credibility of websites, about half are still active on Facebook, the AP found.

One such page, The Truth About Cancer, has more than a million Facebook followers after years of posting unfounded suggestions that vaccines could cause autism or damage children’s brains. NewsGuard identified the page in November as a “super-spreader” of COVID-19 vaccine misinformation.

Recently, the page stopped posting about vaccines and the coronavirus. It now directs people to sign up for its newsletter and visit its website as a way to avoid alleged “censorship.”

Facebook said it is taking “aggressive measures to combat misinformation in our apps by removing millions of pieces of COVID-19 and vaccine content on Facebook and Instagram during the pandemic.”

“Research shows that one of the best ways to promote vaccine acceptance is by showing people accurate and reliable information, which is why we have connected 2 billion people to health authority resources and launched a global information campaign,” the company said in a statement.

Facebook also banned ads that discourage vaccination and said it has added warning labels to more than 167 million additional pieces of COVID-19 content, relying on its network of fact-checking partners. (The Associated Press is one of Facebook’s fact-checking partners.)

YouTube, which has generally avoided the same level of scrutiny as its social media peers despite being a source of misinformation, said it has removed more than 30,000 videos since October, when it began banning false claims about COVID-19 vaccines. Since February 2020, it has removed more than 800,000 videos related to dangerous or misleading coronavirus information, YouTube spokeswoman Elena Hernandez said.

Prior to the pandemic, however, social media platforms had done little to stamp out misinformation, said Andy Pattison, manager of digital solutions at the World Health Organization. In 2019, when a measles outbreak ravaged the Pacific Northwest and left dozens dead in American Samoa, Pattison pleaded with large technology companies to take a harder look at stricter rules on vaccine misinformation, which he feared could make the outbreak worse.

It wasn’t until COVID-19 struck that many of those tech companies began to listen, he said. Pattison now meets weekly with Facebook, Twitter and YouTube to discuss trends on their platforms and policies to consider.

“When it comes to vaccine misinformation, the most frustrating thing is that it has been around for years,” Pattison said.

The targets of these crackdowns often adapt quickly. Some accounts use intentionally misspelled words, such as “vackseen” or “v@x,” to avoid bans. (Social platforms say they are wise to this.) Other pages use more subtle messaging, images or memes to suggest that vaccines are unsafe or even deadly.

“When you die after the vaccine, you die of everything but the vaccine,” read a meme on an Instagram account with more than 65,000 followers. The post suggested that the government is concealing deaths caused by the COVID-19 vaccine.

“It’s a very fine line between free speech and the erosion of science,” Pattison said. Purveyors of disinformation, he said, “learn the rules and dance around them all the time.”

Twitter said it is continually reviewing its rules in the context of COVID-19 and amending them in line with expert guidance. Earlier this month, it added a strike policy that threatens repeat spreaders of coronavirus and vaccine disinformation with bans.

But blatantly false COVID-19 information keeps appearing. Earlier this month, several articles circulating online claimed that more elderly Israelis who received the Pfizer vaccine were killed by the shot than died of COVID-19 itself. One such article, from an anti-vaccination website, was shared nearly 12,000 times on Facebook, helping drive a spike of nearly 40,000 mentions of “vaccine deaths” across social media and the internet earlier this month, according to an analysis by media intelligence firm Zignal Labs.

Medical experts point to a real-world study showing a strong correlation between vaccination and decreases in severe COVID-19 cases in Israel. The nation’s health ministry said in a statement Thursday that the COVID-19 vaccine has “profoundly” reduced the rate of deaths and hospitalizations.

As the U.S. vaccine supply continues to grow, vaccination efforts will soon shift from rationing limited doses for the most vulnerable populations to getting as many shots into as many arms as possible. That means reaching the roughly one-third of Americans who say they definitely or probably won’t get vaccinated, according to a February AP-NORC poll.

“Vaccine hesitancy and misinformation could be a major barrier to vaccinating enough of the population to end the crisis,” said Lisa Fazio, a psychology professor at Vanderbilt University.

Some health officials and academics generally believe the social platforms’ efforts are worthwhile, at least at the margins. What is not clear is how much of a dent they can make in the problem.

“If anyone really believes that the COVID vaccine is harmful and feels the responsibility to share it with friends and family … they will find a way,” Guidry said.

And there are those who still blame the platforms’ business models, saying they encouraged the spread of false information about the coronavirus because it drives engagement and advertising profit.

When the Center for Countering Digital Hate recently studied the crossover between different types of misinformation and hate speech, it found that Instagram’s algorithm tends to cross-pollinate misinformation. Instagram could feed an account that followed a QAnon conspiracy page with posts from, say, white nationalists or anti-vaxxers.

“They continue to allow things to deteriorate because of the perfect mix of misinformation and disinformation on their platforms,” said Ahmed, the center’s chief executive.
