Researchers studying Facebook misinformation say they were deplatformed

The study found that from August 2020 to January 2021, misinformation got six times more clicks on Facebook than posts containing factual news. Misinformation also accounted for the vast majority of interactions with far-right publishers (68%), compared with 36% for far-left publishers.

Facebook blocked the researchers' personal accounts and their access to the site last month. The company said its decision to deplatform the researchers related to a separate study on political ads that involved a browser extension allowing users to anonymously share the ads they saw on Facebook with researchers.

“NYU’s Ad Observatory project studied political ads using unauthorized means to access and collect data from Facebook, in violation of our Terms of Service. We took these actions to stop unauthorized scraping and protect people’s privacy in line with our privacy program under the FTC order,” Mike Clark, director of product management, said in a statement.

But Laura Edelson, principal investigator of the Cybersecurity for Democracy project at NYU, said this made no sense because the browser extension is still available for download. Meanwhile, the restrictions on their access to Facebook have hampered the group’s research into political ads and misinformation.

“We couldn’t even replicate this study ourselves right now if we tried,” Edelson said Sunday on “Reliable Sources.”

Facebook paid a $5 billion fine in a 2019 settlement with the Federal Trade Commission over privacy and data sharing.

The company could not immediately be reached for comment Sunday.

In addition, thousands of posts related to the January 6 Capitol attack disappeared from CrowdTangle, a popular research tool. Edelson said her team reported the bug to Facebook, but the platform did not appear to be aware that the posts had gone missing.

“It took them several rounds to acknowledge the full scope of it,” Edelson said. “Frankly, I’d be worried that they don’t understand what’s going on in their own systems.”

Facebook said the bug has since been fixed.

The spread of misinformation on Facebook is partisan-agnostic, meaning the platform does not favor or reward falsehoods coming from one side over the other. But the far-right and far-left media ecosystems are “fundamentally different,” Edelson said, with 40% of the media sources cited on the far right actively spreading false information. In other corners of the media ecosystem and among other partisan groups, that figure does not exceed 10%.

“This report looks at how people engage with content from Pages, which accounts for a small amount of all content on Facebook,” Facebook said in a statement.
