According to a new report, WhatsApp, Facebook’s encrypted messaging service, is not as private as it claims.
The popular chat app, which promotes its privacy features, claims that its parent company, Facebook, cannot read messages sent between users. But a comprehensive ProPublica report published Tuesday said Facebook pays more than 1,000 contract workers around the world to read and moderate supposedly private, encrypted WhatsApp messages.
Moreover, according to the report, the company shares certain private data with law enforcement agencies, such as the U.S. Department of Justice.
The revelation comes after Facebook chief Mark Zuckerberg has repeatedly said the company cannot see WhatsApp messages.
“We don’t see any of the content in WhatsApp,” the CEO said during testimony before the U.S. Senate in 2018.
Privacy is promoted even when new users sign up for the service, with the app emphasizing that “your messages and calls are secured so only you and the person you’re communicating with can read or listen to them, and nobody in between, not even WhatsApp.”
“Those assurances are not true,” the ProPublica report said. “WhatsApp has more than 1,000 contract workers filling floors of office buildings in Austin, Texas, Dublin and Singapore, where they examine millions of pieces of users’ content.”

Facebook acknowledged that these contractors spend their days reviewing content that WhatsApp users and the service’s own algorithms flag, which often includes everything from fraud and child pornography to possible terrorist plots.
A WhatsApp spokeswoman told The Post: “WhatsApp provides a way for people to report spam or abuse, which includes sharing the most recent messages in a chat. This feature is important for preventing the worst abuse on the internet. We strongly disagree with the notion that accepting reports a user chooses to send us is incompatible with end-to-end encryption.”
According to WhatsApp’s FAQ page, when a user reports abuse, WhatsApp moderators receive “the most recent messages sent to you by the reported user or group.” ProPublica explained that because WhatsApp messages are encrypted, artificial intelligence systems “cannot automatically scan all chats, images and videos, as they do on Facebook and Instagram.”
Instead, the report revealed that WhatsApp moderators gain access to private content when users press the app’s “report” button, flagging a message that allegedly violates the platform’s terms of service.

That action forwards five messages to WhatsApp in unscrambled form: the allegedly offending one, plus the previous four in the exchange, including any images or videos, according to former WhatsApp engineers and moderators who spoke to ProPublica.
Aside from messages, workers see other unencrypted information, such as the names and profile images of a user’s WhatsApp groups, as well as their phone number, profile photo, status message, phone battery level, language, and linked Facebook and Instagram accounts.
Each reviewer handles more than 600 complaints a day, leaving them less than a minute per case. Reviewers can take no action, place the user on a “watch” list for greater scrutiny, or ban the account.
ProPublica said WhatsApp shares unencrypted metadata, records that can reveal a great deal about a user’s online activity, with law enforcement agencies such as the Department of Justice.

The outlet reported that WhatsApp user data helped prosecutors build a high-profile case against a Treasury Department employee who leaked confidential documents to BuzzFeed News exposing how dirty money allegedly flows through U.S. banks.
Like other social networking platforms, WhatsApp is caught between users who expect privacy and law enforcement agencies that demand these platforms hand over information to help fight online crime and abuse.
WhatsApp chief Will Cathcart said in a recent interview that there is no conflict between the two.
“I think we can have security and safety for people through end-to-end encryption and work with law enforcement to solve crimes,” Cathcart said in a YouTube interview with an Australian think tank in July.
But the privacy question is not that simple. Since Facebook bought WhatsApp in 2014 for $19 billion, Zuckerberg has repeatedly assured users that their data would be kept private. Since then, however, the company has repeatedly shifted course on privacy and on monetizing the data it collects from the free messaging app’s users.
In 2016, WhatsApp announced that it would start sharing user data with Facebook, a move that would allow it to generate revenue. The plan included sharing information such as users’ phone numbers, profile photos, status messages, and IP addresses so that Facebook could offer better friend suggestions and more relevant ads, among other things.
These moves put Facebook on regulators’ radar, and in May 2017, European Union antitrust regulators fined the company $122 million for falsely claiming three years earlier that it would be impossible to link user information between WhatsApp and Facebook’s family of apps. Facebook said its false statements in 2014 were unintentional, but it did not contest the fine.

Facebook has continued to draw scrutiny over security and privacy issues. In July 2019, that culminated in a $5 billion fine from the Federal Trade Commission for violating a previous agreement to protect users’ privacy.
The fine was nearly 20 times greater than any previous privacy-related penalty, the FTC said at the time, and Facebook’s violations included “deceiving users about their ability to control the privacy of their personal information.”
Regardless, WhatsApp keeps trying to find ways to make money while maintaining privacy. In 2019, the app announced that it would run ads inside the app, but those controversial plans were scrapped shortly before the ads were due to launch.
Earlier this year, WhatsApp announced a change to its privacy policy, giving users roughly a month to accept the policy or stop using the app. The policy would allow users to message businesses directly on the platform, and it required users to agree to those conversations being stored on Facebook’s servers, which led many users to believe Facebook would have access to their private chats.
The concerns sparked a massive backlash, prompting tens of millions of users to switch to rival apps such as Signal and Telegram.
WhatsApp pushed ahead with the change in February but assured users that their messages would remain private.
“We’ve seen some of our competitors try to get away with claiming they can’t see people’s messages; if an app doesn’t offer end-to-end encryption by default, it means they can read your messages,” WhatsApp said in a blog post. “Other apps say they’re better because they know even less information than WhatsApp. We believe people want apps that are both reliable and safe, even if that requires WhatsApp to have some limited data.”