WhatsApp’s promise of end-to-end encrypted private messages appears to have been false, an investigation has revealed.
When Facebook bought the popular messaging app WhatsApp for $19 billion in 2014, both companies assured users that neither could access their data, but ProPublica journalists found those claims to be untrue.
Facebook has since hired 1,000 workers to examine millions of messages on WhatsApp, which has two billion users worldwide, and has also shared some of those messages with law enforcement and the U.S. Department of Justice to help put people in jail, according to ProPublica.
In the report, ProPublica found that Facebook had hired contractors in Austin, Texas, Dublin, Ireland and Singapore to examine millions of pieces of user content.
“These hourly workers use special Facebook software to sift through streams of private messages, images and videos that have been reported by WhatsApp users as improper and then screened by the company’s artificial intelligence systems,” the report detailed.
“These contractors pass judgment on whatever flashes on their screen (claims of everything from fraud or spam to child pornography and potential terrorist plotting) typically in less than a minute.”
Will Cathcart, head of WhatsApp, said he saw no problem with the practice.
“I think we can have security and safety for people through end-to-end encryption and work with the police to solve crimes,” Cathcart said.
WhatsApp had helped prosecutors build a high-profile case against Natalie Edwards, a U.S. Treasury Department employee who allegedly leaked confidential documents to BuzzFeed about how dirty money flows through U.S. banks, according to ProPublica.
Edwards was sentenced to six months in prison after pleading guilty to a conspiracy charge. She began serving her sentence in June.
The report also found more than a dozen cases in which WhatsApp data was used to imprison other people since 2017.


WhatsApp chief Will Cathcart, right, said he sees no problem with the platform’s data-sharing with law enforcement. Facebook CEO Mark Zuckerberg, left, led the $19 billion WhatsApp acquisition in 2014 and said users’ data would remain private.

WhatsApp, which has more than 2 billion users worldwide, was supposed to keep messages private and out of Facebook’s hands, unlike its sister company Instagram

Pictured: Facebook’s headquarters in Dublin, Ireland. The tech giant has employees in Dublin and other major cities analyzing WhatsApp data
WhatsApp communications director Carl Woog told ProPublica that Facebook had hired the workers to identify and remove the “worst” abusers from the platform, but said he agrees with Cathcart and does not consider the work to be content moderation.
“The decisions we make about how we build our app focus on the privacy of our users, while maintaining a high degree of reliability and preventing abuse,” WhatsApp said in a statement.
WhatsApp users seemed unfazed by the news, tweeting that it was hardly surprising that a large technology company owned by Facebook was monitoring users’ messages.
One user wrote, “I thought we all knew what Facebook was doing?”
“None of these services are truly private. Don’t believe it. In the end, everyone abuses their power,” wrote another Twitter user.


People tweeted that they weren’t surprised that WhatsApp shared data

Will Cathcart, left, discussed his platform with the Australian Strategic Policy Institute in July. He discussed how the platform flags possible images of child exploitation, but not that the data could be sifted through by contractors hired by Facebook
Facebook claims that messages are only examined when flagged as inappropriate content, and that personal calls and other messages remain out of the company’s reach.
An unnamed whistleblower filed a complaint last year with the U.S. Securities and Exchange Commission alleging that WhatsApp’s claims of protecting user privacy and data were false.
The SEC has not acknowledged the complaint and has taken no action on the matter.
While Facebook has refrained from detailing how it moderates WhatsApp messages, it openly publishes the actions it takes on its main service and on Instagram.
The company has said it employs about 15,000 moderators to filter the millions of posts on both platforms.
From April to June alone, the company removed more than 32 million posts depicting adult nudity and sexual activity on Facebook. Over the same period, it removed 28 million posts depicting child abuse and exploitation.
It also took action against more than 1.8 million such posts on Instagram.
When law enforcement requests user data, Facebook provides “at least some data” 95 percent of the time.
As for WhatsApp, Cathcart said it reported approximately 400,000 cases of possible child exploitation imagery to the National Center for Missing and Exploited Children in 2020.
During an interview with the Australian Strategic Policy Institute, Cathcart attributed those reports to the platform’s AI systems and to users flagging the content, but made no mention of the private contractors who would have examined the posts.