How confident are you that no one sees what you write on WhatsApp?
Although Facebook said, when it imposed new terms and conditions of use in early 2021, that no one could access the messages, photos and videos you share with contacts through the app, the reality is different, according to an investigation by ProPublica.
According to the report published by the independent international journalism outlet, the company hires teams of people to review the content of conversations that users report.
The role of these employees, located in countries such as Singapore, the United States and Ireland, is to pass judgment on written and audio messages, photos, videos, memes and other content that appears in each exchange, to determine whether it constitutes fraud, child pornography, blackmail, hate speech, spam or even part of a terrorist plot.
Access to the content is granted by the users themselves when they use the “Report” option in a conversation to flag a possible violation of the tool's terms of use. The report forwards the last five messages in the exchange, in decrypted form, so that an employee can analyze them.
This practice of monitoring the content of WhatsApp users was revealed in an internal company marketing document obtained by ProPublica, which states that this group of more than 1,000 reviewers thoroughly examines millions of pieces of content every week.
On average, a single employee has 600 reports (“tickets”) in their queue, which are then subjected to a second evaluation by an artificial intelligence system.
In an interview with the outlet, WhatsApp communications director Carl Woog acknowledged that the company has a group of employees dedicated to reviewing messages in order to identify and remove the “worst” abusers, but avoided describing that task as “content moderation”.
“The decisions we make regarding how we build the app are centered around the privacy of our users, maintaining a high degree of reliability, and preventing abuse,” the company argued in written statements sent to ProPublica.
According to the investigation, the existence of the WhatsApp content-monitoring group also came to light in an anonymous complaint filed last year with the U.S. Securities and Exchange Commission (SEC), which alleged the use of external contractors, artificial intelligence and other methods to examine users' messages, images and videos.
The report also warns that records and other information about how people use the application have been used by the authorities in several criminal cases, which shows that Facebook provides this type of data, known as metadata, to government agencies.