It appears that WhatsApp may not be as private as we all believed. WhatsApp has long marketed itself as a communications service that prioritizes privacy. However, a recent in-depth investigation by ProPublica provides insight into the inner workings of WhatsApp's moderation mechanism and how it functions in practice. The report makes public details that users have been kept in the dark about until now. In short, WhatsApp has the ability to read messages sent between users in certain situations.
![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgc59ZsEacgE1LXItlybZEvMhHo4rhI6qYIHWMVCMxunqjkPl4BVRGxBXoUgOjUjx-1DcgEro8f15YAvcTT1cNrxPc-VvQue4nSebWwh5GzqcSY3E5hOIk6ozlpTMaTrgP3tk5QfgfwrQO7LHBDAd4EIUy8QLnKeisN_Rl4ympQXIVbPy4yiujY0-c9spU/s320/image%20(4).jpg)

© Unsplash/aarn-giri
According to the report, WhatsApp currently employs at least 1,000 contractors who use "special Facebook software" to review content flagged by a machine learning algorithm or reported by other users. The software screens for a variety of material, including child pornography, terrorist activity, and spam.
WhatsApp has said clearly, time and time again, that only the sender and recipient of a message can view their conversations. The instant messaging app's rollout of end-to-end encryption was a major talking point for the company. The fact that WhatsApp operates a moderation system that can view messages through "special software" is directly at odds with that messaging and marketing.
The Justification for the Content Review System on WhatsApp
WhatsApp's justification for the content review mechanism is understandable, and the mechanism itself arguably necessary. The company told ProPublica that the procedure enables it to ban abusive and harmful users. It also said that users must initiate the reporting process before the content review system comes into play. When a user is reported, WhatsApp "unscrambles" the offending message along with the four preceding messages in the conversation. Moderators can view these messages, but neither they nor the "special software" have access to the full chat history.
![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjX3bNujQW1zRyp5N9ty32lm7p7mY-hqrdLTCHGVV13_jZFI613eXTJ-bVvigJzXcEGi9-FP4RbLMw4KvdO5-hplALIz86ENCs_bhO2xnZ6cSbe6dFS1HHCD1GDBW0oPPS3li2ENLWYhC8R-gJKKSxignDtBDPShqQmLV-oULqSqTQ-a6jDTXqnJ6YLJ-k/s320/image%20(5).jpg)

© Unsplash/rachit-tank-lZBs
Moderators then have the option to place the user on a "watch list," ban their account, or dismiss the reported message. The report also asserts that some unencrypted data is readable: moderators can see a reported account's phone number, status message, unique mobile ID, and any groups flagged for questionable behavior. According to ProPublica, further data, such as battery level and signal strength, is accessible as well.
Although having a content moderation system in place makes sense, neither Facebook nor WhatsApp has provided explicit information about the system until now. The problem stems from WhatsApp's ambiguity and lack of openness with its users. Facebook, for its part, believes WhatsApp users have no real issue with the review system. "We're confident people understand that when they make reports to WhatsApp, we receive the content they send us," Facebook told ProPublica, citing user feedback.
Source: ProPublica