Facebook seems to possess something like a reverse Midas touch, infecting anything it buys with a particular curse. Social networks that thrived before becoming Facebook properties are transformed into questionable businesses that treat violating privacy as a way to turn a profit. That even includes WhatsApp, whose claim to fame has always been its end-to-end encryption and privacy guarantees. Now it seems there may be a small loophole that gives Facebook just enough flexibility to truly break that trust, usually under the guise of complying with the law.
End-to-end encryption means messages stay scrambled until they reach the intended recipient, and that’s how WhatsApp generally works. There is, however, one particular instance where that doesn’t apply. When a user flags a message as potentially improper, all bets are off, and WhatsApp moderators get a clear, unscrambled view of that message and the material associated with it.
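The core promise of end-to-end encryption can be shown with a toy sketch. This is a simple one-time-pad demonstration, not the Signal protocol WhatsApp actually uses; the point is only that a relay server holding the ciphertext, but not the shared key, cannot read the message.

```python
# Toy illustration of the end-to-end idea (NOT WhatsApp's real protocol):
# only holders of the shared key can unscramble the message; the server
# relaying the ciphertext sees nothing readable.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

key = secrets.token_bytes(32)          # known only to sender and recipient
plaintext = b"meet at noon"
ciphertext = xor_bytes(plaintext, key)  # this is all the relay server sees

assert ciphertext != plaintext                      # server sees gibberish
assert xor_bytes(ciphertext, key) == plaintext      # recipient recovers it
```

Flagging a message sidesteps this entirely: the reporting client already holds the decrypted plaintext, so nothing needs to be "broken" for moderators to read it.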
WhatsApp doesn’t call these nearly 1,000 contract workers “moderators,” preferring the term “content reviewers.” Unlike moderators, who can delete threads or specific messages, these reviewers can only do three things: ignore the report, place the reported account under watch, or ban the account immediately. Much like moderators, however, they get a clear view of the allegedly offending message, the four messages before it, and metadata about the users in the conversation.
Unfortunately, whistleblowers revealed to Pro Publica that WhatsApp’s guidelines for judging reported messages are confusing, arcane, and even disturbing. It doesn’t help that reported messages arrive in every language, and Facebook’s automatic translation tool sometimes mislabels the language entirely. Mistranslations and misunderstandings have caused companies selling shaving razors to be flagged for selling weapons, and people selling bras to be labeled as sexually oriented businesses.
While regular, unflagged messages remain strongly encrypted, metadata about WhatsApp users isn’t, and Facebook can readily hand it over to law enforcement when requested. Big Tech is also worryingly pushing to develop AI that can glean information even from encrypted messages. In the end, even a platform or service that technically offers strong security guarantees still runs on a system of trust, and Facebook’s ownership of WhatsApp doesn’t exactly engender that.