Tuesday, February 24


Newly available court documents reveal that senior executives at Facebook parent company Meta internally warned that the company's plan to encrypt Facebook Messenger was "irresponsible" because it would cause a massive drop in the detection of child exploitation and terrorism material on the platform. The internal communications, made public in a New Mexico state court case last week but previously unreported, show how the company pressed ahead with CEO Mark Zuckerberg's public push for privacy despite the fears of his top safety officials.

According to a report by news agency Reuters, in a 2019 internal chat exchange – written just as Zuckerberg was preparing to announce the shift to end-to-end encryption – Meta's Head of Content Policy, Monika Bickert, gave a blunt assessment: "We are about to do a bad thing as a company. This is so irresponsible," Bickert said, later adding that the company was making "gross misstatements" about its ability to keep users safe, since encryption would hide the content of messages from its own moderation systems. End-to-end encryption means that only the sender and receiver can read a message.

The court filings include a 2019 document in which Meta's safety team estimated that if encryption had been in place the previous year, total reports of child sexual exploitation imagery would have plummeted from 18.4 million to just 6.4 million – a 65% decrease. The company estimated it would have been unable to alert police to 600 child exploitation cases, 152 terrorism cases, and 9 threatened school shootings.

"There is no way to find the terror attack planning or child exploitation" under this system, Bickert warned in the documents.
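The end-to-end principle described above – the platform relays messages it cannot itself read, which is why server-side content scanning stops working – can be sketched with a toy cipher. This is purely illustrative; real deployments such as Messenger's use the Signal protocol, not a shared-key XOR scheme.

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: applying it twice with the same key restores the data."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Key known only to the two endpoint devices (an assumption of this sketch;
# real systems negotiate keys with the Signal protocol's key exchange).
shared_key = b"known-only-to-the-two-devices"

# The sender encrypts on their own device before anything leaves it.
plaintext = b"meet at noon"
ciphertext = xor_cipher(plaintext, shared_key)

# The platform sees and relays only the ciphertext; without the key it
# cannot inspect the content, so its moderation systems are blind to it.
assert ciphertext != plaintext

# The receiver decrypts on their own device.
assert xor_cipher(ciphertext, shared_key) == plaintext
```

The key point for the safety debate in the filings is the middle step: the relay server holds ciphertext only, so detection would have to move to the endpoints or to metadata.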

What Meta said about the data in the court documents

Meta spokesperson Andy Stone responded that these concerns led Meta to develop additional safety features before the company launched encrypted messaging on Facebook and Instagram in 2023.

"The concerns raised in 2019 represent the very reason we developed a range of new safety features to help detect and prevent abuse, all designed to work in encrypted chats," Stone said. The measures included special accounts for underage users that prevent adults from initiating contact with minors they do not know.


