Tech

PIZZAGATE: Ex-Meta workers confirm encrypted tech let ‘millions of pedophiles target kids’

Earlier this month, Meta introduced end-to-end encryption for direct messages on Facebook and Instagram, a move it said would preserve users’ privacy.

End-to-end encryption prevents anyone other than the sender and the recipient, including the platform itself, from reading a message’s contents.

The rollout came four years after the project was first revealed, and the plan had been a significant source of conflict inside the company. Former Meta engineering director David Erb recently told the Wall Street Journal that he left in 2019 in protest of the project.

While at Meta, Erb warned his superiors that encrypting direct messages on Facebook would shield predators who preyed on children, but he says his warnings went unheeded.

Critics fear that would-be pedophiles can track down children through Facebook’s “People You May Know” feature, which suggests potential friends drawn from a user’s wider social circle.

“It was a hundred times worse than any of us expected,” Erb told the Journal. “There were millions of pedophiles targeting tens of millions of children.”

In May 2020, Karl Quitter, a Chicago-area man, used the alias “Mathew Jones” on Facebook to solicit sexually explicit photos and videos from at least nine teenage girls in the Philippines. Quitter preyed on the victims’ financial difficulties, sending money transfers to their families to entice the girls into producing the images. In one 2020 message to a 16-year-old victim, he promised to send money to her family for medicine and food if she complied with his demands. Facebook investigators flagged Quitter’s messages and turned them over to authorities, according to the Journal. Quitter, 58, pleaded guilty in federal court to sexually exploiting children and was sentenced to 30 years in federal prison.

A Department of Homeland Security investigator involved in the Quitter case told the Journal that Facebook’s “trust and safety team’s ability to access messages was instrumental” in bringing about an arrest. Brian Fitzgerald, head of Homeland Security’s Chicago office, told the Journal that a random stranger shouldn’t be able to move into encrypted communications with a minor.

Meta has sought to minimize the risks posed by end-to-end encryption technology. The company says it has spent years developing safety measures on Facebook and Instagram to prevent and combat abuse and unlawful activity, and that it offers tools that continue to work even with encryption enabled, such as reporting suspected exploitation to the National Center for Missing and Exploited Children.

Meta is the parent company of WhatsApp, the world’s most popular encrypted messaging app. However, WhatsApp users generally communicate with people they already know, unlike Facebook and Instagram, which allow strangers to find each other. Meta’s top social-media competitor, TikTok, does not offer encrypted messaging; the company has said it “place[s] a premium on ensuring that our younger users have a safe experience.” YouTube, owned by Google’s parent company, Alphabet Inc., disabled private messaging in 2019, saying it wanted to focus on improving public conversations.

SOURCES: NY POST, WALL STREET JOURNAL
