      Patreon denies child sex trafficking claims in viral TikTok “conspiracy” theory

      news.movim.eu / ArsTechnica · Thursday, 15 September, 2022 - 20:32 · 1 minute

    After a TikTok accusing Patreon of ignoring reports and knowingly profiting off accounts posting child sexual abuse materials (CSAM) attracted hundreds of thousands of views, more TikTokers piled on, generating more interest. Patreon immediately responded by branding the TikToks as disinformation. In a blog post, Patreon denied the allegations, accusing TikTokers of spreading a “conspiracy that Patreon knowingly hosts illegal and child-exploitative material.”

    According to Patreon, the conspiracy theory sprang from a fake post on a job-review site; Vice later reported that the site was Glassdoor. The Glassdoor review was posted in August and claimed that Patreon refused to respond to reports of accounts suspected of “selling lewd photographs” of children. As TikTokers described their failed attempts to report these accounts, Patreon laid off members of its security team, and, Patreon said, “onlookers inaccurately linked” the “small-scale staffing changes we made last week to our security organization.” The TikTokers claimed that Patreon laid off its staff specifically for not complying with orders to allow CSAM to stay on the platform.

    “Dangerous and conspiratorial disinformation began circulating on social media recently,” Patreon said. “We want to let all of our creators and patrons know that these claims are unequivocally false and set the record straight.”

      “War upon end-to-end encryption”: EU wants Big Tech to scan private messages

      news.movim.eu / ArsTechnica · Wednesday, 11 May, 2022 - 18:05 · 1 minute

    A European Commission proposal could force tech companies to scan private messages for child sexual abuse material (CSAM) and evidence of grooming, even when those messages are supposed to be protected by end-to-end encryption.

    Online services that receive "detection orders" under the pending European Union legislation would have "obligations concerning the detection, reporting, removal and blocking of known and new child sexual abuse material, as well as solicitation of children, regardless of the technology used in the online exchanges," the proposal says. The plan calls end-to-end encryption an important security tool but essentially orders companies to break that end-to-end encryption by whatever technological means necessary:

    In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation.

    That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.

    A questions-and-answers document describing the plan emphasizes the importance of scanning end-to-end encrypted messages. "NCMEC [National Center for Missing and Exploited Children] estimates that more than half of its CyberTipline reports will vanish with end-to-end encryption, leaving abuse undetected, unless providers take measures to protect children and their privacy also on end-to-end encrypted services," it says.
