      UK lawmakers vote to jail tech execs who fail to protect kids online

      news.movim.eu / ArsTechnica · Tuesday, 17 January, 2023 - 16:16 · 1 minute

    The United Kingdom wants to become the safest place for children to grow up online. Many UK lawmakers have argued that the only way to guarantee that future is to criminalize tech leaders whose platforms knowingly fail to protect children. Today, the UK House of Commons reached a deal to appease those lawmakers, Reuters reports, with Prime Minister Rishi Sunak’s government agreeing to modify the Online Safety Bill to ensure its passage. It now appears that tech company executives found to be "deliberately" exposing children to harmful content could soon risk steep fines and jail time of up to two years.

    The agreement was reached during the safety bill's remaining stages before a vote in the House of Commons. Next, it will move on to review by the House of Lords, where the BBC reports it will “face a lengthy journey.” Sunak says he will revise the bill to include new terms before it reaches the House of Lords, where lawmakers will have additional opportunities to revise the wording.

    Reports say that tech executives responsible for platforms hosting user-generated content would only be liable if they fail to take “proportionate measures” to prevent exposing children to harmful content, such as materials featuring child sexual abuse, child abuse, eating disorders, and self-harm. Some measures that tech companies can take to avoid jail time and fines of up to 10 percent of a company's global revenue include adding age verification, providing parental controls, and policing content.

      Musk faces fines if Twitter’s gutted child safety team becomes overwhelmed

      news.movim.eu / ArsTechnica · Tuesday, 29 November, 2022 - 17:40 · 1 minute

    A few weeks ago, Twitter CEO Elon Musk asked his remaining staff for a show of loyalty by prompting them to click a “yes” link in an email. By clicking yes, employees signaled to Musk that they agreed to work longer hours if they wanted to keep their jobs. It was Musk’s way of seeing who on his existing team was truly ready to fall in line behind his “hardcore” efforts to build Twitter 2.0. Musk quickly learned how unattractive his offer was when an overwhelming number of employees did not click yes; among those rejecting Musk’s severe terms was apparently almost half of Twitter’s global team dedicated to preventing child sexual exploitation on the platform.

    Three people familiar with Twitter’s current staffing told Bloomberg that when 2022 started, Twitter had 20 team members responsible for reviewing and escalating reports of child sexual abuse materials (CSAM). Today, after layoffs and resignations, fewer than 10 specialists remain, forming what Bloomberg described as “an overwhelmed skeleton crew.” Although Musk has continually tweeted that blocking CSAM is Twitter’s top priority, even going so far as to invite users to tweet CSAM directly at him, he may already be losing his battle to keep the material off Twitter.

    “Musk didn’t create an environment where the team wanted to stay,” sources told Bloomberg.

      Meta cracks down on teen “sextortion” on Facebook, Instagram

      news.movim.eu / ArsTechnica · Monday, 21 November, 2022 - 21:08

    Last year, the National Center for Missing and Exploited Children (NCMEC) released data showing that it received overwhelmingly more reports of child sexual abuse materials (CSAM) from Facebook than any other web service it tracked. Where other popular social platforms like Twitter and TikTok had tens of thousands of reports, Facebook had 22 million.

    Today, Facebook announced new efforts to limit the spread of some of that CSAM on its platforms. Partnering with NCMEC, Facebook is building a “global platform” to prevent “sextortion” by helping “stop the spread of teens’ intimate images online.”

    “We’re working with the National Center for Missing and Exploited Children (NCMEC) to build a global platform for teens who are worried intimate images they created might be shared on public online platforms without their consent,” Antigone Davis, Facebook’s VP, global head of safety, said in a blog post on Monday.

      Patreon denies child sex trafficking claims in viral TikTok “conspiracy” theory

      news.movim.eu / ArsTechnica · Thursday, 15 September, 2022 - 20:32 · 1 minute

    After a TikTok accusing Patreon of ignoring reports and knowingly profiting off accounts posting child sexual abuse materials (CSAM) attracted hundreds of thousands of views, more TikTokers piled on, generating even more interest. Patreon immediately responded by branding the TikToks as disinformation. In a blog post, Patreon denied the allegations, accusing TikTokers of spreading a “conspiracy that Patreon knowingly hosts illegal and child-exploitative material.”

    According to Patreon, the conspiracy theory sprang from a fake post on a job-posting site; Vice later reported that the site was Glassdoor. The Glassdoor review was posted in August and claimed that Patreon refused to respond to reports of accounts suspected of “selling lewd photographs” of children. As TikTokers described their failed attempts to report these accounts, Patreon laid off members of its security team; Patreon said that “onlookers inaccurately linked” the “small-scale staffing changes we made last week to our security organization” to the allegations. The TikTokers claimed that Patreon laid off its staff specifically for not complying with orders to allow CSAM to stay on the platform.

    “Dangerous and conspiratorial disinformation began circulating on social media recently,” Patreon said. “We want to let all of our creators and patrons know that these claims are unequivocally false and set the record straight.”

      “War upon end-to-end encryption”: EU wants Big Tech to scan private messages

      news.movim.eu / ArsTechnica · Wednesday, 11 May, 2022 - 18:05 · 1 minute

    A European Commission proposal could force tech companies to scan private messages for child sexual abuse material (CSAM) and evidence of grooming, even when those messages are supposed to be protected by end-to-end encryption.

    Online services that receive "detection orders" under the pending European Union legislation would have "obligations concerning the detection, reporting, removal and blocking of known and new child sexual abuse material, as well as solicitation of children, regardless of the technology used in the online exchanges," the proposal says. The plan calls end-to-end encryption an important security tool but essentially orders companies to break that end-to-end encryption by whatever technological means necessary:

    In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation.

    That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.

    A questions-and-answers document describing the plan emphasizes the importance of scanning end-to-end encrypted messages. "NCMEC [National Center for Missing and Exploited Children] estimates that more than half of its CyberTipline reports will vanish with end-to-end encryption, leaving abuse undetected, unless providers take measures to protect children and their privacy also on end-to-end encrypted services," it says.
