      Musk stiffing Google could unleash yet more abuse on Twitter, report says

      news.movim.eu / ArsTechnica · Monday, 12 June, 2023 - 17:03

    In what might be another blow to the stability of Twitter's trust and safety efforts, the company has allegedly stopped paying for Google Cloud and Amazon Web Services (AWS), which host tools that support the platform's safety measures, Platformer reported this weekend.

    According to Platformer, Twitter relies on Google Cloud to host services "related to fighting spam, removing child sexual abuse material, and protecting accounts, among other things." That contract is up for renewal at the end of this month after being negotiated and signed prior to Elon Musk's takeover. Since "at least" March, Twitter has been pushing to renegotiate the contract ahead of renewal—unsurprisingly seeking to lower costs, Platformer reported.

    But now it's unclear if the companies will find agreeable new terms on time or if Musk already intends to cancel the contract. Platformer reported that Twitter is rushing to transition services off the Google Cloud Platform and seemingly plans to drop the contract amid failed negotiations.

      Damning probes find Instagram is key link connecting pedophile rings

      news.movim.eu / ArsTechnica · Thursday, 8 June, 2023 - 15:22

    Instagram has emerged as the most important platform for buyers and sellers of underage sex content, according to investigations from the Wall Street Journal, Stanford Internet Observatory, and the University of Massachusetts Amherst (UMass) Rescue Lab.

    While other platforms play a role in processing payments and delivering content, Instagram is where hundreds of thousands—and perhaps millions—of users search explicit hashtags to uncover illegal "menus" of content that can then be commissioned. Content on offer includes disturbing imagery of children self-harming, "incest toddlers," and minors performing sex acts with animals, as well as opportunities for buyers to arrange illicit meetups with children, the Journal reported.

    Because the child sexual abuse material (CSAM) itself is not hosted on Instagram, platform owner Meta has a harder time detecting and removing these users. Researchers found that even when Meta's trust and safety team does ban users, their efforts are "directly undercut" by Instagram's recommendation system—which allows the networks to quickly reassemble under "backup" accounts that are usually listed in the bios of original accounts for just that purpose of surviving bans.

      Reddit cracked down on revenge porn, creepshots with twofold spike in permabans

      news.movim.eu / ArsTechnica · Wednesday, 29 March, 2023 - 18:20

    A year after Reddit updated its policy on non-consensual intimate image (NCII) sharing—a category that includes everything from revenge porn to voyeurism and accidental nip slips—the social media platform has announced that it has gotten much better at detecting and removing this kind of content. Reddit has also launched a transparency center where users can more easily assess Reddit's ongoing efforts to make the platform safer.

    According to Reddit’s 2022 Transparency Report—which tracks various “ongoing efforts to keep Reddit safe, healthy, and real”—last year Reddit removed much more NCII than it did in 2021. The latest report shows that Reddit removed 473 percent more subreddits and permanently suspended 244 percent more user accounts found to be violating community guidelines by sharing non-consensual intimate media. Previously, Reddit labeled NCII as "involuntary pornography," and the 2022 report still uses that label, reporting that the total number of posts removed was 187,258. That includes non-consensual AI-generated deepfakes, also known as “lookalike” pornography.

    “It’s likely this increase is primarily reflective of our updated policies and increased effectiveness in detecting and removing non-consensual intimate media from Reddit,” the transparency report said.

      Twitter suspended 400K for child abuse content but only reported 8K to police

      news.movim.eu / ArsTechnica · Monday, 6 February, 2023 - 20:01

    Last week, Twitter Safety tweeted that the platform is now “moving faster than ever” to remove child sexual abuse materials (CSAM). It seems, however, that’s not entirely accurate. Child safety advocates told The New York Times that after Elon Musk took over, Twitter started taking twice as long to remove CSAM flagged by various organizations.

    The platform has since improved and is now removing CSAM almost as fast as it was before Musk’s takeover—responding to reports in less than two days—The Times reported. But there still seem to be issues with its CSAM reporting system that continue to delay response times. In one concerning case, a Canadian organization spent a week notifying Twitter daily—as the illegal imagery of a victim younger than 10 spread unchecked—before Twitter finally removed the content.

    "From our standpoint, every minute that that content's up, it's re-victimizing that child," Gavin Portnoy, vice president of communications for the National Center for Missing and Exploited Children (NCMEC), told Ars. "That's concerning to us."

      Former Trump official led feds to Telegram group livestreaming child abuse

      news.movim.eu / ArsTechnica · Friday, 3 February, 2023 - 19:24

    Recently unsealed Cook County court documents reveal how federal investigators in 2020 gained access to encrypted Telegram messages to uncover “a cross-country network of people sexually exploiting children.”

    The Chicago Sun-Times reported that Homeland Security Investigations (HSI) agents based in Arizona launched “Operation Swipe Left” in 2020 to investigate claims of kidnapping, livestreaming child abuse, and production and distribution of child sexual abuse materials (CSAM). That investigation led to criminal charges filed against at least 17 people. The majority of defendants were living in Arizona, but others charged were residents of Illinois, Wisconsin, Washington, DC, California, and South Africa. Ten children were rescued, including four children actively suffering abuse at the time of the rescue. The youngest victim identified was 6 months old, and the oldest was 17 years old.

    Telegram became a preferred tool for defendants in this investigation, many of whom believed that police could never access their encrypted messages. At least one federal prosecutor told a judge that authorities never would have gained access; however, one of the defendants, Adam Hageman, “fully cooperated” with investigators and granted access through his account to offending Telegram groups.

      UK lawmakers vote to jail tech execs who fail to protect kids online

      news.movim.eu / ArsTechnica · Tuesday, 17 January, 2023 - 16:16 · 1 minute

    The United Kingdom wants to become the safest place for children to grow up online. Many UK lawmakers have argued that the only way to guarantee that future is to criminalize tech leaders whose platforms knowingly fail to protect children. Today, the UK House of Commons reached a deal to appease those lawmakers, Reuters reports, with Prime Minister Rishi Sunak’s government agreeing to modify the Online Safety Bill to ensure its passage. It now appears that tech company executives found to be "deliberately" exposing children to harmful content could soon risk steep fines and jail time of up to two years.

    The agreement was reached during the safety bill's remaining stages before a vote in the House of Commons. Next, it will move on to review by the House of Lords, where the BBC reports it will “face a lengthy journey.” Sunak says he will revise the bill to include new terms before it reaches the House of Lords, where lawmakers will have additional opportunities to revise the wording.

    Reports say that tech executives responsible for platforms hosting user-generated content would only be liable if they fail to take “proportionate measures” to prevent exposing children to harmful content, such as materials featuring child sexual abuse, child abuse, eating disorders, and self-harm. Some measures that tech companies can take to avoid jail time and fines of up to 10 percent of a company's global revenue include adding age verification, providing parental controls, and policing content.

      Musk faces fines if Twitter’s gutted child safety team becomes overwhelmed

      news.movim.eu / ArsTechnica · Tuesday, 29 November, 2022 - 17:40 · 1 minute

    A few weeks ago, Twitter CEO Elon Musk asked his remaining staff for a show of loyalty by prompting them to click a "yes" link in an email. By clicking yes, the employees were telling Musk that they agreed to work longer hours—if they could keep their jobs. It was Musk’s way of seeing who on his existing team was truly ready to fall in line behind his “hardcore” efforts to build Twitter 2.0. Musk quickly learned how unattractive his offer was when an overwhelming number of employees did not click yes, and among those rejecting Musk’s severe terms was apparently almost half of Twitter’s global team dedicated to preventing child sexual exploitation on the platform.

    Three people familiar with Twitter’s current staffing told Bloomberg that when 2022 started, Twitter had 20 team members responsible for reviewing and escalating reports of child sexual abuse materials (CSAM). Today, after layoffs and resignations, fewer than 10 specialists remain, forming what Bloomberg described as “an overwhelmed skeleton crew.” It seems that, despite continually tweeting that blocking CSAM is Twitter’s top priority and even going so far as to invite users to tweet CSAM directly at him, Musk may already be losing his battle to keep the material off Twitter.

    “Musk didn’t create an environment where the team wanted to stay,” sources told Bloomberg.

      Meta cracks down on teen “sextortion” on Facebook, Instagram

      news.movim.eu / ArsTechnica · Monday, 21 November, 2022 - 21:08

    Last year, the National Center for Missing and Exploited Children (NCMEC) released data showing that it received overwhelmingly more reports of child sexual abuse materials (CSAM) from Facebook than any other web service it tracked. Where other popular social platforms like Twitter and TikTok had tens of thousands of reports, Facebook had 22 million.

    Today, Facebook announced new efforts to limit the spread of some of that CSAM on its platforms. Partnering with NCMEC, Facebook is building a “global platform” to prevent “sextortion” by helping “stop the spread of teens’ intimate images online.”

    “We’re working with the National Center for Missing and Exploited Children (NCMEC) to build a global platform for teens who are worried intimate images they created might be shared on public online platforms without their consent,” Antigone Davis, Facebook’s VP, global head of safety, said in a blog post on Monday.

      Patreon denies child sex trafficking claims in viral TikTok “conspiracy” theory

      news.movim.eu / ArsTechnica · Thursday, 15 September, 2022 - 20:32 · 1 minute

    After a TikTok accusing Patreon of ignoring reports and knowingly profiting off accounts posting child sexual abuse materials (CSAM) attracted hundreds of thousands of views, more TikTokers piled on, generating more interest. Patreon immediately responded by branding the TikToks as disinformation. In a blog post, Patreon denied the allegations, accusing TikTokers of spreading a “conspiracy that Patreon knowingly hosts illegal and child-exploitative material.”

    According to Patreon, the conspiracy theory sprang from a fake post on a job-posting site; Vice later reported that the site was Glassdoor. The Glassdoor review was posted in August and claimed that Patreon refused to respond to reports of accounts suspected of “selling lewd photographs” of children. Around the same time that TikTokers described their failed attempts to report these accounts, Patreon laid off members of its security team; Patreon said “onlookers inaccurately linked” those “small-scale staffing changes we made last week to our security organization” to the allegations. The TikTokers claimed that Patreon laid off its staff specifically for not complying with orders to allow CSAM to stay on the platform.

    “Dangerous and conspiratorial disinformation began circulating on social media recently,” Patreon said. “We want to let all of our creators and patrons know that these claims are unequivocally false and set the record straight.”
