
      Damning probes find Instagram is key link connecting pedophile rings

      news.movim.eu / ArsTechnica · Thursday, 8 June, 2023 - 15:22


    (Image credit: NurPhoto / Contributor | NurPhoto)

    Instagram has emerged as the most important platform for buyers and sellers of underage sex content, according to investigations from the Wall Street Journal, Stanford Internet Observatory, and the University of Massachusetts Amherst (UMass) Rescue Lab.

    While other platforms play a role in processing payments and delivering content, Instagram is where hundreds of thousands—and perhaps millions—of users search explicit hashtags to uncover illegal "menus" of content that can then be commissioned. Content on offer includes disturbing imagery of children self-harming, "incest toddlers," and minors performing sex acts with animals, as well as opportunities for buyers to arrange illicit meetups with children, the Journal reported.

    Because the child sexual abuse material (CSAM) itself is not hosted on Instagram, platform owner Meta has a harder time detecting and removing these users. Researchers found that even when Meta's trust and safety team does ban users, its efforts are "directly undercut" by Instagram's recommendation system, which allows the networks to quickly reassemble under "backup" accounts that are often listed in the bios of the original accounts precisely so the network can survive bans.



      Facebook furious at FTC after agency proposes ban on monetizing youth data

      news.movim.eu / ArsTechnica · Wednesday, 3 May, 2023 - 17:44


    (Image credit: JOSH EDELSON / Contributor | AFP)

    Facebook has not been doing enough to comply with a 2020 privacy order, the Federal Trade Commission (FTC) announced Wednesday. On top of "continuing to give app developers access to users’ private information" that Meta claimed had been cut off, the FTC alleges that Facebook has caused new harm. Perhaps most alarming is the allegation that Facebook's Messenger Kids product misled parents about who could connect to chat with minors and misrepresented who had access to private youth data.

    Now, the FTC has proposed changes to the 2020 order that would prohibit Facebook owner Meta from launching new products on any of its platforms without procuring written FTC compliance confirmation and prevent the company from monetizing any of the youth data it collects across Facebook, Instagram, WhatsApp, and Oculus.

    “Facebook has repeatedly violated its privacy promises,” Samuel Levine, director of the FTC’s Bureau of Consumer Protection, said in a press release.



      Twitter suspended 400K for child abuse content but only reported 8K to police

      news.movim.eu / ArsTechnica · Monday, 6 February, 2023 - 20:01


    (Image credit: NurPhoto / Contributor | NurPhoto)

    Last week, Twitter Safety tweeted that the platform is now “moving faster than ever” to remove child sexual abuse materials (CSAM). It seems, however, that’s not entirely accurate. Child safety advocates told The New York Times that after Elon Musk took over, Twitter started taking twice as long to remove CSAM flagged by various organizations.

    The platform has since improved and is now removing CSAM almost as fast as it was before Musk’s takeover—responding to reports in less than two days—The Times reported. But there still seem to be issues with its CSAM reporting system that continue to delay response times. In one concerning case, a Canadian organization spent a week notifying Twitter daily—as the illegal imagery of a victim younger than 10 spread unchecked—before Twitter finally removed the content.

    "From our standpoint, every minute that that content's up, it's re-victimizing that child," Gavin Portnoy, vice president of communications for the National Center for Missing and Exploited Children (NCMEC), told Ars. "That's concerning to us."



      UK lawmakers vote to jail tech execs who fail to protect kids online

      news.movim.eu / ArsTechnica · Tuesday, 17 January, 2023 - 16:16


    (Image credit: ilkercelik | E+)

    The United Kingdom wants to become the safest place for children to grow up online. Many UK lawmakers have argued that the only way to guarantee that future is to criminalize tech leaders whose platforms knowingly fail to protect children. Today, the UK House of Commons reached a deal to appease those lawmakers, Reuters reports, with Prime Minister Rishi Sunak’s government agreeing to modify the Online Safety Bill to ensure its passage. It now appears that tech company executives found to be "deliberately" exposing children to harmful content could soon risk steep fines and jail time of up to two years.

    The agreement was reached during the safety bill's remaining stages before a vote in the House of Commons. Next, it will move on to review by the House of Lords, where the BBC reports it will “face a lengthy journey.” Sunak says he will revise the bill to include new terms before it reaches the House of Lords, where lawmakers will have additional opportunities to revise the wording.

    Reports say that tech executives responsible for platforms hosting user-generated content would only be liable if they fail to take “proportionate measures” to prevent exposing children to harmful content, such as materials featuring child sexual abuse, child abuse, eating disorders, and self-harm. Some measures that tech companies can take to avoid jail time and fines of up to 10 percent of a company's global revenue include adding age verification, providing parental controls, and policing content.



      Big Tech sues to block California’s strict online child-safety law

      news.movim.eu / ArsTechnica · Thursday, 15 December, 2022 - 21:08


    (Image credit: Mayte Torres | Moment)

    In the last half of 2022 alone, many services—from game platforms designed with kids in mind to popular apps like TikTok or Twitter catering to all ages—were accused of endangering young users, exposing minors to self-harm and financial and sexual exploitation. Some kids died, their parents sued, and some tech companies were shielded from those legal challenges by Section 230. As regulators and parents alike continue scrutinizing how kids become hooked on favorite web destinations that could put them at risk of serious harm, pressure has mounted on tech companies to take more responsibility for protecting child safety online.

    In the United States, shielding kids from online dangers is still a duty largely left up to parents, and some tech companies would prefer to keep it that way. But by 2024, a first-of-its-kind California online child-safety law is supposed to take effect, designed to shift some of that responsibility onto tech companies. California’s Age-Appropriate Design Code Act (AB 2273) will force tech companies to design products and services with child safety in mind, requiring age verification and limiting features like auto-play or minor account discoverability via friend-finding tools. That won’t happen, however, if NetChoice gets its way.

    The tech industry trade association—with members including Meta, TikTok, and Google—this week sued to block the law, arguing in a complaint that the law is not only potentially unconstitutional but also poses allegedly overlooked harms to minors.



      Twitter ditches Trust and Safety Council as Musk tweets fuel harassment

      news.movim.eu / ArsTechnica · Tuesday, 13 December, 2022 - 16:29


    (Image credit: Anadolu Agency / Contributor | Anadolu)

    Yesterday, Twitter safety chief Ella Irwin was supposed to meet with Twitter’s independent Trust and Safety Council by Zoom for an “open conversation and Q&A,” AP News reported. Instead, council members received an email dismissing them entirely.

    Twitter has declared that it is officially in a “new phase” when it comes to trust and safety.

    “We are reevaluating how best to bring external insights into our product and policy development work,” an email, simply signed "Twitter," informed the council. “As part of this process, we have decided that the Trust and Safety Council is not the best structure to do this.”



      Meta cracks down on teen “sextortion” on Facebook, Instagram

      news.movim.eu / ArsTechnica · Monday, 21 November, 2022 - 21:08


    (Image credit: The Good Brigade | DigitalVision)

    Last year, the National Center for Missing and Exploited Children (NCMEC) released data showing that it received overwhelmingly more reports of child sexual abuse materials (CSAM) from Facebook than any other web service it tracked. Where other popular social platforms like Twitter and TikTok had tens of thousands of reports, Facebook had 22 million.

    Today, Facebook announced new efforts to limit the spread of some of that CSAM on its platforms. Partnering with NCMEC, Facebook is building a “global platform” to prevent “sextortion” by helping “stop the spread of teens’ intimate images online.”

    “We’re working with the National Center for Missing and Exploited Children (NCMEC) to build a global platform for teens who are worried intimate images they created might be shared on public online platforms without their consent,” Antigone Davis, Facebook’s VP, global head of safety, said in a blog post on Monday.



      Section 230 shields TikTok in child’s “Blackout Challenge” death lawsuit

      news.movim.eu / ArsTechnica · Thursday, 27 October, 2022 - 19:05


    (Image credit: Anadolu Agency / Contributor | Anadolu Agency)

    As lawsuits continue piling up against social media platforms for allegedly causing harms to children, a Pennsylvania court has ruled that TikTok is not liable in one case where a 10-year-old named Nylah Anderson died after attempting to complete a “Blackout Challenge” she discovered on her “For You” page.

    The challenge recommends that users choke themselves until they pass out, and Nylah’s mother, Tawainna Anderson, initially claimed that TikTok’s defective algorithm was responsible for knowingly feeding the deadly video to her child. The mother hoped that Section 230 protections under the Communications Decency Act—which grant social platforms immunity for content published by third parties—would not apply in the case, but ultimately, the judge found that TikTok was immune.

    TikTok’s “algorithm was a way to bring the Challenge to the attention of those likely to be most interested in it,” Judge Paul Diamond wrote in a memorandum before issuing his order. “In thus promoting the work of others, Defendants published that work—exactly the activity Section 230 shields from liability. The wisdom of conferring such immunity is something properly taken up with Congress, not the courts.”



      Roblox sued for allegedly enabling young girl’s sexual, financial exploitation

      news.movim.eu / ArsTechnica · Thursday, 6 October, 2022 - 21:20


    (Image credit: SOPA Images / Contributor | LightRocket)

    Through the pandemic, Roblox—the user-created game platform popular with kids—expanded its user base and decided to go public. Within two years, its value shot from less than $4 billion to $45 billion. Now it’s being sued—along with Discord, Snap, and Meta—by a parent who alleges that during the pandemic, Roblox became the gateway enabling multiple adult users to prey on a 10-year-old girl.

    The lawsuit filed Wednesday in the San Francisco Superior Court shows how sexual predators can exploit multiple social platforms at once to cover their tracks while financially and sexually exploiting children. It alleges that, in 2020, Roblox connected a young girl called S.U. with adult men who abused her for months, manipulating her into sending payments using Roblox currency called Robux and inducing her to share explicit photos on Discord and Snapchat through 2021. As the girl grew increasingly anxious and depressed, the lawsuit alleges that Instagram began recommending self-harm content, and ultimately, S.U. had to withdraw from school after multiple suicide attempts.

    Like many similar product liability lawsuits that social platforms have recently faced for allegedly addicting children and causing harms, this new lawsuit seeks to hold platforms accountable for reportedly continuing to promote the use of features that tech companies know can pose severe risks for minor users. And S.U.’s guardian, known as C.U. in the lawsuit, wants platforms to pay for profiting off systems that allegedly recklessly engage child users.
