      Snapchat isn’t liable for connecting 12-year-old to convicted sex offenders

      news.movim.eu / ArsTechnica · Thursday, 22 February - 19:56

    (Image credit: Bloomberg / Contributor | Bloomberg)

    A judge has dismissed a complaint from a parent and guardian of a girl, now 15, who was sexually assaulted when she was 12 years old after Snapchat recommended that she connect with convicted sex offenders.

    According to the court filing, the abuse that the girl, C.O., experienced on Snapchat happened soon after she signed up for the app in 2019. Through its "Quick Add" feature, Snapchat "directed her" to connect with "a registered sex offender using the profile name JASONMORGAN5660." After a little more than a week on the app, C.O. was bombarded with inappropriate images and subjected to sextortion and threats before the adult user pressured her to meet up, then raped her. Cops arrested the adult user the next day, resulting in his incarceration, but his Snapchat account remained active for three years despite reports of harassment, the complaint alleged.

    Two years later, at 14, C.O. connected with another convicted sex offender on Snapchat, a former police officer who offered to give C.O. a ride to school and then sexually assaulted her. The second offender is also currently incarcerated, the judge's opinion noted.

      At Senate AI hearing, news executives fight against “fair use” claims for AI training data

      news.movim.eu / ArsTechnica · Thursday, 11 January - 16:37 · 1 minute

    Danielle Coffey, president and CEO of News Media Alliance; Professor Jeff Jarvis, CUNY Graduate School of Journalism; Curtis LeGeyt, president and CEO of National Association of Broadcasters; and Roger Lynch, CEO of Condé Nast, are sworn in during a Senate Judiciary Subcommittee on Privacy, Technology, and the Law hearing on “Artificial Intelligence and The Future Of Journalism.” (credit: Getty Images)

    On Wednesday, news industry executives urged Congress to clarify that using journalism to train AI assistants like ChatGPT is not fair use, as companies such as OpenAI have claimed. Instead, they would prefer a licensing regime for AI training content that would force Big Tech companies to pay for content, much as rights clearinghouses license music.

    The plea for action came during a US Senate Judiciary Committee hearing titled “Oversight of A.I.: The Future of Journalism,” chaired by Sen. Richard Blumenthal of Connecticut, with Sen. Josh Hawley of Missouri also playing a large role in the proceedings. Last year, the pair of senators introduced a bipartisan framework for AI legislation and held a series of hearings on the impact of AI.

    Blumenthal described the situation as an "existential crisis" for the news industry and cited social media as a cautionary tale for legislative inaction about AI. "We need to move more quickly than we did on social media and learn from our mistakes in the delay there," he said.

      Judge tosses social platforms’ Section 230 blanket defense in child safety case

      news.movim.eu / ArsTechnica · Wednesday, 15 November - 21:05

    (Image credit: ljubaphoto | E+)

    This week, some of the biggest tech companies found out that Section 230 immunity doesn't shield them from complaints alleging that their social media platform designs are defective and harm children and teen users.

    On Tuesday, US district judge Yvonne Gonzalez Rogers ruled that discovery can proceed in a lawsuit documenting individual cases involving hundreds of children and teens allegedly harmed by social media use across 30 states. Their complaint alleged that tech companies were guilty of negligently operating platforms with many design defects—including lack of parental controls, insufficient age verification, complicated account deletion processes, appearance-altering filters, and requirements forcing users to log in to report child sexual abuse materials (CSAM)—and failed to warn young users and their parents about those defects.

    Defendants are companies operating "the world’s most used social media platforms: Meta’s Facebook and Instagram, Google’s YouTube, ByteDance’s TikTok, and Snapchat." All of these companies moved to dismiss the multi-district litigation entirely, hoping that the First Amendment and Section 230 immunity would effectively bar all of the plaintiffs' claims—including, apparently, claims that their motions never addressed.

      YouTube under no obligation to host anti-vaccine advocate’s videos, court says

      news.movim.eu / ArsTechnica · Tuesday, 5 September, 2023 - 20:08

    (Image credit: NurPhoto / Contributor | NurPhoto)

    A prominent anti-vaccine activist, Joseph Mercola, yesterday lost a lawsuit attempting to force YouTube to provide access to videos that were removed from the platform after YouTube banned his channels.

    Mercola had tried to argue that YouTube owed him more than $75,000 in damages for breaching its own user contract and denying him access to his videos. However, in an order dismissing Mercola's complaint, US magistrate judge Laurel Beeler wrote that according to the contract Mercola signed, YouTube was "under no obligation to host" Mercola's content after terminating his channel in 2021 "for violating YouTube’s Community Guidelines by posting medical misinformation about COVID-19 and vaccines."

    "The court found no breach because 'there is no provision in the Terms of Service that requires YouTube to maintain particular content' or be a 'storage site for users’ content,'" Beeler wrote.

      SCOTUS spares Section 230, rules Google, Twitter not liable for aiding ISIS

      news.movim.eu / ArsTechnica · Thursday, 18 May, 2023 - 19:59 · 1 minute

    (Image credit: Bloomberg / Contributor | Bloomberg)

    Today the United States Supreme Court quashed tech industry fears that the nation's highest court might ruin the Internet by holding platforms liable for recommending third-party content, an activity long protected by Section 230 of the Communications Decency Act.

    In a pair of rulings, the Supreme Court found that plaintiffs failed to state a claim when arguing that online platforms like YouTube, Twitter, and Facebook should be held liable for aiding and abetting the Islamic State of Iraq and Syria (ISIS) terrorist enterprise by recommending terrorist content ahead of attacks. As a result, both cases, Twitter v. Taamneh and Gonzalez v. Google, have been remanded to a lower court, and at least for now, the Section 230 immunity shield remains fully intact.

    Supreme Court Justice Clarence Thomas delivered the opinion in the Twitter case. He concluded that allegations that Facebook, Twitter, and YouTube knew for years that "ISIS was using their platforms but failed to stop it from doing so" were "insufficient"—even without considering Section 230 protections—to establish that the social platforms aided and abetted a specific 2017 terrorist attack on the Reina nightclub in Istanbul, Turkey. That attack, carried out for ISIS by Abdulkadir Masharipov, killed 39 victims and injured another 69.

      Man battling Google wins $500K for search result links calling him a pedophile

      news.movim.eu / ArsTechnica · Thursday, 20 April, 2023 - 20:13

    (Image credit: NurPhoto / Contributor | NurPhoto)

    A Montreal man spent years trying to hold Google accountable for search results linking to a defamatory post falsely accusing him of pedophilia that he said ruined his career. Now Google must pay $500,000 after a Quebec Superior Court judge ruled that Google relied on an “erroneous” interpretation of Canadian law in denying the man’s requests to remove the links.

    “Google variously ignored the Plaintiff, told him it could do nothing, told him it could remove the hyperlink on the Canadian version of its search engine but not the US one, but then allowed it to re-appear on the Canadian version after a 2011 judgment of the Supreme Court of Canada in an unrelated matter involving the publication of hyperlinks,” judge Azimuddin Hussain wrote in his decision issued on March 28.

    Google did not immediately respond to Ars’ request to comment.

      Twitter struggles to convince SCOTUS it isn’t bolstering terrorists

      news.movim.eu / ArsTechnica · Wednesday, 22 February, 2023 - 21:13 · 1 minute

    Attorney Eric Schnapper speaks to reporters outside of the US Supreme Court following oral arguments for the case Twitter v. Taamneh on February 22, 2023, in Washington, DC. (credit: Anna Moneymaker / Staff | Getty Images North America)

    Today it was Twitter’s turn to argue before the Supreme Court in another case this week that experts fear could end up weakening Section 230 protections for social networks hosting third-party content. In Twitter v. Taamneh, the Supreme Court must decide whether, under the Justice Against Sponsors of Terrorism Act (JASTA), online platforms should be held liable for aiding and abetting terrorist organizations that are known to be using their services to recruit fighters and plan attacks.

    After close to three hours of arguments, the justices still appeared divided on how to address the complicated question, and Twitter's defense was not as strong as some justices apparently expected it to be.

    Twitter attorney Seth Waxman argued that the social network and other defendants, Google and Meta, should not be liable under JASTA, partly because the act of providing the same general services—which anyone on their platforms can access—does not alone constitute providing substantial assistance to an individual planning a terrorist attack.

      SCOTUS “confused” after hearing arguments for weakening Section 230 immunity

      news.movim.eu / ArsTechnica · Tuesday, 21 February, 2023 - 23:23 · 1 minute

    Jose Hernandez and Beatriz Gonzalez, stepfather and mother of Nohemi Gonzalez, who died in a terrorist attack in Paris in 2015, arrive to speak to the press outside of the US Supreme Court following oral arguments in Gonzalez v. Google on February 21 in Washington, DC. (credit: Drew Angerer / Staff | Getty Images News)

    Today, the Supreme Court heard oral arguments to decide whether Section 230 immunity shields online platforms from liability when they rely on algorithms to make targeted recommendations. Many Section 230 defenders feared that the court might be eager to chip away at the statute’s protections; in the worst-case scenario, they warned, the Supreme Court could doom the Internet as we know it. However, it became clear that the justices were increasingly concerned about the potential large-scale economic impact of any decision that could lead to a crash of the digital economy or an avalanche of lawsuits over targeted recommendations.

    The case before the court, Gonzalez v. Google, asks specifically whether Google should be held liable for allegedly violating federal law that prohibits aiding and abetting a terrorist organization by making targeted recommendations that promoted ISIS videos to YouTube users. If the court decides that Section 230 immunity does not apply, that single decision could impact how all online platforms recommend and organize content, Google and many others have argued.

    “Congress was clear that Section 230 protects the ability of online services to organize content,” Halimah DeLaine Prado, Google's general counsel, told Ars in a statement. “Eroding these protections would fundamentally change how the Internet works, making it less open, less safe, and less helpful.”

      Supreme Court allows Reddit mods to anonymously defend Section 230

      news.movim.eu / ArsTechnica · Friday, 20 January, 2023 - 19:29 · 1 minute

    (Image credit: SOPA Images / Contributor | LightRocket)

    Over the past few days, dozens of tech companies have filed briefs in support of Google in a Supreme Court case that tests online platforms’ liability for recommending content. Obvious stakeholders like Meta and Twitter, alongside popular platforms like Craigslist, Etsy, Wikipedia, Roblox, and Tripadvisor, urged the court to uphold Section 230 immunity in the case or risk muddying the paths users rely on to connect with each other and discover information online.

    Out of all these briefs, however, Reddit’s was perhaps the most persuasive. The platform argued on behalf of everyday Internet users, who it claims could be buried in “frivolous” lawsuits simply for frequenting Reddit if Section 230 is weakened by the court. Unlike other platforms that hire content moderators, the content that Reddit displays is “primarily driven by humans—not by centralized algorithms.” Because of this, Reddit’s brief paints a picture of trolls suing not major social media companies but individuals who get no compensation for their work recommending content in communities. That legal threat extends both to volunteer content moderators, Reddit argued, and to more casual users who collect Reddit “karma” by upvoting and downvoting posts to help surface the most engaging content in their communities.

    “Section 230 of the Communications Decency Act famously protects Internet platforms from liability, yet what’s missing from the discussion is that it crucially protects Internet users—everyday people—when they participate in moderation like removing unwanted content from their communities, or users upvoting and downvoting posts,” a Reddit spokesperson told Ars.
