
      NSA Buying Bulk Surveillance Data on Americans without a Warrant

      news.movim.eu / Schneier · Monday, 29 January - 20:19

    The NSA has finally admitted to buying bulk data on Americans from data brokers, in response to a query by Senator Ron Wyden.

    This is almost certainly illegal, although the NSA maintains that it is legal until it’s told otherwise.

    Some news articles.


      OpenAI Is Not Training on Your Dropbox Documents—Today

      news.movim.eu / Schneier · Monday, 18 December - 15:45 · 2 minutes

    There’s a rumor flying around the Internet that OpenAI is training foundation models on your Dropbox documents.

    Here’s CNBC. Here’s Boing Boing. Some articles are more nuanced, but there’s still a lot of confusion.

    It seems not to be true. Dropbox isn’t sharing all of your documents with OpenAI. But here’s the problem: we don’t trust OpenAI. We don’t trust tech corporations. And—to be fair—corporations in general. We have no reason to.

    Simon Willison nails it in a tweet:

    “OpenAI are training on every piece of data they see, even when they say they aren’t” is the new “Facebook are showing you ads based on overhearing everything you say through your phone’s microphone.”

    Willison expands this in a blog post, which I strongly recommend reading in its entirety. His point is that these companies have lost our trust:

    Trust is really important. Companies lying about what they do with your privacy is a very serious allegation.

    A society where big companies tell blatant lies about how they are handling our data—and get away with it without consequences—is a very unhealthy society.

    A key role of government is to prevent this from happening. If OpenAI are training on data that they said they wouldn’t train on, or if Facebook are spying on us through our phone’s microphones, they should be hauled in front of regulators and/or sued into the ground.

    If we believe that they are doing this without consequence, and have been getting away with it for years, our intolerance for corporate misbehavior becomes a victim as well. We risk letting companies get away with real misconduct because we incorrectly believed in conspiracy theories.

    Privacy is important, and very easily misunderstood. People both overestimate and underestimate what companies are doing, and what’s possible. This isn’t helped by the fact that AI technology means the scope of what’s possible is changing at a rate that’s hard to appreciate even if you’re deeply aware of the space.

    If we want to protect our privacy, we need to understand what’s going on. More importantly, we need to be able to trust companies to honestly and clearly explain what they are doing with our data.

    On a personal level we risk losing out on useful tools. How many people cancelled their Dropbox accounts in the last 48 hours? How many more turned off that AI toggle, ruling out ever evaluating if those features were useful for them or not?

    And while Dropbox is not sending your data to OpenAI today, it could do so tomorrow with a simple change of its terms of service. So could your bank, your credit card company, your phone company, or any other company that owns your data. Any of the tens of thousands of data brokers could be sending your data to train AI models right now, without your knowledge or consent. (At least, in the US. Hooray for the EU and GDPR.)

    Or, as Thomas Claburn wrote:

    “Your info won’t be harvested for training” is the new “Your private chatter won’t be used for ads.”

    These foundation models want our data. The corporations that have our data want the money. It’s only a matter of time, unless we get serious government privacy regulation.


      Online child safety law blocked after Calif. argued face scans not that invasive

      news.movim.eu / ArsTechnica · Tuesday, 19 September, 2023 - 19:05

    A California law requiring a wide range of platforms to estimate ages of users and protect minors from accessing harmful content appears to be just as unconstitutional as a recently blocked law in Texas requiring age verification to access adult content.

    Yesterday, US District Judge Beth Labson Freeman ordered a preliminary injunction stopping California Attorney General Rob Bonta from enforcing the state's Age-Appropriate Design Code Act (CAADCA), finding that the law likely violates the First Amendment.

    "The Court finds that although the stated purpose of the Act—protecting children when they are online—clearly is important," Freeman wrote, "the CAADCA likely violates the First Amendment."



      CDC to regain control of US hospital data after Trump-era seizure, chaos

      news.movim.eu / ArsTechnica · Monday, 15 August, 2022 - 22:33

    (Photo: Former president Donald Trump, right, listens to Deborah Birx, former coronavirus response coordinator, as she speaks during a news conference at the White House in Washington, DC, on Thursday, April 23, 2020. Credit: Getty | Bloomberg)

    This December, the US Centers for Disease Control and Prevention will finally regain control of national COVID-19 hospital data—which the agency abruptly lost early in the pandemic to an inexperienced private company with ties to then-President Donald Trump.

    As SARS-CoV-2 raged in the summer of 2020, the Trump administration was busy sabotaging the once-premier public health agency . The administration's meddling included stripping the CDC of its power to collect critical data on COVID-19 patients and pandemic resources in hospitals around the country.

    According to multiple investigative reports at the time, then-White House Coronavirus Task Force Coordinator Deborah Birx was frustrated by the CDC's slow and somewhat messy process of collecting and tidying the data submitted by thousands of hospitals. The data included stats on admissions, patient demographics, bed availability, ventilator use, discharges, and personal protective equipment (PPE) supplies.



      Surveillance of Your Car

      news.movim.eu / Schneier · Tuesday, 2 August, 2022 - 08:51

    The Markup has an extensive analysis of connected vehicle data and the companies that are collecting it.

    The Markup has identified 37 companies that are part of the rapidly growing connected vehicle data industry that seeks to monetize such data in an environment with few regulations governing its sale or use.

    While many of these companies stress they are using aggregated or anonymized data, the unique nature of location and movement data increases the potential for violations of user privacy.