      Kremlin-backed actors spread disinformation ahead of US elections

      news.movim.eu / ArsTechnica · Wednesday, 17 April - 21:55


    Kremlin-backed actors have stepped up efforts to interfere with the US presidential election by planting disinformation and false narratives on social media and fake news sites, analysts with Microsoft reported Wednesday.

    The analysts have identified several distinct influence-peddling groups affiliated with the Russian government seeking to sway the election outcome, largely with the objective of reducing US support for Ukraine and sowing domestic infighting. These groups have so far been less active during the current election cycle than they were during previous ones, likely because of a less contested primary season.

    Stoking divisions

    Over the past 45 days, the groups have seeded a growing number of social media posts and fake news articles that attempt to foment opposition to US support of Ukraine and stoke divisions over hot-button issues such as election fraud. The influence campaigns also promote questions about President Biden’s mental health and push claims of corrupt judges. In all, Microsoft has tracked scores of such operations in recent weeks.


      Government-Made Comic Books Try to Fight Election Disinformation

      news.movim.eu / TheIntercept · Monday, 25 March - 21:35 · 7 minutes

    With the 2024 elections looming, the Department of Homeland Security has a little-noticed weapon in its war on disinformation: comic books. Few have read them, but the series is attracting criticism from members of Congress. Calling the comics “creepy,” Rep. Dan Bishop, R-N.C., complained earlier this month that the Cybersecurity and Infrastructure Security Agency-produced series was just another way for the federal government to “trample on the First Amendment” in its zeal to fight so-called disinformation.

    “DC Comics won’t be adding these taxpayer-funded comic books … to their repertoire anytime soon,” cracked Kentucky Sen. Rand Paul’s annual report on government waste released in December.

    The comics read like well-meaning (if corny) attempts to grapple with efforts by foreign governments to influence American public opinion, as articulated in intelligence community assessments. But there is a risk that the federal government’s fight against foreign disinformation positions it as an arbiter of the truth, which raises civil liberties concerns. The efficacy of the DHS “Resilience Series” of comic books is also far from obvious.

    The members of Congress might be comforted to know that few people ever noticed the comics. The Cybersecurity and Infrastructure Security Agency urges users to “share” their “Resilience Series” comics, but a search of the webpage’s address on X shows that it is linked to fewer than a dozen times. CISA also produced glossy-looking YouTube trailers for its two graphic novels that garnered just 4,000 and 6,000 views respectively — a far cry from the hundreds of thousands of views trailers for other graphic novels attract.

    For CISA, disinformation is no laughing matter. “Disinformation is an existential threat to the United States,” declares CISA’s webpage detailing its “Resilience Series” of comic books.

    Third in sales by genre, behind only general fiction and romance novels, graphic novels are particularly popular among the youngest readers. One industry observer notes that in Japan, more paper is used for manga books than for toilet paper. School Library Journal concluded in its graphic novels survey last year that popularity in school libraries increased more than 90 percent year over year. The survey also found that nearly 60 percent of school librarians reported opposition to graphic novels from teachers, parents, and others who didn’t consider them “real books.”

    Though first released in 2020 in anticipation of the Trump–Biden presidential election, the comics were intended to be an evergreen resource in the war on disinformation. “Learn the dangers & risks associated with dis- & misinformation through fictional stories that are inspired by real-world events in @CISAgov’s Resilience Series,” the U.S. Attorney for Nevada posted on X last April.

    CISA produced two graphic novels, “Real Fake” and “Bug Bytes.” “Real Fake” tells the story of Rachel O’Sullivan, a “gamer” and a “patriot” who infiltrates a troll farm circulating false narratives about elections to American voters. “Bug Bytes” addresses disinformation around Covid-19, following Ava Williams, a journalism student who realizes that a malicious cyber campaign spreading conspiracy theories about 5G technology is inspiring attacks on 5G towers.

    “Fellow comic geeks, assemble!” CISA said when the comic books were initially released. “Let’s band together to take on disinformation and misinformation.” The CISA post quotes another X post by the FBI’s Washington field office recommending the graphic novels and exhorting the importance of “finding trusted information.”

    “The resilience series products were released in 2020 and 2021 to raise awareness on tactics of foreign influence and disinformation,” a spokesperson for CISA told The Intercept, noting that, despite continued references by members of Congress and critics, the comic book series has now been discontinued.

    “The problem is not that panels about African troll farms (Real Fake) or homegrown antivaxxers (Bug Bytes) might make readers feel insecure—it’s that they don’t make readers feel insecure enough,” writes Russ Castronovo, director of University of Wisconsin-Madison’s Center for the Humanities and professor of American studies and English, in Public Books magazine. “Or, more precisely, these comics might be judged aesthetic failures because—due to their proximity to propaganda—they leave little space for the vulnerabilities inherent in the act of reading. So, while readers learn that meddling by foreign powers ‘is scary, especially in an election year,’ the graphic fictions commissioned by US cybersecurity assume reading itself to be a process whereby information (as opposed to disinformation) is obtained, questions are answered, and doubts are resolved.”

    Writing in Bulletin of the Atomic Scientists, Thomas Gaulkin said that “the Resilience Series … conjures a certain jingoism peculiar to government publications that can mimic the very threat being addressed.”

    All of which raises the question of what role the Department of Homeland Security should play in adjudicating “media literacy,” as the series webpage says.

    Both “Real Fake” and “Bug Bytes” were written by Clint Watts, a former FBI special agent who works as a contributor to MSNBC and is affiliated with Microsoft’s Threat Analysis Center, and Farid Haque, an education technology entrepreneur who is CEO of London-based Erly Stage Studios and was previously CEO of StartUp Britain, a campaign launched by then-U.K. Prime Minister David Cameron.

    Watts, who writes and speaks about Russian influence campaigns, has testified to Congress on the matter and has been affiliated with a number of think tanks, including the Alliance for Securing Democracy, the German Marshall Fund, and the Foreign Policy Research Institute. Though clearly knowledgeable, Watts can sometimes veer into hyperbole in his own writing — a potent reminder that even experts on disinformation are not infallible.

    “Over the past three years, Russia has implemented and run the most effective and efficient influence campaign in world history,” Watts said in testimony to the Senate Intelligence Committee in 2017. While Russia’s propaganda regarding its first invasion of Ukraine and Crimea was no doubt effective, that employed in 2016 against the U.S. presidential election was “neither well organized nor especially well resourced” according to a detailed study by the Pentagon-backed Rand Corporation. The think tank later concluded that “the impact of Russian efforts in the West has been uncertain.”

    Co-author Haque, according to an interview in Forbes, became involved in the Resilience Series after a chance meeting at a bookstore with actor Mel Brooks’s son, Max Brooks, who would later join Erly Stage’s advisory board and introduce Haque to his American contacts, which included Watts.

    “There is now a real need for schools and public authorities to educate young people on how much fake news there is across all forms of media,” Haque told Forbes.


    Counter-disinformation has become a cottage industry in the federal government, with offices and programs now dedicated to exposing foreign influence, as The Intercept has previously reported. CISA’s Resilience Series webpage directs questions to an email for the Countering Foreign Influence Task Force (not to be confused with the FBI’s own effort, the Foreign Influence Task Force, or the intelligence community’s Foreign Malign Influence Center). In 2021, the CISA Task Force was replaced by a Misinformation, Disinformation, and Malinformation team according to a government audit, which CISA tells The Intercept has now been rolled into something called “the Election Security and Resilience subdivision.” (Malinformation refers to information based on fact but used out of context to mislead, harm, or manipulate, according to CISA.)

    The proliferation of various counter-disinformation entities has been disjointed, prompting the Department of Homeland Security’s own inspector general to conclude that “DHS does not have a unified, department-wide strategy to set overarching goals and objectives for addressing and mitigating threats from disinformation campaigns that appear in social media.”

    CISA’s mission, originally focused on traditional cyber and critical infrastructure security, evolved in the wake of the 2016 election. In the waning days of the Obama administration, Secretary of Homeland Security Jeh Johnson officially designated the election systems as a part of critical infrastructure. Since then, CISA has expanded its focus to include fighting disinformation, arguing that human thought can be said to constitute infrastructure.

    “One could argue we’re in the business of critical infrastructure, and the most critical infrastructure is our cognitive infrastructure, so building that resilience to misinformation and disinformation, I think, is incredibly important,” CISA Director Jen Easterly said in 2021.

    In pursuit of that cognitive infrastructure, CISA launched the Resilience Series, with an eye to mediums that would appeal to popular audiences.

    “We have to find new ways to engage with people through mediums that use soft power and creative messaging, rather than being seen to preach,” Haque said in the Forbes interview.


      DHS Using Hamas to Expand Its Reach on College Campuses

      news.movim.eu / TheIntercept · Sunday, 10 March - 17:03 · 5 minutes

    The Department of Homeland Security is stepping up its efforts to penetrate college campuses under the guise of fighting “foreign malign influence,” according to documents and memos obtained by The Intercept. The push comes at the same time that the DHS is quietly undertaking an effort to influence university curricula in an attempt to fight what it calls disinformation.

    In December, the department’s Homeland Security Academic Partnership Council, or HSAPC, sent a report to Secretary Alejandro Mayorkas outlining a plan to combat college campus unrest stemming from Hamas’s October 7 attack on Israel. DHS has used this advisory body — a sympathetic cohort of academics, consultants, and contractors — to gain support for homeland security objectives and recruit on college campuses.

    In one of the recommendations offered in the December 11 report, the Council writes that DHS should “Instruct [its internal office for state and local law enforcement] to work externally with the [International Association of Campus Law Enforcement Administrators] and [National Association of School Resource Officers] to ask Congress to address laws prohibiting DHS from providing certain resources, such as training and information, to private universities and schools. Current limitations serve as a barrier to yielding maximum optimum results.”

    Legal scholars interviewed by The Intercept are uncertain what specific laws the advisory panel is referring to. The DHS maintains multiple outreach efforts and cooperation programs with public and private universities, particularly with regard to foreign students, and it shares information, even sensitive law enforcement information, with campus police forces. Cooperation regarding the speech and political leanings of students and faculty, however, is far murkier.

    The DHS-funded HSAPC originated in 2012 to bring together higher education and K-12 administrators, local law enforcement officials, and private sector CEOs to open a dialogue between the new department and the American education system. The Council meets on a quarterly basis, with additional meetings scheduled at the discretion of the DHS secretary. The current chair is Elisa Beard, CEO of Teach for America. Other council members include Alberto M. Carvalho, superintendent of the Los Angeles Unified School District; Farnam Jahanian, president of Carnegie Mellon University; Michael H. Schill, president of Northwestern University; Suzanne Walsh, president of Bennett College; and Randi Weingarten, president of the American Federation of Teachers.

    In its December report, the Council recommends that DHS “Immediately address gaps and disconnects in information sharing and clarify DHS resources available to campuses, recognizing the volatile, escalating, and sometimes urgent campus conditions during this Middle East conflict.”

    DHS’s focus on campus protests has President Joe Biden’s blessing, according to the White House. At the end of October, administration officials said they were taking action to combat antisemitism on college campuses, assigning dozens of “cybersecurity and protective security experts at DHS to engage with schools.”

    In response to the White House’s efforts, the Council recommended that Mayorkas “immediately designate an individual to serve as Campus Safety Coordinator and grant them sufficient authority to lead DHS efforts to combat antisemitism and Islamophobia.” That appointment has not yet occurred.

    The Council’s December report says that expansion of homeland security’s effort will “Build a trusting environment that encourages reporting of antisemitic and Islamophobic incidents, threats, and violence.” Through a “partnership approach” promoting collaboration with “federal agencies, campus administrators, law enforcement, and Fusion Centers,” the Council says it hopes that DHS will “establish this culture in lockstep with school officials in communities.” While the Council’s report highlights the critical importance of protecting free speech on campus, it also notes that “Many community members do not understand that free speech comes with limitations, such as threats to physical safety, as well as time, place, and manner restrictions.”

    The recent DHS push for greater impact on campuses wouldn’t be the first time the post-9/11 agency has taken action as a result of anti-war protests. In 2006, an American Civil Liberties Union lawsuit revealed that DHS was monitoring anti-war student groups at multiple California colleges and feeding that information to the Department of Defense. According to documents the ACLU obtained under the Freedom of Information Act, the intelligence collected on student groups was intended “to alert commanders and staff to potential terrorist activity or apprise them of other force protection issues.”

    Mayorkas wrote on November 14 last year that a DHS academic partnership will develop solutions not only to thwart foreign governments’ theft of national security-funded and related research on college campuses but also to actively combat the introduction of “ideas and perspectives” by foreign governments that the government deems contrary to U.S. interests.

    “Colleges and universities may also be seen as a forum to promote the malign actors’ ideologies or to suppress opposing worldviews,” Mayorkas said, adding that “DHS reporting has illuminated the evolving risk of foreign malign influence in higher education institutions.” He says that foreign governments and nonstate actors such as nongovernmental organizations are engaged in “funding research and academic programs, both overt and undisclosed, that promote their own favorable views or outcomes.”

    The three tasks assigned by Mayorkas are:

    • “Guidelines and best practices for higher education institutions to reduce the risk of and counter foreign malign influence.”
    • “Consideration of a public-private partnership to enhance collaboration and information sharing on foreign malign influence.”
    • “An assessment of how the U.S. Government can enhance its internal operations and posture to effectively coordinate and address foreign malign influence-related national security risks posed to higher education institutions.”

    The threat left unspoken in Mayorkas’s memo echoes one voiced by then-Attorney General John Ashcroft in the months after 9/11, when the first traces of the government’s desire to forge a once-unimaginable expansion into American public life rose to the surface.

    “To those who scare peace-loving people with phantoms of lost liberty,” Ashcroft told members of the Senate Judiciary Committee, “my message is this: Your tactics only aid terrorists, for they erode our national unity and diminish our resolve. They give ammunition to … enemies and pause to … friends.”


      Elon Musk’s X allows China-based propaganda banned on other platforms

      news.movim.eu / ArsTechnica · Friday, 16 February - 21:32


    Lax content moderation on X (aka Twitter) has disrupted coordinated efforts between social media companies and law enforcement to tamp down on "propaganda accounts controlled by foreign entities aiming to influence US politics," The Washington Post reported.

    Now propaganda is "flourishing" on X, The Post said, while other social media companies are stuck in endless cycles, watching some of the propaganda that they block proliferate on X, then inevitably spread back to their platforms.

    Meta, Google, and then-Twitter began coordinating takedown efforts with law enforcement and disinformation researchers after Russian-backed influence campaigns manipulated their platforms in hopes of swaying the 2016 US presidential election.


      From toy to tool: DALL-E 3 is a wake-up call for visual artists—and the rest of us

      news.movim.eu / ArsTechnica · Thursday, 16 November - 12:20 · 1 minute

    A composite of three DALL-E 3 AI art generations: an oil painting of Hercules fighting a shark, a photo of the queen of the universe, and a marketing photo of "Marshmallow Menace" cereal. (credit: DALL-E 3 / Benj Edwards)

    In October, OpenAI launched its newest AI image generator—DALL-E 3— into wide release for ChatGPT subscribers. DALL-E can pull off media generation tasks that would have seemed absurd just two years ago—and although it can inspire delight with its unexpectedly detailed creations, it also brings trepidation for some. Science fiction forecast tech like this long ago, but seeing machines upend the creative order feels different when it's actually happening before our eyes.

    "It’s impossible to dismiss the power of AI when it comes to image generation," says Aurich Lawson, Ars Technica's creative director. "With the rapid increase in visual acuity and ability to get a usable result, there’s no question it’s beyond being a gimmick or toy and is a legit tool."

    With the advent of AI image synthesis, it's looking increasingly like the future of media creation for many will come through the aid of creative machines that can replicate any artistic style, format, or medium. Media reality is becoming completely fluid and malleable. But how is AI image synthesis getting more capable so rapidly—and what might that mean for artists ahead?


      100+ researchers say they stopped studying X, fearing Elon Musk might sue them

      news.movim.eu / ArsTechnica · Monday, 6 November - 20:06 · 1 minute


    At a moment when misinformation about the Israel-Hamas war is rapidly spreading on X (formerly Twitter)—mostly by verified X users—many researchers have given up hope that it will be possible to closely monitor this kind of misinformation on the platform, Reuters reported.

    According to a "survey of 167 academic and civil society researchers conducted at Reuters' request by the Coalition for Independent Technology Research" (CITR) in September, more than 100 studies about X have been canceled, suspended, or switched to focus on another platform since Elon Musk began limiting researchers' access to X data last February. Researchers told Reuters that includes studies on hate speech and child safety, as well as research tracking the "spread of false information during real-time events, such as Hamas' attack on Israel and the Israeli airstrikes in Gaza."

    The European Union has already threatened X with fines if the platform fails to stop the spread of Israel/Hamas disinformation. In response, X has reported taking actions to curb misinformation, like removing newly created Hamas-affiliated accounts and accounts manipulating trending topics, working with partner organizations to flag terrorist content, actioning "tens of thousands of posts," and proactively monitoring for antisemitic speech.


      Creators confused by Elon Musk’s plan to “incentivize truth” on X

      news.movim.eu / ArsTechnica · Monday, 30 October - 17:45


    After researchers flagged verified users on X (formerly known as Twitter) as top superspreaders of Israel/Hamas misinformation and the European Union launched a probe into X, Elon Musk has vowed to get verified X users back in check.

    On Sunday, Musk announced that "any posts that are corrected by @CommunityNotes"—X's community-sourced fact-checking feature—will "become ineligible for revenue share."

    "The idea is to maximize the incentive for accuracy over sensationalism," Musk said, warning that "any attempts to weaponize @CommunityNotes to demonetize people will be immediately obvious, because all code and data is open source."


      YouTuber must pay $40K in attorneys’ fees for daft “reverse censorship” suit

      news.movim.eu / ArsTechnica · Friday, 10 March, 2023 - 20:24


    A YouTuber, Marshall Daniels—who has posted far-right-leaning videos under the name “Young Pharaoh” since 2015—tried to argue that YouTube violated his First Amendment rights by removing two videos discussing George Floyd and COVID-19. Years later, Daniels now owes YouTube nearly $40,000 in attorney fees for filing a frivolous lawsuit against YouTube owner Alphabet, Inc.

    A United States magistrate judge in California, Virginia K. DeMarchi, ordered Daniels to pay YouTube $38,576 for asserting a First Amendment claim that “clearly lacked merit and was frivolous from the outset.” YouTube said this figure is a conservative estimate that likely understates the fees it paid defending against the meritless claim.

    In his defense, Daniels never argued that the fees Alphabet was seeking were excessive or could be burdensome. In making this rare decision in favor of the defendant Alphabet, DeMarchi had to consider Daniels’ financial circumstances. In his court filings, Daniels described himself as “a fledgling individual consumer,” but also told the court that he made more than $180,000 in the year before he filed his complaint. DeMarchi ruled that the fees would not be a burden to Daniels.


      Twitter hit with EU yellow card for lack of transparency on disinformation

      news.movim.eu / ArsTechnica · Thursday, 9 February, 2023 - 16:43 · 1 minute


    The European Commission, which is tasked with tackling disinformation online, this week expressed disappointment that Twitter has failed to provide required data that all other major platforms submitted. Now Twitter has been hit with a "yellow card," Reuters reported, and could be subjected to fines if the platform doesn’t fully comply with European Union commitments by this June.

    “We must have more transparency and cannot rely on the online platforms alone for the quality of information,” the commission’s vice president of values and transparency, Věra Jourová, said in a press release. “They need to be independently verifiable. I am disappointed to see that Twitter['s] report lags behind others, and I expect a more serious commitment to their obligations.”

    Earlier this month, the EU’s commissioner for the internal market, Thierry Breton, met with Twitter CEO Elon Musk to ensure that Musk understood what was expected of Twitter under the EU’s new Digital Services Act (DSA). After their meeting, Musk tweeted that the EU’s “goals of transparency, accountability & accuracy of information are aligned” with Twitter’s goals. But he also indicated that Twitter would be relying on Community Notes, which lets users add context to potentially misleading tweets, to satisfy DSA requirements on stopping the spread of misinformation and disinformation. That process seems to be the issue the commission has with Twitter’s unsatisfactory report.
