
      Twitter ditches free access to data, potentially hindering research

      news.movim.eu / ArsTechnica · Friday, 10 February, 2023 - 16:49

    Image: blue birds with speech bubbles (credit: Sean Gladwell)

    Twitter owner Elon Musk has decided to end free access to Twitter's application programming interface (API), which gives users access to tweet data. The data the platform provides has many different uses; third-party programs like Tweetbot, which helps users customize their feeds, have relied on Twitter's APIs, for example.

    Experts in the field say the move could harm academic research by hindering access to data used in papers that analyze behavior on social media. When USC professor of computer science Kristina Lerman first heard about the move, her team started “scrambling to collect the data we need for some of the projects we have going on this semester,” she told Ars, though the urgency subsided once more details were released.

    Twitter will begin offering basic access to its API for $100 per month. Few details have been released so far, but Twitter’s website shows tiers of access with different limits on how many tweets can be retrieved, along with limits on features like filtering. Higher tiers cost more.


      What are companies doing with D-Wave’s quantum hardware?

      news.movim.eu / ArsTechnica · Monday, 2 January, 2023 - 12:00

    Image credit: Getty Images

    While many companies now offer access to general-purpose quantum computers, those machines aren't currently being used to solve any real-world problems; they're held back by limits on qubit count and qubit quality. Most of their users are either running research projects or simply gaining experience programming the systems, in the expectation that a future computer will be useful.

    There are quantum systems based on superconducting hardware that are being used commercially; it's just that they're not general-purpose computers.

    D-Wave offers what's called a quantum annealer. The hardware is a large collection of linked superconducting devices that use quantum effects to reach energetic ground states for the system. When properly configured, this end state represents the solution to a mathematical problem. Annealers can't solve the same full range of mathematical problems as general-purpose quantum computers, such as the ones made by Google, IBM, and others. But they can be used to solve a variety of optimization problems.
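
    To give a flavor of what configuring an annealer involves, here is a classical stand-in: plain simulated annealing on a tiny, made-up QUBO (quadratic unconstrained binary optimization) instance. The problem, coefficients, and cooling schedule are illustrative assumptions, not D-Wave code, and the quantum hardware performs the analogous relaxation physically rather than in software.

        import math
        import random

        # Toy QUBO: minimize E(x) = sum of Q[i, j] * x[i] * x[j] over all
        # entries, with each x[i] in {0, 1}. This instance rewards turning
        # bits on but penalizes pairs, so the minimum is "exactly one bit
        # set" with E = -1. (Made-up numbers for illustration.)
        Q = {(0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,
             (0, 1): 2.0, (0, 2): 2.0, (1, 2): 2.0}

        def energy(x):
            return sum(c * x[i] * x[j] for (i, j), c in Q.items())

        def anneal(n=3, steps=5000, t_start=2.0, t_end=0.01):
            x = [random.randint(0, 1) for _ in range(n)]
            best, best_e = x[:], energy(x)
            for k in range(steps):
                # geometric cooling schedule (an arbitrary choice here)
                t = t_start * (t_end / t_start) ** (k / steps)
                i = random.randrange(n)
                old_e = energy(x)
                x[i] ^= 1                      # propose flipping one bit
                new_e = energy(x)
                if new_e <= old_e or random.random() < math.exp((old_e - new_e) / t):
                    # accept: keep the flip and track the best state seen
                    if new_e < best_e:
                        best, best_e = x[:], new_e
                else:
                    x[i] ^= 1                  # reject: revert the flip
            return best, best_e

        print(anneal())   # typically a single-bit-on state with energy -1.0

    On D-Wave's machines, the analogous step is setting biases and couplings between qubits so that the hardware's energetic ground state encodes the minimum of the objective.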


      DeepMind’s latest AI project solves programming challenges like a newb

      news.movim.eu / ArsTechnica · Thursday, 8 December, 2022 - 21:15 · 1 minute

    Image: blurred hands typing on a laptop in the dark, with an illuminated keyboard and illegible program code on the screen. If an AI were asked to come up with an image for this article, would it think of The Matrix? (credit: EThamPhoto)

    Google's DeepMind AI division has tackled everything from StarCraft to protein folding. So it's probably no surprise that its creators have eventually turned to what is undoubtedly a personal interest: computer programming. In Thursday's edition of Science, the company describes a system it developed that produces code in response to problems typical of those used in human programming contests.

    On an average challenge, the AI system could score near the top half of participants. But it had a bit of trouble scaling, being less likely to produce a successful program on problems where more code is typically required. Still, the fact that it works at all without having been given any structural information about algorithms or programming languages is a bit of a surprise.

    Rising to the challenge

    Computer programming challenges are fairly simple: People are given a task to complete and produce code that should perform the requested task. In an example given in the new paper, programmers are given two strings and asked to determine whether the shorter of the two could be produced by substituting backspaces for some of the keypresses needed to type the larger one. Submitted programs are then checked to see whether they provide a general solution to the problem or fail when additional examples are tested.
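
    The paper's own solutions aren't reproduced here, but one standard greedy approach to that example problem is short enough to sketch; the function name and test strings below are invented for illustration. The idea: scan both strings from the end, and whenever the last characters can't be matched, treat that keypress as a backspace, which also cancels the character typed before it.

        # Sketch of a standard greedy solution, not the paper's code.
        def can_obtain(typed: str, target: str) -> bool:
            """Can `target` result from typing `typed` while replacing some
            keypresses with backspaces? (A backspace deletes the previously
            typed character, or does nothing if there is none.)"""
            i, j = len(typed) - 1, len(target) - 1
            while i >= 0:
                if j >= 0 and typed[i] == target[j]:
                    i, j = i - 1, j - 1   # last characters match: keep both
                else:
                    i -= 2                # backspace instead of typed[i];
                                          # it also erases the prior character
            return j < 0                  # True iff all of target was matched

        assert can_obtain("ababa", "ba")  # backspace, b, a, backspace, a
        assert not can_obtain("ba", "b")  # no way to keep "b" and drop "a"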


      Robots will roam a university to study “a socio-technical problem”

      news.movim.eu / ArsTechnica · Friday, 11 November, 2022 - 17:23

    Image: a four-legged robot with a blender on its back (credit: Boston Dynamics)

    Will robots take over the world? Will our new machine overlords be generous gods or cruel taskmasters? A new research project isn’t going to answer these questions, but it aims to highlight how humans perceive and interact with some of our automatons in public.

    Researchers at the University of Texas at Austin recently received expanded funding from the National Science Foundation to continue their work studying human-robot interactions. To do this, the team plans to release four-legged robots around the university campus and collect data on what they find. The project will begin in 2023 and run for five years.

    “When we deploy robots in the real world, it's not just a technical problem, it's actually a socio-technical problem,” Joydeep Biswas, assistant professor of computer science in the College of Natural Sciences and member of the research team, told Ars.


      IBM pushes qubit count over 400 with new processor

      news.movim.eu / ArsTechnica · Wednesday, 9 November, 2022 - 22:43 · 1 minute

    Image credit: IBM

    Today, IBM announced the latest generation of its family of avian-themed quantum processors, the Osprey. With more than three times the qubit count of its previous-generation Eagle processor, Osprey is the first to offer more than 400 qubits, which indicates the company remains on track to release the first 1,000-qubit processor next year.

    Despite the high qubit count, there's no need to rush out and re-encrypt all your sensitive data just yet. While the error rates of IBM's qubits have steadily improved, they've still not reached the point where all 433 qubits in Osprey can be used in a single algorithm without a very high probability of an error. For now, IBM is emphasizing that Osprey is an indication that the company can stick to its aggressive road map for quantum computing, and that the work needed to make it useful is in progress.
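
    A back-of-the-envelope calculation shows why. The numbers below are assumptions for illustration, not IBM's published error rates: if each two-qubit gate fails independently with probability p, even a single layer of gates across all 433 qubits completes without error only rarely.

        # Illustrative only: p is an assumed per-gate error rate.
        p = 0.01                 # assumed two-qubit gate error rate (1%)
        gates = 433 // 2         # one layer of two-qubit gates over 433 qubits
        success = (1 - p) ** gates
        print(f"chance the layer runs error-free: {success:.1%}")   # ~11%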

    On the road

    To understand IBM's announcement, it helps to understand the quantum computing market as a whole. There are now a lot of players, from startups to large, established companies like IBM, Google, and Intel. They've bet on a variety of technologies, from trapped atoms to spare electrons to superconducting loops. Pretty much all of them agree that to reach quantum computing's full potential, we need to get to the point where qubit counts are in the tens of thousands and the error rates of individual qubits are low enough that many physical qubits can be linked together into a smaller number of error-corrected logical qubits.


      Why are hard drive companies investing in DNA data storage?

      news.movim.eu / ArsTechnica · Thursday, 15 September, 2022 - 16:28 · 1 minute

    Image: a metal representation of the structure of DNA (credit: Adrienne Bresnahan)

    The research community is excited about the potential of DNA to function as long-term archival storage. That's largely because it's extremely dense, chemically stable for tens of thousands of years, and comes in a format we're unlikely to forget how to read. While there has been some interesting progress, efforts have mostly stayed in the research community because of the high costs and extremely slow read and write speeds. These are problems that need to be solved before DNA-based storage can be practical.

    So we were surprised to hear that storage giant Seagate had entered into a collaboration with a DNA-based storage company called Catalog. To find out how close the company's technology is to being useful, we talked to Catalog's CEO, Hyunjun Park. Park indicated that Catalog's approach is counterintuitive on two levels: It doesn't store data the way you'd expect, and it isn't focusing on archival storage at all.

    A different sort of storage

    DNA is a molecule that can be thought of as a linear array of bases, with each base being one of four distinct chemicals: A, T, C, or G. Typically, each base of the DNA molecule is used to hold two bits of information, with the bit values conveyed by the specific base that is present. So A can encode 00, T can encode 01, C can encode 10, and G can encode 11; with this encoding, the molecule AA would store 0000, while AC would store 0010, and so on. We can synthesize DNA molecules hundreds of bases long with high efficiency, and we can add flanking sequences that provide the equivalent of file system information, telling us which part of a chunk of binary data an individual piece of DNA represents.
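
    That two-bits-per-base scheme is simple enough to write out directly. The sketch below just mirrors the article's example mapping (A=00, T=01, C=10, G=11); real systems, including Catalog's, use different and more elaborate encodings.

        # Direct implementation of the example mapping described above.
        BASE_FOR_BITS = {"00": "A", "01": "T", "10": "C", "11": "G"}
        BITS_FOR_BASE = {base: bits for bits, base in BASE_FOR_BITS.items()}

        def bits_to_dna(bits: str) -> str:
            assert len(bits) % 2 == 0, "two bits per base"
            return "".join(BASE_FOR_BITS[bits[i:i + 2]]
                           for i in range(0, len(bits), 2))

        def dna_to_bits(dna: str) -> str:
            return "".join(BITS_FOR_BASE[base] for base in dna)

        assert bits_to_dna("0000") == "AA"   # the article's first example
        assert bits_to_dna("0010") == "AC"   # and its second
        assert dna_to_bits("AC") == "0010"   # decoding inverts the mapping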


      Quantum computer succeeds where a classical algorithm fails

      news.movim.eu / ArsTechnica · Thursday, 9 June, 2022 - 18:00 · 1 minute

    Image: Google's Sycamore processor (credit: Google)

    People have produced many mathematical proofs showing that quantum computers will vastly outperform traditional computers on a number of algorithms. But the quantum computers we have now are error-prone and don't have enough qubits to allow for error correction. So far, the only demonstrations of an advantage have involved quantum computing hardware evolving out of a random configuration while traditional computers fail to simulate its behavior. Useful calculations remain an exercise for the future.

    But a new paper from Google's quantum computing group moves beyond these sorts of demonstrations, using a quantum computer as part of a system that can help us understand quantum systems in general, rather than just the quantum computer itself. And the researchers show that, even on today's error-prone hardware, the system can outperform classical computers on the same problem.

    Probing quantum systems

    To understand what the new work involves, it helps to step back and think about how we typically understand quantum systems. Since the behavior of these systems is probabilistic, we typically need to measure them repeatedly. The results of these measurements are then imported into a classical computer, which processes them to generate a statistical understanding of the system's behavior. With a quantum computer, by contrast, it can be possible to mirror a quantum state using the qubits themselves, reproduce it as often as needed, and manipulate it as necessary. This method has the potential to provide a route to a more direct understanding of the quantum system at issue.
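
    As a toy version of that conventional workflow (the simulated qubit and its bias are invented for illustration), note how the quality of the classical estimate improves only with the square root of the number of measurements:

        import random
        import statistics

        # Measure a "quantum system" repeatedly, then build statistics
        # classically. The system here is one simulated qubit that reads 1
        # with probability P_TRUE; the value is made up for the demo.
        P_TRUE = 0.3

        def measure() -> int:
            return 1 if random.random() < P_TRUE else 0

        shots = [measure() for _ in range(10_000)]
        estimate = statistics.mean(shots)
        # Statistical error shrinks only as 1/sqrt(number of shots), part of
        # why manipulating copies of a state directly on a quantum computer
        # can be far more sample-efficient for some tasks.
        stderr = (estimate * (1 - estimate) / len(shots)) ** 0.5
        print(f"estimated p = {estimate:.3f} ± {stderr:.3f}")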


      Making blockchain stop wasting energy by getting it to manage energy

      news.movim.eu / ArsTechnica · Sunday, 5 June, 2022 - 12:08

    Image: solar panels. Managing a microgrid might be a case where blockchain is actually useful. (credit: Getty Images)

    One of the worst features of blockchain technologies like cryptocurrency and NFTs is their horrific energy use. When we should be wringing every bit of efficiency out of our electricity use, most blockchains require computers to perform pointless calculations repeatedly.

    The obvious solution is to base blockchains on useful calculations—something we might need to do anyway. Unfortunately, the math involved in a blockchain has to have a very specific property: The solution must be difficult to calculate but easy to verify. Nevertheless, a number of useful calculations have been identified as possible replacements for the ones currently being used in many systems.

    A paper released this week adds another option to the list. Optimization problems are notoriously expensive computationally, but the quality of a candidate solution is relatively easy to evaluate. And in this case, the systems being optimized are small energy grids, meaning the approach could offset some of a blockchain's horrific energy usage.
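
    To make that hard-to-solve/easy-to-verify asymmetry concrete, here is a toy example built on a made-up 0/1 knapsack instance rather than the paper's grid problem: finding the optimum takes an exhaustive search, while scoring any proposed solution is one cheap pass, which is what would let other participants verify claimed work quickly.

        from itertools import product

        # Made-up knapsack instance: values, weights, and capacity are
        # arbitrary numbers chosen for illustration.
        values = [6, 10, 12, 7, 3]
        weights = [1, 2, 3, 2, 1]
        CAPACITY = 5

        def score(choice):
            """Cheap O(n) verification of a candidate solution."""
            total_weight = sum(w for w, c in zip(weights, choice) if c)
            total_value = sum(v for v, c in zip(values, choice) if c)
            return total_value if total_weight <= CAPACITY else -1

        # Expensive O(2^n) search for the optimum.
        best = max(product([0, 1], repeat=len(values)), key=score)
        print(best, score(best))   # (1, 1, 0, 1, 0) with value 23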


      Manipulating photons for microseconds tops 9,000 years on a supercomputer

      news.movim.eu / ArsTechnica · Wednesday, 1 June, 2022 - 23:12 · 1 minute

    Image: Given an actual beam of light, a beamsplitter divides it in two; given individual photons, the behavior becomes more complicated. (credit: Wikipedia)

    Ars Technica's Chris Lee has spent a good portion of his adult life playing with lasers, so he's a big fan of photon-based quantum computing. Even as other forms of physical hardware, like superconducting wires and trapped ions, made progress, it was possible to find him gushing about an optical quantum computer put together by a Canadian startup called Xanadu. But in the year since Xanadu described its hardware, companies using those other technologies have continued to make progress by cutting down error rates, exploring new technologies, and upping the qubit count.

    But the advantages of optical quantum computing didn't go away, and now Xanadu is back with a reminder that it hasn't gone away either. Thanks to some tweaks to the design it described a year ago, Xanadu can now sometimes perform operations with more than 200 qubits. And it has shown that simulating the behavior of just one of those operations on a supercomputer would take 9,000 years, while its optical quantum computer can perform it in just a few dozen milliseconds.

    This is an entirely contrived benchmark: just as with Google's earlier demonstration, the quantum computer is simply being itself while the supercomputer tries to simulate it. The news here is more about the potential of Xanadu's hardware to scale.
