
    How 1,432 GPUs Cracked Google’s 53-Qubit Quantum Computer

By Science China Press | April 21, 2025
A view of the Google dilution refrigerator, which houses the Sycamore chip. Credit: Google Quantum AI

Researchers have achieved a major leap in the classical simulation of quantum computers, simulating Google’s 53-qubit Sycamore circuit using over 1,400 GPUs and novel algorithmic techniques.

    Their efficient tensor network methods and clever “top-k” sampling approach drastically reduce the memory and computational load needed for accurate simulations. These strategies were validated with smaller test circuits and could shape the future of quantum research, pushing the boundaries of what classical systems can simulate.

    Simulating Google’s Quantum Circuit

A team of researchers has reached a major milestone in the classical simulation of quantum computers by successfully simulating Google’s 53-qubit, 20-layer Sycamore quantum circuit. This was accomplished using 1,432 NVIDIA A100 GPUs and highly optimized parallel algorithms, opening new doors for simulating quantum systems on classical hardware.

Figure 1. A schematic diagram of a quantum random circuit. Credit: ©Science China Press

    Innovations in Tensor Network Algorithms

    At the core of this achievement are advanced tensor network contraction techniques, which efficiently estimate the output probabilities of quantum circuits. To make the simulation feasible, the researchers used slicing strategies to break the full tensor network into smaller, more manageable parts. This significantly reduced memory demands while preserving computational efficiency — making it possible to simulate large quantum circuits with comparatively modest resources.
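
To illustrate the slicing idea, here is a minimal NumPy sketch (not the authors’ code; the tensors and index labels are made up). It contracts a toy three-tensor network once directly and once by fixing one shared index to each of its values in turn and summing the partial results; each fixed value yields a smaller, independent contraction of the kind that can be handed to a separate GPU.

```python
# Minimal illustration (not the authors' code) of tensor-network slicing:
# contract a small network directly, then re-do it by summing over fixed
# values of one "sliced" index, which caps the size of intermediate tensors.
import numpy as np

rng = np.random.default_rng(0)
D = 4  # toy bond dimension
A = rng.normal(size=(D, D, D))   # indices (i, j, k)
B = rng.normal(size=(D, D, D))   # indices (j, l, m)
C = rng.normal(size=(D, D))      # indices (k, l)

# Full contraction over shared indices j, k, l -> result with indices (i, m)
full = np.einsum("ijk,jlm,kl->im", A, B, C)

# Sliced contraction: fix index j to one value at a time and sum the partial
# results. Each slice involves smaller intermediates, and the slices are
# independent, so they can be contracted in parallel (e.g. one per GPU).
sliced = np.zeros((D, D))
for j in range(D):
    sliced += np.einsum("ik,lm,kl->im", A[:, j, :], B[j, :, :], C)

assert np.allclose(full, sliced)
```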

    The team also used a “top-k” sampling method, which selects the most probable bitstrings from the simulation output. By focusing only on these high-probability results, they improved the linear cross-entropy benchmark (XEB) — a key measure of how closely the simulation matches expected quantum behavior. This not only boosted simulation accuracy but also reduced the computational load, making the process faster and more scalable.
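
The sketch below illustrates this post-selection effect with made-up numbers rather than the paper’s data: the linear XEB of a set of bitstrings with ideal probabilities p is 2^n * mean(p) - 1, so restricting the reported samples to the k most probable candidates raises the average probability and, with it, the benchmark score.

```python
# Hedged sketch of top-k post-selection and the linear cross-entropy
# benchmark (XEB). The qubit count and the toy probabilities are
# illustrative only, not the paper's data.
import numpy as np

n_qubits = 10                      # toy example, not the 53-qubit circuit
dim = 2 ** n_qubits
rng = np.random.default_rng(1)

# Pretend these are ideal output probabilities, estimated by tensor-network
# contraction, for a batch of uniformly chosen candidate bitstrings
# (exponentially distributed, as expected for random circuits).
probs = rng.exponential(scale=1.0 / dim, size=5000)

def linear_xeb(p, n):
    """Linear XEB: 2^n * mean(p) - 1 (about 0 for uniform noise, ~1 for ideal sampling)."""
    return (2 ** n) * np.mean(p) - 1.0

# Keep only the k most probable candidates ("top-k" post-selection);
# reporting those bitstrings raises the mean probability and the XEB score.
k = 500
top_k = np.sort(probs)[-k:]

print("XEB over all candidates  :", linear_xeb(probs, n_qubits))
print("XEB over top-k candidates:", linear_xeb(top_k, n_qubits))
```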

Figure 2. Performance of different implementations of Sycamore circuit sampling. Credit: ©Science China Press

    Validating with Smaller Circuits

    To validate their algorithm, the researchers conducted numerical experiments with smaller-scale random circuits, including a 30-qubit, 14-layer gate circuit. The results demonstrated excellent agreement with theoretically predicted XEB values for various tensor contraction sub-network sizes. The top-k method’s enhancement of the XEB value closely aligned with theoretical predictions, affirming the accuracy and efficiency of the algorithm.

Figure 3. Histogram of the probabilities of 3 million post-processed samples from the Sycamore circuit with 53 qubits and 20 cycles. Credit: ©Science China Press

    Streamlining Tensor Contraction Performance

The study also highlighted strategies for reducing the resource requirements of tensor contraction. By refining the order of tensor indices and minimizing inter-GPU communication, the team achieved notable improvements in computational efficiency. Complexity estimates further show that increasing available memory, for example from 80 GB to 640 GB or 5,120 GB, can significantly reduce the time complexity of the contraction. In practice, each computational node was equipped with 8×80 GB of GPU memory, enabling high-performance computing.
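
As a rough back-of-envelope illustration (not the paper’s complexity model), the sketch below estimates how many binary indices would have to be sliced for the largest intermediate tensor to fit in a given amount of memory; every extra sliced index roughly doubles the number of independent sub-contractions, which is why larger memory tiers cut the total work so sharply. The tensor size used here is hypothetical.

```python
# Back-of-envelope sketch (not the paper's complexity model) of the
# memory/time trade-off in sliced tensor contraction.
import math

BYTES_PER_ENTRY = 8  # complex64: two 4-byte floats

def slices_needed(largest_tensor_entries, memory_bytes):
    """Binary indices to slice so one slice of the largest tensor fits in memory."""
    capacity = memory_bytes // BYTES_PER_ENTRY
    if largest_tensor_entries <= capacity:
        return 0
    return math.ceil(math.log2(largest_tensor_entries / capacity))

largest = 2 ** 40  # hypothetical entry count of the biggest intermediate tensor

for gb in (80, 640, 5120):  # memory tiers mentioned in the article
    k = slices_needed(largest, gb * 1024**3)
    print(f"{gb:>5} GB -> slice {k} indices -> {2**k} independent sub-contractions")
```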

    Future of Quantum Simulations

    This breakthrough not only establishes a new benchmark for classical simulations of multi-qubit quantum computers but also introduces innovative tools and methodologies for future quantum computing research. By continuing to refine algorithms and optimize computational resources, the researchers anticipate making substantial progress in simulating larger quantum circuits with more qubits. This work represents a significant advancement in quantum computing, offering valuable insights for the ongoing development of quantum technologies.

    Reference: “Leapfrogging Sycamore: harnessing 1432 GPUs for 7× faster quantum random circuit sampling” by Xian-He Zhao, Han-Sen Zhong, Feng Pan, Zi-Han Chen, Rong Fu, Zhongling Su, Xiaotong Xie, Chaoxing Zhao, Pan Zhang, Wanli Ouyang, Chao-Yang Lu, Jian-Wei Pan and Ming-Cheng Chen, 12 September 2024, National Science Review.
    DOI: 10.1093/nsr/nwae317
