    AI Can Now Learn 100x Faster Without Wasting Energy

By Technical University of Munich (TUM), March 10, 2025
    A faster, smarter AI training method cuts energy use without sacrificing accuracy—ushering in a more sustainable future for AI.

    AI is consuming more energy than ever, with data centers struggling to keep up with demand. A breakthrough training method could change everything, slashing energy use while maintaining accuracy.

    By shifting from traditional iterative training to a probability-based approach, researchers have found a way to optimize neural networks with far less computation. This innovation, inspired by dynamic systems found in nature, has the potential to make AI much greener—without sacrificing performance.

    AI’s Growing Energy Appetite

    AI technologies, including large language models (LLMs), have become an essential part of daily life. However, the computing power needed to support them comes from data centers that consume vast amounts of energy. In Germany alone, data centers used approximately 16 billion kilowatt-hours (kWh) of electricity in 2020—about 1% of the country’s total energy consumption. By 2025, this number is projected to rise to 22 billion kWh.

    New Method: 100x Faster, Similar Accuracy

    As AI applications grow more complex, their energy demands will continue to rise, particularly for training neural networks, which require enormous computational resources. To address this challenge, researchers have developed a new training method that is 100 times faster than conventional approaches while maintaining the same level of accuracy. This breakthrough has the potential to significantly reduce the energy required for AI training.

    Neural networks, which power AI tasks like image recognition and language processing, are modeled after the human brain. They consist of interconnected nodes, or artificial neurons, that process information by assigning weighted values to input signals. When a certain threshold is reached, the signal is passed to the next layer of nodes.
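
To make this concrete, here is a minimal sketch of one such layer in Python. The tanh activation and the treatment of the threshold as an additive bias are illustrative choices on our part, not details taken from the study.

```python
import numpy as np

def layer_forward(x, W, b):
    # Each neuron forms a weighted sum of its inputs; the nonlinearity
    # decides how strongly the signal is passed to the next layer.
    return np.tanh(W @ x + b)

rng = np.random.default_rng(0)
x = rng.normal(size=4)         # incoming signals
W = rng.normal(size=(3, 4))    # connection weights to 3 neurons
b = np.zeros(3)                # biases, playing the role of thresholds
print(layer_forward(x, W, b))  # activations handed to the next layer
```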

    Training these networks is computationally intensive. Initially, parameter values are assigned randomly, often using a normal distribution. The system then repeatedly adjusts these values over many iterations to improve prediction accuracy. Because of the vast number of calculations involved, training neural networks consumes substantial amounts of electricity.
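
For comparison, a bare-bones version of this conventional, iterative loop might look like the following; the toy data, learning rate, and squared-error loss are our own illustrative choices, not the study's setup.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 4))
true_w = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ true_w                        # toy regression target

w = rng.normal(size=4)                # parameters start out random (normal)
lr = 0.05
for step in range(500):               # many small corrective iterations
    grad = 2 * X.T @ (X @ w - y) / len(X)  # gradient of mean squared error
    w -= lr * grad                    # nudge parameters toward lower error
print(np.round(w, 3))                 # close to [1, -2, 0.5, 3] after training
```

Every pass over the data costs a full set of multiplications and gradient evaluations, which is where the bulk of the electricity goes.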

    Smarter Training with Probability-Based Parameters

Felix Dietrich, a professor of Physics-enhanced Machine Learning, and his team have developed a new method. Instead of determining the parameters between the nodes iteratively, their approach uses probabilities: it deliberately samples values at critical locations in the training data, where large and rapid changes in values occur.
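
The papers cited under References spell out the exact construction. As a rough, simplified sketch of the idea (the pair-scoring rule, the filter threshold, and all constants here are our assumptions, not the authors' scheme), hidden-layer weights can be sampled from pairs of training points where the target changes quickly, so that only a single linear solve remains for the output layer:

```python
import numpy as np

def sample_hidden_layer(X, y, n_pairs, rng):
    # Draw random pairs of training points and keep those where the
    # target changes fastest relative to the distance between them.
    i = rng.integers(0, len(X), size=n_pairs)
    j = rng.integers(0, len(X), size=n_pairs)
    dist = np.linalg.norm(X[i] - X[j], axis=1) + 1e-9
    scores = np.abs(y[i] - y[j]) / dist          # "large and rapid changes"
    keep = scores >= np.median(scores)
    i, j = i[keep], j[keep]
    d = X[j] - X[i]
    W = d / (np.linalg.norm(d, axis=1, keepdims=True) ** 2 + 1e-9)
    b = -np.sum(W * (X[i] + X[j]) / 2, axis=1)   # center each neuron on its pair
    return W, b

def fit(X, y, n_pairs=400, rng=None):
    rng = rng or np.random.default_rng(2)
    W, b = sample_hidden_layer(X, y, n_pairs, rng)
    H = np.tanh(X @ W.T + b)                     # hidden features, no iteration
    coef, *_ = np.linalg.lstsq(H, y, rcond=None) # one linear solve, output layer
    return W, b, coef

X = np.random.default_rng(3).uniform(-1, 1, size=(500, 2))
y = np.sin(3 * X[:, 0]) * X[:, 1]
W, b, coef = fit(X, y)
pred = np.tanh(X @ W.T + b) @ coef
print(np.mean((pred - y) ** 2))                  # small error, no gradient steps
```

Because the hidden weights are drawn directly from the data rather than optimized, the expensive iterative loop disappears; only the final least-squares solve touches all training points at once.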

The objective of the current study is to use this approach to learn energy-conserving dynamical systems from data. Such systems change over time according to certain rules; they are found, for example, in climate models and in financial markets.
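
To make "energy-conserving dynamic system" concrete, consider a frictionless pendulum (our example, not one from the study): it evolves under Hamilton's equations, so its total energy stays constant along trajectories, which is exactly the structure a Hamiltonian neural network is meant to preserve.

```python
import numpy as np

def hamiltonian(q, p):
    # total energy of a frictionless pendulum: kinetic + potential
    return p**2 / 2 + (1 - np.cos(q))

def step(q, p, dt=0.01):
    # symplectic Euler: an update rule that (nearly) conserves H
    p = p - dt * np.sin(q)   # dp/dt = -dH/dq
    q = q + dt * p           # dq/dt = +dH/dp
    return q, p

q, p = 1.0, 0.0
E0 = hamiltonian(q, p)
for _ in range(10_000):
    q, p = step(q, p)
print(E0, hamiltonian(q, p))  # the two energies stay very close
```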

    Energy Efficiency Without Compromising Accuracy

    “Our method makes it possible to determine the required parameters with minimal computing power. This can make the training of neural networks much faster and, as a result, more energy efficient,” says Felix Dietrich. “In addition, we have seen that the accuracy of the new method is comparable to that of iteratively trained networks.”

    References:

“Training Hamiltonian Neural Networks without Backpropagation” by Atamert Rahma, Chinmay Datar and Felix Dietrich, Machine Learning and the Physical Sciences Workshop at the 38th Conference on Neural Information Processing Systems (NeurIPS), 2024.
    https://neurips.cc/virtual/2024/99994

“Sampling Weights of Deep Neural Networks” by Erik Lien Bolager, Iryna Burak, Chinmay Datar, Qing Sun and Felix Dietrich, Advances in Neural Information Processing Systems, 2023.
    https://proceedings.neurips.cc/paper_files/paper/2023/hash/c7201deff8d507a8fe2e86d34094e154-Abstract-Conference.html



    3 Comments

    1. Laura Simmons on March 10, 2025 6:57 pm

      This article does an excellent job of explaining the innovative new method for training AI models more efficiently, cutting down energy consumption while maintaining accuracy. The detailed breakdown of how this new probabilistic approach works provides valuable insight into a significant breakthrough in AI sustainability.

    2. Boba on March 11, 2025 1:35 am

      “The same systems are found in climate models and financial models…”

      Which means they’re crap.

• Randy on March 22, 2025 10:31 pm

  Nope… I wrote out a more likely synopsis of why, but this comment page ate my reply before I could hit send.

  To be brief: we already rely on probability-based, likely-outcome results in our daily lives. That is exactly why many people are afraid of changing their known processes, or even their ways of thinking. They stick with older tech because they are unsure about learning new ways of doing things, or too lazy to figure out newer tech.

  For example, many people are more comfortable with internal-combustion cars than with a technology they know little about, EVs, and so they set their minds on never wanting to change.
