    AI Fails the Social Test: New Study Reveals Major Blind Spot

By Johns Hopkins University | May 5, 2025
    AI systems still struggle to understand dynamic social interactions, falling far behind human abilities due to limitations in how these models process complex, real-world scenarios.

    Johns Hopkins study reveals AI models struggle to accurately predict social interactions.

    A recent study led by researchers at Johns Hopkins University reveals that humans outperform current AI models in accurately describing and interpreting social interactions within dynamic scenes. This capability is critical for technologies such as autonomous vehicles and assistive robots, which rely heavily on AI to safely navigate real-world environments.

    The research highlights that existing AI systems struggle to grasp the nuanced social dynamics and contextual cues essential for effectively interacting with people. Furthermore, the findings suggest that this limitation may stem fundamentally from the underlying architecture and infrastructure of current AI models.

    “AI for a self-driving car, for example, would need to recognize the intentions, goals, and actions of human drivers and pedestrians. You would want it to know which way a pedestrian is about to start walking, or whether two people are in conversation versus about to cross the street,” said lead author Leyla Isik, an assistant professor of cognitive science at Johns Hopkins University. “Any time you want an AI to interact with humans, you want it to be able to recognize what people are doing. I think this sheds light on the fact that these systems can’t right now.”

    Kathy Garcia, a doctoral student working in Isik’s lab at the time of the research and co–first author, recently presented the research findings at the International Conference on Learning Representations on April 24.

    Comparing AI and Human Perception

To determine how AI models measure up against human perception, the researchers asked human participants to watch three-second video clips and rate features important for understanding social interactions on a scale of one to five. The clips showed people interacting with one another, performing activities side by side, or acting independently of each other.

    The researchers then asked more than 350 AI language, video, and image models to predict how humans would judge the videos and how their brains would respond to watching. For large language models, the researchers had the AIs evaluate short, human-written captions.

    Participants, for the most part, agreed with each other on all the questions; the AI models, regardless of size or the data they were trained on, did not. Video models were unable to accurately describe what people were doing in the videos. Even image models that were given a series of still frames to analyze could not reliably predict whether people were communicating. Language models were better at predicting human behavior, while video models were better at predicting neural activity in the brain.
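The study's exact analysis isn't reproduced here, but the core comparison — how well a model's predicted ratings track the averaged human ratings across clips — can be sketched as a simple Pearson correlation. Everything below (the toy clip ratings, the `pearson` helper) is illustrative, not taken from the study.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists of ratings."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy data: averaged 1-5 human ratings of "are these people interacting?"
# for six hypothetical clips, versus a model whose predictions barely vary.
humans = [4.8, 1.2, 3.5, 4.9, 2.0, 1.1]
model  = [3.0, 3.2, 3.1, 2.9, 3.0, 3.1]

print(pearson(humans, humans))  # ~1.0: perfect agreement with itself
print(pearson(humans, model))   # low or negative: poor agreement
```

A metric like this would make the study's two findings visible at once: high correlations between individual participants (they "agreed with each other on all the questions") and low correlations between model predictions and the human consensus.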

    A Gap in AI Development

    The results provide a sharp contrast to AI’s success in reading still images, the researchers said.

    “It’s not enough to just see an image and recognize objects and faces. That was the first step, which took us a long way in AI. But real life isn’t static. We need AI to understand the story that is unfolding in a scene. Understanding the relationships, context, and dynamics of social interactions is the next step, and this research suggests there might be a blind spot in AI model development,” Garcia said.

    Researchers believe this is because AI neural networks were inspired by the infrastructure of the part of the brain that processes static images, which is different from the area of the brain that processes dynamic social scenes.

    “There’s a lot of nuances, but the big takeaway is none of the AI models can match human brain and behavior responses to scenes across the board, like they do for static scenes,” Isik said. “I think there’s something fundamental about the way humans are processing scenes that these models are missing.”

    Meeting: International Conference on Learning Representations

Funding: U.S. National Science Foundation, NIH/National Institute of Mental Health
