    SciTechDaily

    Have We Been Wrong About Language for 70 Years? New Study Challenges Long-Held Theory

    By Cornell University, January 23, 2026
    Humans can produce endlessly new sentences, but the mental structures that make this possible may be simpler than long assumed. New research suggests that language relies not only on complex grammatical hierarchies, but also on frequently used linear patterns that shape how we process and understand speech. Credit: Shutterstock

    A new study suggests that language may rely less on complex grammar than previously thought.

    Every time we speak, we’re improvising.

    “Humans possess a remarkable ability to talk about almost anything, sometimes putting words together into never-before-spoken or -written sentences,” said Morten H. Christiansen, the William R. Kenan, Jr. Professor of Psychology in the College of Arts and Sciences.

    According to language scientists, this flexibility comes from internal mental representations that help people recognize patterns in language and combine words into meaningful statements. While this ability is fundamental to communication, scientists are still working to understand exactly what those mental patterns look like and how they function, Christiansen said.

    In a new study, Christiansen and co-author Yngwie A. Nielsen of Aarhus University present a different way of thinking about how language is represented in the mind. Their work questions the long-held belief that language depends on highly complex grammatical structures. Although the research focused on English, the authors suggest the results may apply to many languages and could influence future research on how language evolves, how children learn to speak, and how adults acquire new languages.

    From Grammatical Trees to LEGO-Like Building Blocks

    For many years, researchers have assumed that sentence construction depends on an internal grammar that organizes words into layered, hierarchical structures, similar to a branching tree. Christiansen and Nielsen propose a simpler alternative. They suggest that language may rely more on combining familiar building blocks, much like assembling pre-made LEGO pieces (such as a door frame or a wheel set) into a finished structure.

    Under this view, speakers draw on short, linear sequences of word types, including nouns and verbs, rather than relying entirely on abstract grammatical rules. Some of these sequences do not fit neatly into traditional grammar at all, such as “in the middle of the” or “wondered if you.”

    Their study was published in the journal Nature Human Behaviour on January 21.

    Since at least the 1950s, the dominant theory in linguistics has emphasized hierarchical mental structures as a defining feature of human language, Christiansen said. This framework suggests that words and phrases are combined according to grammatical principles into larger units known as constituents. For instance, in the sentence “She ate the cake,” the words “the” and “cake” form the noun phrase “the cake.” That phrase then joins with “ate” to create the verb phrase “ate the cake,” which finally combines with “she” to form a complete sentence.
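    The layered analysis described above can be sketched as a toy data structure. This is purely illustrative, assuming nothing about the researchers' actual tooling: constituents are encoded as nested tuples of the form (label, children...), with standard constituency labels (S, NP, VP, Det, N, V, Pron).

```python
# A minimal sketch of the hierarchical (constituency) analysis of
# "She ate the cake", using plain nested tuples: (label, children...).
# The encoding is illustrative, not any specific parser's output.

# "the" + "cake" form the noun phrase
noun_phrase = ("NP", ("Det", "the"), ("N", "cake"))
# "ate" + the noun phrase form the verb phrase
verb_phrase = ("VP", ("V", "ate"), noun_phrase)
# the subject + the verb phrase form the complete sentence
sentence = ("S", ("NP", ("Pron", "She")), verb_phrase)

def leaves(tree):
    """Read the words back off the tree, left to right."""
    label, *children = tree
    if len(children) == 1 and isinstance(children[0], str):
        return [children[0]]  # a leaf like ("Det", "the")
    return [word for child in children for word in leaves(child)]

print(" ".join(leaves(sentence)))  # She ate the cake
```

    Walking the tree left to right recovers the original word order, which is exactly what the hierarchical view predicts: the linear sentence is a projection of the layered structure.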

    “But not all sequences of words form constituents,” Christiansen and Nielsen wrote in a summary of their paper. “In fact, the most common three- or four-word sequences in language are often nonconstituents, such as ‘can I have a’ or ‘it was in the.’”
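    How such frequent sequences surface in usage data can be sketched with a few lines of Python. The sentences below are invented stand-ins for corpus data (real studies use large tagged corpora), and the counting is an assumption-laden toy, not the authors' method:

```python
from collections import Counter

# Toy "corpus": invented sentences echoing the paper's examples.
sentences = [
    "can I have a coffee".split(),
    "can I have a minute".split(),
    "it was in the box".split(),
    "it was in the garden".split(),
]

def ngrams(words, n):
    """All contiguous n-word sequences in a sentence."""
    return [tuple(words[i:i + n]) for i in range(len(words) - n + 1)]

counts = Counter(g for s in sentences for g in ngrams(s, 4))
# The most frequent four-word sequences here, ("can", "I", "have", "a")
# and ("it", "was", "in", "the"), each seen twice, are nonconstituents:
# neither forms a complete phrase under a traditional grammatical analysis.
print(counts.most_common(2))
```

    The point of the sketch is that frequency counting is blind to constituent boundaries, so the sequences it surfaces need not line up with the units traditional grammar recognizes.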

    The Hidden Role of Nonconstituent Sequences

    Because they don’t conform to grammar, nonconstituent sequences have been overlooked. But they do play a role in a speaker’s knowledge of their language, the researchers found.

    In two studies, an eye-tracking experiment and an analysis of phone conversations, they discovered that linear sequences of word classes can be “primed”: once we hear or read one, we process it faster the next time. That is compelling evidence that these sequences are a key part of our mental representation of language, one that goes beyond the rules of grammar, Christiansen said.

    “I think the main contribution is showing that traditional rules of grammar cannot capture all of the mental representations of language structure,” Nielsen said.

    “It might even be possible to account for how we use language in general with flatter structure,” Christiansen said. “Importantly, if you don’t need the more complex machinery of hierarchical syntax, then this could mean that the gulf between human language and other animal communication systems is much smaller than previously thought.”

    Reference: “Evidence for the representation of non-hierarchical structures in language” by Yngwie A. Nielsen and Morten H. Christiansen, 21 January 2026, Nature Human Behaviour.
    DOI: 10.1038/s41562-025-02387-z



    18 Comments

    1. Robert on January 23, 2026 8:04 am

      I was doing research and work in the computer lab at the local UC college library for a book I was writing. People were friendly and one conversation was with three seniors graduating in Psychology. They talked about their own ‘therapy’ – and replied to my surprise, “Oh yeah, we’re all nuts!” smiling brightly, “Everyone in ‘Psych’ is nutty, we’ve all had therapy, that’s how we became interested in Psychology.”

    2. rob on January 23, 2026 3:48 pm

      English; a mongrel language that accommodates appalling but understandable grammar when spoken.

      Donald Trump speaks English.

      • Clyde Spencer on January 24, 2026 5:25 pm

        Is that an opinion based on scientific research? Or, is it just a thinly-veiled attempt to justify a jab at someone whom you dislike?

    3. fitzceros on January 23, 2026 4:21 pm

      bet.

      • Constellar on January 24, 2026 2:06 pm

        On what?

    4. Bors on January 24, 2026 5:19 am

      tbh, the construction “accommodates appalling but understandable grammar” is somewhat difficult to interpret—if it’s both understandable & “accommodated”, whence “appalling”? what would it mean for a language to *not* accommodate appalling-but-understandable grammar?—and I’m not too sure of the significance of “when spoken” (does it *not* “accommodate” appalling-but-understandable grammar when written?) here, either…

      (…but if you’re determined to stand by the whole statement, then—just FYI—you need a dash or colon after “English”, rather than a semicolon.)

    5. Eric M. Jones on January 24, 2026 8:20 am

      “..sometimes putting words together into never-before-spoken or -written sentences,”

      I am proud of my HS studies in Latin which occasionally allows me to craft words that I have totally invented, but I am certain SHOULD mean thus or so.

    7. Constellar on January 24, 2026 2:22 pm

      I’m willing to bet that if I were to ask the authors of this study what a “linear sequence” of a word class is, they would be unable to answer. If I were to ask them what’s meant by “flatter structure”, they would be equally unable to define. If I were to ask any one of them what made them think that grammar rules play a part in one’s mental representation of his language, I’m certain that a dull stare would be returned to me, despite one of the authors using that exact statement in this article.
      And if you thought that the study authors were bad, they’re nothing compared to the dimly-lit Cornell student who actually wrote this thing. In one sentence he refers to word class, another he says word type. And just try to get the poor sap to define sentence constituents. He’s utterly hopeless!

      • Rick Gabriel on January 24, 2026 8:21 pm

        Of course our human languages are not that far off from animal communication.

        Dog.
        Bark (angry sentences)
        Ngrrrr (warning)
        Whimper (cold, aching)
        Howl (shouts and screams)

        Chicken.
        Crow (Tourette’s)
        Clucks (come eat)
        Korokotokkorokotok (flirty talk)
        Kutakkutakkutak (boy that scared me)

        Cat.
        Hiss (warning and insults)
        Meow (declarative sentence)
        Ngoowww (now! Not now!)
        Eyoww ( catfight language)

        Orcas are probably experts in zingers, insults and bullying.

    8. Manny Corpus on January 24, 2026 3:41 pm

      Odd that Chomsky isn’t mentioned here.

      • pinguino on January 24, 2026 5:54 pm

        I was thinking the same thing. I was wondering about the “we” in the article’s title (“Have We Been Wrong About Language for 70 Years?”). Why “we”? The title should read, “Doubt thrown on Chomsky’s theory of language.”

      • Gigi on January 24, 2026 9:46 pm

        Not really. Chomsky produces the biggest word salads on the planet.

    9. Clyde Spencer on January 24, 2026 5:34 pm

      “Their work questions the long-held belief that language depends on highly complex grammatical structures.”

      Clear that sentence was not. Verbing nouns an abomination is.

    10. Chas Lejardin on January 24, 2026 7:18 pm

      Looks like Chomsky’s out of a job.

      • Not a Communist on January 24, 2026 8:15 pm

        He turned from peddling linguistic nonsense to pushing political nonsense long ago.

    11. Travesty on January 24, 2026 10:13 pm

      Translation: Gracias/grassy ass. Thought-provoking; further research is needed.

    12. Aug on January 26, 2026 2:59 am

      Grammar is built on logic and literature, cultural and intellectual information, a principal frame for cultures and divergent thinking. Linear expression has two functions, which come from human instinct or rapid reaction in order to adapt to the personal, environmental, managerial need …

