
A new study suggests that language may rely less on complex grammar than previously thought.
Every time we speak, we’re improvising.
“Humans possess a remarkable ability to talk about almost anything, sometimes putting words together into never-before-spoken or -written sentences,” said Morten H. Christiansen, the William R. Kenan, Jr. Professor of Psychology in the College of Arts and Sciences.
According to language scientists, this flexibility comes from internal mental representations that help people recognize patterns in language and combine words into meaningful statements. While this ability is fundamental to communication, scientists are still working to understand exactly what those mental patterns look like and how they function, Christiansen said.
In a new study, Christiansen and co-author Yngwie A. Nielsen of Aarhus University present a different way of thinking about how language is represented in the mind. Their work questions the long-held belief that language depends on highly complex grammatical structures. Although the research focused on English, the authors suggest the results may apply to many languages and could influence future research on how language evolves, how children learn to speak, and how adults acquire new languages.
From Grammatical Trees to LEGO-Like Building Blocks
For many years, researchers have assumed that sentence construction depends on an internal grammar that organizes words into layered, hierarchical structures, similar to a branching tree. Christiansen and Nielsen propose a simpler alternative. They suggest that language may rely more on combining familiar building blocks, much like assembling pre-made LEGO pieces (such as a door frame or a wheel set) into a finished structure.
Under this view, speakers draw on short, linear sequences of word types, including nouns and verbs, rather than relying entirely on abstract grammatical rules. Some of these sequences do not fit neatly into traditional grammar at all, such as “in the middle of the” or “wondered if you.”
Their study was published in the journal Nature Human Behaviour on January 21.
Since at least the 1950s, the dominant theory in linguistics has emphasized hierarchical mental structures as a defining feature of human language, Christiansen said. This framework suggests that words and phrases are combined according to grammatical principles into larger units known as constituents. For instance, in the sentence “She ate the cake,” the words “the” and “cake” form the noun phrase “the cake.” That phrase then joins with “ate” to create the verb phrase “ate the cake,” which finally combines with “she” to form a complete sentence.
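The layered analysis described above can be pictured as a nested structure. Here is a minimal sketch in Python; the nested tuples are just one way to illustrate hierarchy, not the notation used by linguists or by the study itself:

```python
# Hierarchical (constituent) analysis of "She ate the cake",
# represented as nested tuples: each grouping bundles smaller
# units into a larger constituent.
np = ("the", "cake")     # noun phrase: "the cake"
vp = ("ate", np)         # verb phrase: "ate the cake"
sentence = ("She", vp)   # the complete sentence

def flatten(node):
    """Read the words back out of the tree, left to right."""
    if isinstance(node, str):
        return [node]
    words = []
    for child in node:
        words.extend(flatten(child))
    return words

print(" ".join(flatten(sentence)))  # → She ate the cake
```

Walking the tree from the leaves upward recovers the original word order, which is exactly the property the traditional account relies on: the linear sentence is treated as the surface trace of a deeper branching structure.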
“But not all sequences of words form constituents,” Christiansen and Nielsen wrote in a summary of their paper. “In fact, the most common three- or four-word sequences in language are often nonconstituents, such as ‘can I have a’ or ‘it was in the.’”
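The claim that frequent multiword sequences are often nonconstituents can be illustrated with a simple frequency count over running text. A minimal sketch in Python; the toy corpus below is invented for illustration, and a real analysis would of course run over millions of words of conversational data:

```python
from collections import Counter

def ngrams(words, n):
    """All contiguous n-word sequences in a list of words."""
    return [tuple(words[i:i + n]) for i in range(len(words) - n + 1)]

# Tiny stand-in corpus echoing the article's examples.
text = ("can i have a cup of tea and can i have a biscuit "
        "it was in the kitchen and it was in the hall")
words = text.split()

# Counting four-word chunks surfaces sequences like "can i have a"
# and "it was in the", which cut across constituent boundaries.
counts = Counter(ngrams(words, 4))
for seq, freq in counts.most_common(2):
    print(" ".join(seq), freq)
```

Neither of the top chunks forms a complete noun phrase or verb phrase, yet both recur as units, which is the pattern the authors argue speakers actually store and reuse.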
The Hidden Role of Nonconstituent Sequences
Because they don’t conform to grammar, nonconstituent sequences have been overlooked. But they do play a role in a speaker’s knowledge of their language, the researchers found.
In their experiments, an eye-tracking study and an analysis of phone conversations, they discovered that linear sequences of word classes can be “primed”: once we hear or read such a sequence, we process it faster the next time. That priming effect is compelling evidence that these sequences are part of our mental representation of language, one that goes beyond the rules of grammar, Christiansen said.
“I think the main contribution is showing that traditional rules of grammar cannot capture all of the mental representations of language structure,” Nielsen said.
“It might even be possible to account for how we use language in general with flatter structure,” Christiansen said. “Importantly, if you don’t need the more complex machinery of hierarchical syntax, then this could mean that the gulf between human language and other animal communication systems is much smaller than previously thought.”
Reference: “Evidence for the representation of non-hierarchical structures in language” by Yngwie A. Nielsen and Morten H. Christiansen, 21 January 2026, Nature Human Behaviour.
DOI: 10.1038/s41562-025-02387-z
18 Comments
I was doing research and work in the computer lab at the local UC college library for a book I was writing. People were friendly and one conversation was with three seniors graduating in Psychology. They talked about their own ‘therapy’ – and replied to my surprise, “Oh yeah, we’re all nuts!” smiling brightly, “Everyone in ‘Psych’ is nutty, we’ve all had therapy, that’s how we became interested in Psychology.”
English; a mongrel language that accommodates appalling but understandable grammar when spoken.
Donald Trump speaks English.
Is that an opinion based on scientific research? Or, is it just a thinly-veiled attempt to justify a jab at someone whom you dislike?
bet.
On what?
tbh, the construction “accommodates appalling but understandable grammar” is somewhat difficult to interpret—if it’s both understandable & “accommodated”, whence “appalling”? what would it mean for a language to *not* accommodate appalling-but-understandable grammar?—and I’m not too sure of the significance of “when spoken” (does it *not* “accommodate” appalling-but-understandable grammar when written?) here, either…
(…but if you’re determined to stand by the whole statement, then—just FYI—you need a dash or colon after “English”, rather than a semicolon.)
“..sometimes putting words together into never-before-spoken or -written sentences,”
I am proud of my HS studies in Latin which occasionally allows me to craft words that I have totally invented, but I am certain SHOULD mean thus or so.
On what?
I’m willing to bet that if I were to ask the authors of this study what a “linear sequence” of a word class is, they would be unable to answer. If I were to ask them what’s meant by “flatter structure”, they would be equally unable to define. If I were to ask any one of them what made them think that grammar rules play a part in one’s mental representation of his language, I’m certain that a dull stare would be returned to me, despite one of the authors using that exact statement in this article.
And if you thought that the study authors were bad, they’re nothing compared to the dimly-lit Cornell student who actually wrote this thing. In one sentence he refers to word class; in another he says word type. And just try to get the poor sap to define sentence constituents. He’s utterly hopeless!
Of course our human languages are not that far off from animal communication.
Dog.
Bark (angry sentences)
Ngrrrr (warning)
Whimper (cold, aching)
Howl (shouts and screams)
Chicken.
Crow (Tourette’s)
Clucks (come eat)
Korokotokkorokotok (flirty talk)
Kutakkutakkutak (boy that scared me)
Cat.
Hiss (warning and insults)
Meow (declarative sentence)
Ngoowww (now! Not now!)
Eyoww ( catfight language)
Orcas are probably experts in zingers, insults and bullying.
Odd that Chomsky isn’t mentioned here.
I was thinking the same thing. I was wondering about the “we” in the article’s title (“Have We Been Wrong About Language for 70 Years?”). Why “we”? The title should read, “Doubt thrown on Chomsky’s theory of language.”
Not really. Chomsky produces the biggest word salads on the planet.
“Their work questions the long-held belief that language depends on highly complex grammatical structures.”
Clear that sentence was not. Verbing nouns an abomination is.
Looks like Chomsky’s out of a job.
He turned from peddling linguistic nonsense to pushing political nonsense long ago
Translation: Gracias/grassy ass. Thought-provoking; further research is needed.
Grammar is built on logic and literature, cultural and intellectual information, a principal frame for cultures and divergent thinking. Linear expression has two functions, which come from human instinct or rapid reaction in order to adapt to the personal, environmental, or managerial need …