Circuit Of Reason

Many modern scholars have advanced similar views. Starting in the 1960s, Noam Chomsky, a linguist at the Massachusetts Institute of Technology, US, argued that we use language for reasoning and other forms of thought

Carl Zimmer
Published 08.07.24, 05:10 AM
istock.com/Bobboz

For thousands of years, philosophers have argued about the purpose of language. Plato believed it was essential for thinking. Thought “is a silent inner conversation of the soul with itself”, he wrote.

Many modern scholars have advanced similar views. Starting in the 1960s, Noam Chomsky, a linguist at the Massachusetts Institute of Technology, US, argued that we use language for reasoning and other forms of thought. “If there is a severe deficit of language, there will be a severe deficit of thought,” he wrote.


As an undergraduate, Evelina Fedorenko took Chomsky’s class and heard him describe his theory. “I really liked the idea,” she recalled. But she was puzzled by the lack of evidence. “A lot of things he was saying were just stated as if they were facts — the truth.”

Fedorenko went on to become a cognitive neuroscientist at MIT, using brain scanning to investigate how the brain produces language. After 15 years, her research has led her to a startling conclusion: we don’t need language to think.

“When you start evaluating it, you just don’t find support for this role of language in thinking,” she said.

When Fedorenko began this work in 2009, studies had found that the same brain regions required for language were also active when people reasoned or carried out arithmetic. But Fedorenko and other researchers discovered this overlap was a mirage. Part of the trouble with the early results was that the scanners were relatively crude. Scientists made the most of their fuzzy scans by combining the results from all their volunteers, creating an overall average of brain activity.

In her own research, Fedorenko used more powerful scanners and ran more tests on each volunteer. Those steps allowed her and her colleagues to gather enough data to create a fine-grained picture of an individual brain.

The scientists then ran studies to pinpoint brain circuits that were involved in language tasks, such as retrieving words from memory and following rules of grammar. In a typical experiment, volunteers read gibberish, followed by real sentences. The scientists discovered certain brain regions that became active only when volunteers processed actual language.

Each volunteer had a language network — a constellation of regions that become active during language tasks.

“It’s very stable,” Fedorenko said. “If I scan you today, and 10 or 15 years later, it’s going to be in the same place.”

The researchers then scanned the same people as they performed different kinds of thinking, such as solving a puzzle.

“Other regions in the brain are working really hard when you’re doing all these forms of thinking,” she said. But the language networks stayed quiet. “It became clear that none of those things seem to engage language circuits.”

In a paper published in Nature, Fedorenko and her colleagues argued that studies of people with brain injuries point to the same conclusion.

Strokes and other forms of brain damage can wipe out the language network, leaving people struggling to process words and grammar, a condition known as aphasia. But scientists have discovered that people can do algebra and play chess even with aphasia. In experiments, people with aphasia can look at two numbers — 123 and 321, say — and recognise that, by using the same pattern, 456 should be followed by 654.

If language is not essential for thought, then what is language for? Communication, Fedorenko and her colleagues argue.

Chomsky and other researchers have rejected that idea, pointing out the ambiguity of words and the difficulty of expressing our intuitions out loud. “The system is not well designed in many functional respects,” Chomsky once said.

But large studies have suggested that languages have been optimised to transfer information clearly and efficiently.

In one study, researchers found that frequently used words are shorter, making languages easier to learn and speeding the flow of information. In another study, researchers who investigated 37 languages found that the rules of grammar put words close to each other so that their combined meaning is easier to understand.

Kyle Mahowald, a linguist at the University of Texas at Austin, US, who was not involved in the work, said separating thought and language could help explain why artificial intelligence systems like ChatGPT are so good at some tasks and so bad at others.

Computer scientists train these programs on vast amounts of text, uncovering rules about how words are connected. Mahowald suspects that these programs are starting to mimic the language network in the human brain — but falling short on reasoning.

“It’s possible to have very fluent grammatical text that may or may not have coherent underlying thought,” Mahowald said.

But Guy Dove, a philosopher at the University of Louisville, US, thought that Fedorenko and her colleagues were going too far in banishing language from thought — especially complex thoughts.

“When we’re thinking about democracy, we might rehearse conversations about democracy,” he said. “You do not need language to have thoughts, but it can be an enhancement.”

NYTNS
