Volume 4 Issue 2, 2020, pp. 84-86
rudn.tlcjournal.org
Language unlimited: The science behind our most creative power (a review)
Original work by David Adger published by Oxford University Press 2019 Reviewed by Barry Tomalin
Barry Tomalin Glasgow Caledonian University London barrytomalin@aol.com Date of submission: 11.03.2020 | Date of acceptance for publication: 22.05.2020
Recommended citation format: Tomalin, B. (2020). Language unlimited: The science behind our most creative power (a review). Training, Language and Culture, 4(2), 84-86. doi: 10.22363/2521-442X-2020-4-2-84-86
This is an open access article distributed under the Creative Commons Attribution 4.0 International License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited (CC BY 4.0)
Admired by no less a luminary than Noam Chomsky and by Gretchen McCulloch, author of Because Internet, Language Unlimited comes highly recommended. The author, David Adger, is Professor of Linguistics at Queen Mary University of London, President of the Linguistics Association of Great Britain and a language inventor. Yes, he invented the monsters' language used in the ITV (UK Independent Television) series Beowulf.
Language Unlimited, subtitled The Science Behind Our Most Creative Power, explores the human, and also animal, experience of understanding and using language. The aim of the book is to explore where human language creativity comes from and what its scope and limits are. Drawing on insights from neuroscience, linguistics and psychology, the book has ten chapters with notes, acknowledgements and an index. Written in a very readable style, it uses examples from multiple languages and means of communication to show how the acquisition of language is a basic process in the brain (universal grammar) and is interpreted in different ways and at different levels by different species and by different types of user.
Broadly speaking, Adger supports the key ideas about universal grammar that Noam Chomsky put forward in the 1950s. Using numerous examples of research into child language acquisition, he demonstrates how children in their first years of life are able to pick up language, and how they have an internal brain function that enables them to search for and structure the language they hear from their parents and those around them. This skill, which Chomsky defined as universal grammar, is based in brain functions located in the inferior frontal gyrus. Research in the US by David Poeppel showed how the brain has what Adger describes as a kind of pulse, called brain rhythms, which can sense language structure as well as bodily functions such as breathing. This supports the linguist Otto Jespersen's observation that children, from the words and sentences they hear and understand, construct sentences of their own. In other words, there exists a notion of structure that guides us when we form and understand sentences.
How then do children convert what they hear around them into syntax, the grammatical structure of words and phrases in a language, and how do we learn it? We have already seen how Adger describes structures in the brain that look for order in the things we hear around us as very young children, a brain facility that Chomsky described as universal grammar. Adger also discusses an alternative theory of language acquisition developed by the constructionists. Constructionists argue that young children instinctively search for order in what they hear using two key skills, chunking and analogy. A chunk is a way of grouping items that are in your memory: the more often you experience it, the stronger the chunk becomes, so repetition is essential to the process of chunking. Analogy, on the other hand, is the process of recognising similarities between chunks and, in doing so, building a syntax of acceptable formulations. Constructionists believe that chunking and analogy are not limited to language but operate as part of our general mental processes. Adger contends that although chunking and analogy are useful concepts, they fail to represent the full depth and variety of language.
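The two mechanisms can be pictured with a toy sketch (my illustration, not Adger's; the 'child-directed' utterances and the code below are invented purely for the purpose):

```python
# A toy sketch of the constructionist idea: 'chunks' are word sequences
# strengthened by repetition, and 'analogy' re-uses a frequent chunk as a
# frame for new items. (Invented illustration, not taken from the book.)

from collections import Counter

heard = [
    "more juice", "more juice", "more milk", "want juice",
    "more milk", "want teddy", "more juice",
]

# Chunking: repetition makes a sequence a stronger unit in memory.
chunks = Counter(heard)
print(chunks.most_common(2))   # [('more juice', 3), ('more milk', 2)]

# Analogy: noticing that 'more juice' and 'more milk' share a frame
# ('more ___') licenses combinations the child has never actually heard.
frame_fillers = {utt.split()[1] for utt in chunks if utt.startswith("more ")}
novel = "more teddy"           # produced by analogy, not by rote repetition
print(novel.split()[1] not in frame_fillers)   # True: a new, creative use
```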
For Adger, the key to understanding how language knowledge and ability develops is the concept of 'Merge', put forward by Chomsky in the early 1990s. Chomsky has argued that 'Merge' underlies the syntax of all human languages. 'Merge' allows us to take words and structures and combine them to produce correct sentences. Chomsky's assertion was that 'Merge', combined with the language-specific knowledge that children learn, could capture the syntax of all human languages. In an article published in 2002, Chomsky argued that the concept of 'Merge' is what distinguishes human from animal communication. As he commented, 'Merge' is what allows creativity in communication: it is how we use language to produce new phrases and sentences within a recognised syntactic structure.
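The idea can be pictured with a small sketch (a rough illustration of my own, not Adger's or Chomsky's technical formalism; the labels and the tuple representation below are supplied by hand): a single binary operation, applied over and over, builds hierarchical structure rather than a flat string of words.

```python
# A toy picture of 'Merge': one binary operation that combines two
# syntactic objects into a new, labelled object, applied recursively.
# (Illustrative only; labelling is handled differently in the technical
# literature, and real syntactic objects are not plain tuples.)

def merge(x, y, label="XP"):
    """Combine two syntactic objects into a new one with the given label."""
    return (label, x, y)

# Building 'the cat chased a mouse' by repeated application of Merge:
subject = merge("the", "cat", label="NP")
object_ = merge("a", "mouse", label="NP")
verb_phrase = merge("chased", object_, label="VP")
sentence = merge(subject, verb_phrase, label="S")

print(sentence)
# ('S', ('NP', 'the', 'cat'), ('VP', 'chased', ('NP', 'a', 'mouse')))
```

The point the review stresses is the recursion: the same operation that builds 'a mouse' also builds the whole sentence, which is where the unlimited creativity of the book's title comes from.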
What about sign language, the gestures made by deaf children and adults to communicate? Adger cites research to show that profoundly deaf children born to hearing parents who do not know sign language develop their own language of communication, known as homesign. Homesigning, just like spoken language, has a communicative structure, and homesigners create signs from their minds and from what they see and touch. Citing research by Goldin-Meadow in Chicago and Haviland in Mexico, Adger shows how deaf children find their own ways of communicating with their parents and those around them using their own homesigning language. Most remarkably, in Nicaragua's capital Managua, under a policy known as ISN (Idioma de Señas de Nicaragua), a school brought together children and teenagers who were deaf and who had developed their own systems of homesign with their families. These children learned how to understand each other's homesigning and went as far as converging on a linguistic system usable by everyone - a common language, in fact - and all this while learning to lip-read from specialist teachers.
One of the fascinating features of Adger's analysis is his use of examples from non-human users of language and their ability to recognise language patterns. He cites the example of Kanzi, a bonobo born and brought up in Georgia, USA. Kanzi, unlike Matata, the mother who adopted him, was an excellent communicator, requesting food or toys or asking to go somewhere. Kanzi used pictures on a board devised by researchers, called lexigrams, to communicate his needs. The lead researcher, Susan Savage-Rumbaugh, argued that experiments showed Kanzi's experiences with language were to some extent comparable with those of humans, although other researchers have contested this. However, there is evidence that animals can recognise words as discrete items.
Experiments with chinchillas showed how they reacted to particular sounds, and Adger refers to birds and also rats that can distinguish words by their stress pattern, although not where words fit in a speech stream.
In his research, Adger also explores the importance of what he calls Botlang - the language of artificial intelligence - a subject he began studying as part of his BA degree at Edinburgh University in Scotland in the 1980s. He examines research into how computer programmes like Alexa, Siri and Google Assistant manage to understand what you say, most of the time at any rate, and give you a voiced answer. How do they do it? The answer is the use of big data stores called treebanks, collections of hundreds of thousands of pieces of language data obtained from human interactions on the Internet. For example, when you ask Alexa to give you an alarm call for 8.30 in the morning, to play you the music of a particular singer, or to tune into a particular radio station, the computer programme analyses the sound waves of the request and segments them into words to give the answer, or to say within seconds that it doesn't understand the question. This is known as hard AI (Artificial Intelligence). It doesn't try to understand how humans speak; it simply looks up huge data sets drawn from the Internet and gives you the answer. The problem is that although fairly routine questions are regularly and rather impressively answered, AI doesn't have syntax - it can select sentences but can't create them. That is why more unusual requests can leave it lost: the answer isn't in the database. As Adger concludes, AI has a sense of structure, but it isn't a human sense. It sees visible sequences and reproduces them. Humans sense invisible structures and create sentences.
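The contrast can be caricatured in a few lines of code (an invented, drastically simplified illustration, not how Alexa, Siri or Google Assistant actually work): the system only ever selects from stored answers, so a request that falls outside its data leaves it with nothing to create.

```python
# A deliberately crude picture of 'hard AI' as the review describes it:
# answers are retrieved from a store of past examples rather than built
# with a syntax of the system's own. The tiny data store below is invented
# for illustration; real assistants use vastly larger data and statistical models.

STORED_ANSWERS = {
    "give me an alarm call for 8.30": "Alarm set for 8:30 am.",
    "play some jazz": "Playing jazz from your library.",
    "tune into radio 4": "Tuning in to Radio 4.",
}

def answer(request: str) -> str:
    """Select the stored response whose words overlap most with the request."""
    words = set(request.lower().split())
    best = max(STORED_ANSWERS, key=lambda k: len(words & set(k.split())))
    if words & set(best.split()):
        return STORED_ANSWERS[best]          # it can select a sentence...
    return "Sorry, I don't understand."      # ...but it cannot create one

print(answer("please give me an alarm call for 8.30"))  # routine request: answered
print(answer("compose a short poem about whales"))      # unusual request: lost
```

Nothing here builds a new sentence out of smaller parts, which is exactly the absence of syntax that Adger points to.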
In summary, this is a useful and quite challenging book for linguists and students of linguistics, fascinatingly written and drawing on research from around the world to show how different language groups, people with different types of speech impairment, and animals all search for meaning and for ways to communicate it in their own way. Adger agrees with Chomsky that the search for meaning and the way to communicate it is 'hardwired' into the brain - what he calls a universal grammar. Above all, he asserts the value of Chomsky's concept of 'Merge'. There is, he argues, one human language and we all speak dialects of it. We use language differently, but the building blocks are ultimately the same. 'Merge', therefore, is universal to humanity. 'Merge', he agrees with Chomsky, underlies the syntax of all human languages.