Language

How language changes the way you hear music

In a new paper, written together with Roel Willems and Peter Hagoort, I show that music and language are tightly coupled in the brain. Get the gist in a 180-second YouTube clip and then try out what my participants did.

The task my participants had to do might sound very abstract to you, so let me make it concrete. Listen to these two music pieces and tell me which one sounds ‘finished’:

I bet you thought the second one ended a bit in an odd way. How do you know? You use your implicit knowledge of harmonic relations in Western music for such a ‘finished judgement’. All we did in the paper was to see whether an aspect of language grammar (syntax) can influence your ability to hear these harmonic relations, as revealed by ‘finished judgements’. The music pieces we used for this sounded very similar to what you just heard:

It turns out that reading syntactically difficult sentences while hearing the music reduced the feeling that music pieces like this actually ended well. This indicated that processing language syntax draws on brain resources which are also responsible for music harmony.

Difficult syntax: The surgeon consoled the man and the woman put her hand on his forehead.

Easy syntax: The surgeon consoled the man and the woman because the operation had not been successful.

Curiously, sentences with a difficult meaning had no influence on the ‘finished judgements’.

Difficult meaning: The programmer let his mouse run around on the table after he had fed it.

Easy meaning: The programmer let his field mouse run around on the table after he had fed it.

Because only language syntax influenced ‘finished judgements’, we believe that music and language share a common syntax processor of some kind. This conclusion is in line with a number of other studies which I blogged about before.

What this paper adds is that we rule out an attentional link between music and language as the source of the effect. In other words, difficult syntax doesn’t simply distract you and thereby disable your music hearing. Its influence is based on a common syntax processor instead.

In the end, I tested 278 participants across 3 pre-tests, 2 experiments, and 1 post-test. Judge for yourself whether it was worth it by reading the freely available paper here.

— — —

Kunert, R., & Slevc, L.R. (2015). A commentary on: “Neural overlap in processing music and speech”. Frontiers in Human Neuroscience, 9. PMID: 26089792

Kunert, R., Willems, R., & Hagoort, P. (2016). Language influences music harmony perception: Effects of shared syntactic integration resources beyond attention. Royal Society Open Science, 3(2). DOI: 10.1098/rsos.150685

Do music and language share brain resources?

When you listen to some music and when you read a book, does your brain use the same resources? This question goes to the heart of how the brain is organised – does it distinguish between cognitive domains like music and language? In a new commentary I highlight a successful approach which helps to answer this question.

On some isolated island in academia, the tree of knowledge has the form of a brain.

How do we read? What is the brain doing in this picture?

When reading the following sentence, check carefully when you are surprised at what you are reading:

After | the trial | the attorney | advised | the defendant | was | likely | to commit | more crimes.

I bet it was on the segment ‘was’. You probably thought that the defendant was advised, rather than that someone else was advised about the defendant. Once you read the word ‘was’ you need to reinterpret what you have just read. In 2009, Bob Slevc and colleagues found out that background music can change your reading of this kind of sentence. If you hear a chord which is harmonically unexpected, you have even more trouble with the reinterpretation of the sentence on reading ‘was’.
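For a concrete sense of how such reading experiments work, here is a minimal sketch of a self-paced reading trial in Python. Real studies use dedicated presentation software and precise response boxes; this console version is only an illustration of the paradigm.

```python
# Minimal sketch of a self-paced reading trial: segments appear one at a
# time and the time between key presses is recorded. For illustration
# only; real experiments use proper presentation software and timing.
import time

sentence = ("After | the trial | the attorney | advised | "
            "the defendant | was | likely | to commit | more crimes.")
segments = sentence.split(" | ")

reading_times = []
for segment in segments:
    start = time.monotonic()
    input(segment + " ")  # participant presses Enter to reveal the next segment
    reading_times.append(time.monotonic() - start)

# A slowdown at 'was' would signal the reinterpretation described above.
for segment, rt in zip(segments, reading_times):
    print(f"{segment:>15}: {rt:.2f} s")
```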

Why does music influence language?

Why would an unexpected chord be problematic for reading surprising sentences? The most straightforward explanation is that unexpected chords are odd. So they draw your attention. To test this simple explanation, Slevc tried out an unexpected instrument playing the chord in a harmonically expected way. No effect on reading. Apparently, not just any odd chord changes your reading. The musical oddity has to stem from the harmony of the chord. Why this is the case is a matter of debate between scientists. What this experiment makes clear, though, is that music can influence language via shared resources which have something to do with harmony processing.

Why ignore the fact that music influences language?

None of this was mentioned in a recent review by Isabelle Peretz and colleagues on this topic. They looked at where in the brain music and language show activations, as revealed in MRI brain scanners. This is just one way to find out whether music and language share brain resources. They concluded that ‘the question of overlap between music and speech processing must still be considered as an open question’. Peretz and colleagues call for ‘converging evidence from several methodologies’ but fail to mention the evidence from non-MRI methodologies.1

Sure, one has to focus on something, but it annoys me that people tend to focus on methods (especially fancy, expensive methods like MRI scanners) rather than on answers (especially answers from elegant but cheap research into human behaviour, like reading). So I decided to write a commentary together with Bob Slevc. We list no fewer than ten studies which used a similar approach to the one outlined above. Why ignore these results?

Had Peretz and colleagues truly looked at ‘converging evidence from several methodologies’, they would have asked themselves why music sometimes influences language and why it sometimes does not. The debate is in full swing and already beyond the earlier question of whether music and language share brain resources. Instead, researchers now ask what kind of resources are shared.

So, yes, music and language appear to share some brain resources. Perhaps this is not easily visible in MRI brain scanners. Looking at how people read while chord sequences play in the background is one way to show it.

— — —
Kunert, R., & Slevc, L.R. (2015). A commentary on “Neural overlap in processing music and speech” (Peretz et al., 2015). Frontiers in Human Neuroscience, 9. DOI: 10.3389/fnhum.2015.00330

Peretz, I., Vuvan, D., Lagrois, M.É., & Armony, J.L. (2015). Neural overlap in processing music and speech. Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences, 370(1664). PMID: 25646513

Slevc, L.R., Rosenberg, J.C., & Patel, A.D. (2009). Making psycholinguistics musical: Self-paced reading time evidence for shared processing of linguistic and musical syntax. Psychonomic Bulletin & Review, 16(2), 374-381. PMID: 19293110
— — —

1 Except for one ECoG study.

DISCLAIMER: The views expressed in this blog post are not necessarily shared by Bob Slevc.

Everything you always wanted to know about language but were too afraid to ask

MPI Nijmegen

The MPI in Nijmegen: the origin of answers to your questions.

The Max Planck Institute in Nijmegen has started a great initiative which attempts nothing less than to answer all your questions about language. How does it work?

1) Go to this website: http://www.mpi.nl/q-a/questions-and-answers
2) See whether your question has already been answered
3) If not, scroll to the bottom and ask a question yourself.
The answers are not provided by just anybody but by language researchers themselves. Before they are put on the web, they are checked by another researcher and translated into German, Dutch and English. It’s a huge enterprise, to be sure.
As an employee of the Max Planck Institute I’ve had my own go at answering a few questions:
How does manipulating through language work?
Is it true that people who are good in music can learn a language sooner?
How do gender articles affect cognition?
What do you think of my answers? What questions would you like to see answered?

 

——————————————————————————————————————–

Thibodeau, P., & Boroditsky, L. (2011). Metaphors we think with: The role of metaphor in reasoning. PLoS ONE, 6(2). DOI: 10.1371/journal.pone.0016782

Asaridou, S., & McQueen, J. (2013). Speech and music shape the listening brain: Evidence for shared domain-general mechanisms. Frontiers in Psychology, 4. DOI: 10.3389/fpsyg.2013.00321

Segel, E., & Boroditsky, L. (2011). Grammar in art. Frontiers in Psychology, 1. DOI: 10.3389/fpsyg.2010.00244

 


Are some languages easier than others?

‘Long time no see’ is something I heard repeatedly in Britain, even though it totally violates all the English grammar I learned at school. By rights, Brits should have corrected this expression, which originates from Chinese Pidgin English, rather than adopted it. The reason it entered common usage anyway is at the heart of why you might find English a lot easier to learn than other British languages like Welsh or Gaelic. In a nutshell: when you learn English, it learns something from you as well.

Three years ago Gary Lupyan and Rick Dale published a (freely available) paper in which they looked at over 2,000 languages across the globe and quantified how difficult they are, e.g. by looking at their morphological complexity. Morphological complexity refers to how difficult it is to say a word in its correct form (‘went’ rather than ‘go-ed’). Its simpler counterpart is usually the use of more words to say the same thing (compare the sometimes irregular past like ‘gone’ with the always regular future ‘will go’). Using these principles, Lupyan and Dale were able to show that languages which are spoken by more people tend to be simpler. Why?
When languages grow big, they tend to get simple.
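To make the logic behind this result concrete, here is a minimal Python sketch of the kind of correlation at stake. All numbers, including the complexity scores, are invented for illustration; the actual paper analysed over 2,000 languages with far richer measures and statistical controls.

```python
# Toy sketch of a Lupyan & Dale-style analysis: does morphological
# complexity fall as the speaker population grows? All values below
# are invented for illustration.
import math
from scipy.stats import spearmanr

# (language, speaker population, morphological complexity score 0-1)
languages = [
    ("Mandarin", 900_000_000, 0.10),
    ("English",  400_000_000, 0.20),
    ("German",    95_000_000, 0.50),
    ("Welsh",        600_000, 0.70),
    ("Gaelic",        60_000, 0.80),
]

log_pop    = [math.log10(pop) for _, pop, _ in languages]
complexity = [score for _, _, score in languages]

rho, p = spearmanr(log_pop, complexity)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")  # negative rho: bigger community, simpler morphology
```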
Lupyan and Dale hypothesise that languages with more speakers also include more people who learned them when they were no longer children. As an adult, when you are not that good at learning a language anymore, you make yourself understood without speaking perfectly. Over time, these mistakes and simplifications are adopted by the language, simply because difficult things never get learned by a new generation of learners. They are just forgotten. In some sense, the language learns what it can expect from its learners and what not. This drive towards simplification is a lot weaker when only expert language learners, i.e. children, are responsible for language transmission.
This year, a new study was published which directly measured the proportion of adult second-language learners in a given community, rather than just assuming it from the community size as Lupyan and Dale did. Christian Bentz and Bodo Winter looked at case marking, which is another pain to learn. In many languages around the world, the Who-does-What-to-Whom pattern is not expressed through word order, as in English, but through case marking on words (similar to the difference in roles marked by ‘he – him – his’). It turns out that, on average, languages which managed to retain a case system have only 16% of their speakers learn it after childhood, while the comparable number for no-case languages is 44%. Adults are bad at learning grammatical case systems, so case gets forgotten if many adult learners speak the language.
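The core comparison can be sketched in a few lines of Python. The per-language percentages below are invented; only the 16% and 44% averages come from the paper.

```python
# Sketch of the Bentz & Winter comparison: share of adult (L2) learners
# in languages that kept a case system vs. languages without case.
# Per-language values are invented; the reported averages were roughly
# 16% (case) vs. 44% (no case).
from statistics import mean
from scipy.stats import mannwhitneyu

l2_share_case    = [0.10, 0.15, 0.20, 0.18, 0.12]  # hypothetical case languages
l2_share_no_case = [0.40, 0.50, 0.45, 0.38, 0.47]  # hypothetical no-case languages

print(f"case: {mean(l2_share_case):.0%}, no case: {mean(l2_share_no_case):.0%}")
stat, p = mannwhitneyu(l2_share_case, l2_share_no_case)
print(f"Mann-Whitney U = {stat}, p = {p:.3f}")
```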

His forebears shaped English. As does he.
So, yes, some languages are indeed easier; learning them is a lot simpler. The reason is that language is not the invention of a single person. Instead, it is a communication tool shaped by the people using it. When Chinese people started using English they made many mistakes, some of which got adopted, like ‘Long time no see’. Notice how it uses very little morphology, i.e. the words are all like you would find them in a dictionary, and no case at all (by that time English no longer had a full case system).
Follow the path of other adult language learners and you will meet with less resistance.
———————————————–

Bentz, C., & Winter, B. (2013). Languages with more second language learners tend to lose case. Language Dynamics and Change, in press.

Lupyan, G., & Dale, R. (2010). Language structure is partly determined by social structure. PLoS ONE, 5(1). PMID: 20098492

———————————————————

Figures:
1) adapted from Lupyan & Dale, 2010, p. 7
2) By Eneas De Troya from Mexico City, México (Melting Pot, uploaded by russavia) [CC-BY-2.0 (http://creativecommons.org/licenses/by/2.0)], via Wikimedia Commons


Is it safe to talk while driving? – Partly depends on what you talk about.

World Health Organization reports about road safety are mind-boggling: about 1.2 million people die on the world’s roads every year. For people of my age (15- to 29-year-olds) it is the leading cause of death.

A rather recent addition to laws designed to reduce these numbers was the adoption of compulsory hands-free devices for mobile phones. Their safety value is easy to understand. When you look at a mobile phone display you cannot simultaneously look at the road. Similarly, using your hands for typing and using them for steering are at least partly incompatible actions.
How mobile phone use impairs sight and hands.

From a psychological point of view the current law tries to ensure that visual input channels (eyes) and motor output channels (hands) remain undisturbed. But what about the brain areas which control these channels?
This is the question recently investigated by Bergen from UC San Diego and colleagues. They put undergraduates in a driving simulator giving the impression of a motorway with steady traffic and a car in front of the driver braking from time to time. Simultaneously, the driver had to judge simple true/false statements from the motor domain (e.g., “To open a jar, you turn the lid counterclockwise.”), the visual domain (e.g., “The letters on a stop sign are white.”), or the abstract domain (e.g., “The capital of North Dakota is Bismarck.”). As a baseline condition, people were just asked to say “true” or “false” several times.
Why choose such questions? There is both behavioural and brain-imaging evidence that language comprehension involves the simulation of what was said. This set of findings is often summarised as embodied cognition and its take-home message is something like this: in order to understand it, you mentally do it. For example, to answer a motor question, you use your brain areas doing motor control and make them simulate what it would be like to open a jar. Based on the outcome of this simulation you answer the question.
So, will visual or motor questions affect driving differently than abstract questions because the former engage the same brain areas as those needed for driving while the latter don’t? The alternative would be that asking anything distracts because general attention gets pulled away from driving.
The results go both ways. First, one measure was affected by the true/false statements but not by their kind: quickly braking when the car in front brakes. The time it took to do so was longer if any sort of question was asked, compared to baseline. This suggests that domain-general mechanisms, e.g. attention, were interfered with through language.
Was she a safe driver? That may depend on whether she talked and, if so, about what.

Second, one measure was affected by what kind of statements had to be judged: generally holding a safe distance to other cars. This distance was greater if visual questions were asked, compared to abstract questions and compared to baseline. A similar, albeit less clear, pattern emerged for motor questions. It looks as if participants were so distracted by these kinds of questions that they fell behind their optimal driving distance. This suggests that a task like keeping a safe driving distance, which requires visual working memory (compare ideal distance to actual distance) and corrective motor responses (bring ideal and actual distances closer together), is influenced by language comprehension through mental simulation.
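For illustration, here is a hedged sketch of how the first result could be tested: do statements of any kind slow braking relative to baseline? All reaction times below are invented.

```python
# Sketch of the brake reaction time comparison in Bergen et al. (2012).
# Values are invented; the design compared motor, visual and abstract
# statements against a say-"true"/"false" baseline.
from statistics import mean
from scipy.stats import ttest_rel

# hypothetical per-participant mean brake reaction times (seconds)
rt_baseline  = [0.80, 0.85, 0.78, 0.90, 0.82]
rt_statement = [0.95, 1.00, 0.93, 1.05, 0.97]  # any statement type slows braking

t, p = ttest_rel(rt_statement, rt_baseline)
print(f"braking slowed by {mean(rt_statement) - mean(rt_baseline):.2f} s "
      f"(t = {t:.2f}, p = {p:.3f})")
```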
On the one hand, the scientific implications are quite straightforward. Bergen and colleagues’ results suggest that the low-level perception and action control areas which are needed for quick reactions are not what embodied cognition is about. Instead, it seems like embodied cognition happens in higher perceptual and motor planning areas. Furthermore, the whole embodied cognition idea gets quite a boost from a conceptual replication under relatively realistic conditions.
On the other hand, the practical implications are somewhat controversial. Because talking in general impairs quick reactions by the driver, even hands-free devices pose a risk. This danger is compounded by talking about abstract topics since the driving distance is reduced compared to visual topics.
The authors refrain from saying that any sort of conversation should be prohibited. Passengers share perceptual experiences with the driver and can adjust their conversations to the dangerousness of the situation. Mobile phone contacts can’t do this. But what if you want to be really, really safe? Well, cut your own risk of dying and take public transport. There you can chat and cut your death risk by 90% (bus) or even 95% (train or flight) compared to car travel (EU numbers).
A safe way to travel.

———————————————-

Bergen, B., Medeiros-Ward, N., Wheeler, K., Drews, F., & Strayer, D. (2012). The crosstalk hypothesis: Why language interferes with driving. Journal of Experimental Psychology: General. PMID: 22612769

———————————————

images:

1) By Ed Brown as Edbrown05 (Own work) [CC-BY-SA-2.5 (www.creativecommons.org/licenses/by-sa/2.5)], via Wikimedia Commons

2) By Alan Light (Flickr) [CC-BY-2.0 (http://creativecommons.org/licenses/by/2.0)], via Wikimedia Commons

3) By original author User:Kameragrl at Wikitravel Shared, transferred to Commons by User:Oxyman (http://wikitravel.org/shared/Image:London_Bus.jpg) [CC-BY-SA-1.0 (http://creativecommons.org/licenses/by-sa/1.0)], via Wikimedia Commons


Gendered Language, Gendered Mind

What is so female about ships to call them she (LINK)? What is so neuter about children to call them it (LINK)? Now imagine that entire languages – like German, Spanish and French – are full of these arbitrary gender assignments, not allowing any genderless nouns. This has a profound effect on the way the mind works. A couple of articles published last year on the grammatical gender of nouns in different languages nicely illustrate this point.
To native speakers of gendered languages – i.e. languages whose nouns are all masculine, feminine or perhaps neuter – their language’s gender system usually appears obvious. I vividly remember sitting in a philosophy class in France while the teacher elaborated on the feminine gender of life (la vie). According to her, life could only ever be feminine, for some forgotten reason. When a classmate pointed out that life is neuter in German and that, therefore, her reasoning was flawed, she turned to me as a native German speaker. I could only agree with the comment and see her theory fall apart in real time (by the way, life can even be masculine, as for example in Bulgarian or Hebrew). This is the first experience I can remember of a native speaker applying the mostly arbitrary grammatical gender system beyond the domain of language.
Recent research has found more examples of grammatical gender influencing how language users think about completely asexual things. In a very small experiment, an Israeli friend of mine (Rony Halevy) asked Hebrew speakers to dress up cutlery and other objects and found more feminine dresses on grammatically female items and vice versa for male items (see picture). Dutch controls, whose language does not distinguish between male and female grammatical gender, did not show a similar effect. Still, one may argue that the reference to gender was in the task already. Similarly, language-based tasks in this field could be said to only reveal an effect of grammatical gender on other linguistic processes. So, can language really influence the mind in general?
Hebrew is a gendered language, and participants tended to dress up simple objects such as a spoon or a fork according to their grammatical gender. The Dutch gender system does not refer to male and female, and Dutch participants did not show the same effect. Data based on a student project by Rony Halevy.
Cubelli et al. (2011) used a categorisation task in which participants had to quickly judge whether two pictures showed objects belonging to the same category or not. Judgements were faster if the objects’ grammatical gender matched. The authors interpreted this as showing that people access the words related to the pictures even when this is not required for the task.
Even outside the laboratory the effect can be shown. Sampling images from a big online art database, Segel and Boroditsky (2011) looked at all the gendered depictions of naturally asexual entities like love, justice, or time. Depicted gender agreed with grammatical gender in 78% of the cases. The effect was replicable for Italian, French and German. On top of that, it even held when only looking at those entities whose grammatical genders conflict across the studied languages.
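To see why 78% is well above chance, a simple binomial test against 50% makes the reasoning explicit. Here is a hedged sketch in Python; the total image count is invented for illustration, as the paper’s exact counts are not quoted here.

```python
# Is a 78% match between depicted and grammatical gender better than the
# 50% expected by chance? The image count below is invented.
from scipy.stats import binomtest

n_images  = 200
n_matches = round(0.78 * n_images)  # 78% agreement, as reported

result = binomtest(n_matches, n_images, p=0.5, alternative="greater")
print(f"{n_matches}/{n_images} matches, p = {result.pvalue:.2g}")
```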
It is worth reiterating that the aforementioned behaviours were completely non-linguistic. The grammatical gender system is just a set of rules for how words change when combined. The fact that people draw on these purely linguistic rules to perform unrelated tasks shows quite powerfully what a central role language plays in our minds.
But the effect may go further than that. In English, natural gender must be included in personal pronouns (he/she). Admittedly, there are exceptions (child – it) but they are rare. In Chinese, there is no such requirement. Personal pronouns can mark gender (written forms of ta) or not (spoken ta). Chen and Su (2011, Experiment 2) presented English or Chinese participants with written English or Chinese sentences which included gendered personal pronouns. Participants were asked to match each sentence to one of two pictures, each showing a person of a different gender. English speaking participants were faster and more accurate than Chinese speakers on these judgements. It’s as if English speakers are better trained in thinking about natural gender because English makes such thinking compulsory. Chinese participants, on the other hand, can produce pronouns without thinking of natural gender and, thus, have this information less readily available for their judgements.
One may argue that the effect relies on people of different native tongues showing different behaviours. These people probably differ in many ways other than their native language. Wider cultural differences could be invoked. Still, given that the effect holds for German, French, Italian, Spanish and Chinese, the most straightforward explanation indeed appears to be their language background. A way of overcoming the confounding influence of cultural upbringing may be to contrast second language learners of the same native language who learn different second languages.
Despite these shortcomings, the influence of the gender status of a language on the minds of its users is clearly measurable. This illustrates quite nicely that thought is influenced by what you must say – rather than by what you can say. It highlights that language is not an isolated skill but a central part of how our minds function. Studying language use is important – not just for the sake of language.
Chen, J.-Y., & Su, J.-J. (2011). Differential sensitivity to the gender of a person by English and Chinese speakers. Journal of Psycholinguistic Research, 40, 195–203. DOI: 10.1007/s10936-010-9164-9
Cubelli, R., Paolieri, D., Lotto, L., & Job, R. (2011). The effect of grammatical gender on object categorization. Journal of Experimental Psychology: Learning, Memory, and Cognition, 37, 449–460. DOI: 10.1037/a0021965
Segel, E., & Boroditsky, L. (2011). Grammar in art. Frontiers in Psychology, 1, 1. DOI: 10.3389/fpsyg.2010.00244