
The real reason why new pop music is so incredibly bad

You have probably heard that Pink Floyd recently released their new album The Endless River. Will it bring back the wonderful world of good music after the endless awfulness of the popular music scene over the last 20 years or so? Is good music, as we know it from the 60s and 70s, back for good? The reasons behind the alleged endless awfulness of pop music these days suggest otherwise. We shouldn’t be throwing stones at new music but at our own inability to like it.

Pink Floyd 1973

When we were young we learned to appreciate Pink Floyd.

At a recent music psychology conference in Toronto, Daniel Levitin was asked why old music is amazing and new music is awful. He suggested that modern record companies are there to make money, whereas in the olden days they were there to make music and were ready to hold on to musicians who needed time to become successful. More interestingly, he reminded the audience that many modern kidz would totally disagree with the implication that modern music is awful. How can young people like new music if so much of it is regarded as quite bad?

Everything changes for the better after a few repetitions

The answer to the mystery has nothing to do with flaws in modern music but instead with our brain. When adults hear new music, they often hate it at first; after repeated listening they tend to find it more and more beautiful. For example, Marcia Johnson and colleagues (1985) played Korean melodies to American participants: a melody heard for the first time received low liking ratings, a melody heard once before received higher ratings, and more exposure still led to even higher ratings. Even Korsakoff patients – who could hardly remember having heard individual melodies before – showed this effect, i.e. without realising it they probably never forget melodies.

This so-called mere exposure effect is all that matters to me: a robust, medium-strong, generally applicable, evolutionarily plausible effect (Bornstein, 1989). You can do what you like, it applies to all sorts of stimuli. There is one interesting exception, however: young people do not show the mere exposure effect – no relationship between ‘repeat the stimulus’ and ‘give good feeling’ (Bornstein, 1989). As a result, adults need a lot more patience before they like a new song as much as young people do. No wonder adults are only satisfied with the songs they already know from their youth in the 60s and 70s. Looking at the music scene in 2050, the current generation will probably hate it just as much and wish the Spice Girls back (notice the gradual rise of ’90s parties already).

I listened to it → I like it

So, when it comes to an allegedly awful present and a great past, ask yourself: how deep is your love for the old music itself, rather than for its repeated listening? Listen repeatedly to any of a million love songs and you will end up appreciating it. Personally, I give new music a chance, and sometimes it manages to relight my fire. Concerning The Endless River: if it’s not love at first sight, do not worry. The new Pink Floyd album sure is good (depending on how many times you listen to it).

— — —
Bornstein, R. (1989). Exposure and affect: Overview and meta-analysis of research, 1968-1987. Psychological Bulletin, 106 (2), 265-289 DOI: 10.1037/0033-2909.106.2.265

Johnson MK, Kim JK, & Risse G (1985). Do alcoholic Korsakoff’s syndrome patients acquire affective reactions? Journal of experimental psychology. Learning, memory, and cognition, 11 (1), 22-36 PMID: 3156951
— — —

Figure: By PinkFloyd1973.jpg: TimDuncan derivative work: Mr. Frank (PinkFloyd1973.jpg) [CC-BY-3.0 (http://creativecommons.org/licenses/by/3.0)], via Wikimedia Commons

— — —

PS: Yes, I did hide 29 Take That song titles in this blog post. Be careful, you might like ’90s pop music a little bit more due to this exposure.


Old people are immune to the cocktail party effect

Imagine standing at a cocktail party when somewhere your name gets mentioned. Your attention is immediately grabbed by the sound of it. This is a classic psychological effect with a new twist: old people are immune.

Someone mention my name?

The so-called cocktail party effect has fascinated researchers for a long time. Even though you are not consciously listening to a conversation nearby, your own name can grab your attention. That means that, unbeknownst to you, you follow the conversations around you: you check them for salient information like your name and, when it occurs, you quickly switch attention to where it was mentioned.

The cocktail party simulated in the lab

In the lab this is investigated slightly differently. Participants attend to one ear, for example by repeating whatever they hear there, while their name is embedded in the speech arriving at the other (unattended) ear. After the experiment one simply asks: ‘Did you hear your own name?’ In a recent paper by Moshe Naveh-Benjamin and colleagues (2014), around half of the young student participants noticed their name in such a set-up. Compare this to old people aged around 70: next to nobody (only six out of 76 participants) noticed their name being mentioned in the unattended ear.

Why this age difference? Do old people simply not hear well? Unlikely: when the name was played to the ear that they attended to, 45% of old people noticed it. Clearly, many old people can hear their names, but they do not notice them when they are not paying attention to that ear. Young people do not show such a sharp distinction: half the time they notice their names even when concentrating on something else.
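For concreteness, here is a small tabulation of the detection rates quoted above – a sketch in Python using only the figures reported in this post, not the study’s raw data (the young participants’ attended-ear rate is not given here):

```python
# Detection rates as quoted in this post (Naveh-Benjamin et al., 2014).
# This is bookkeeping for illustration, not the authors' analysis.

noticed = {
    ("old", "unattended"): 6 / 76,    # six of 76 older adults
    ("old", "attended"):   0.45,      # 45% when the name was in the attended ear
    ("young", "unattended"): 0.50,    # "around half" of the young participants
}

for (age, ear), p in noticed.items():
    print(f"{age:>5} / {ear:<10}: {p:.0%} noticed their name")

# The age difference is specific to the unattended ear:
gap = noticed[("old", "attended")] - noticed[("old", "unattended")]
print(f"older adults' attended-minus-unattended gap: {gap:.0%}")
```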

Focusing the little attention that is available

Naveh-Benjamin and colleagues suggest instead that old people simply have less attention at their disposal. When they focus on a conversation, they give it everything they have. Nothing is left for the kind of unconscious monitoring of surrounding conversations which young people do so well.

At the next cocktail party you can safely gossip about your old boss. Just avoid mentioning the name of the young new colleague who just started.

— — —

Naveh-Benjamin M, Kilb A, Maddox GB, Thomas J, Fine HC, Chen T, & Cowan N (2014). Older adults do not notice their names: A new twist to a classic attention task. Journal of experimental psychology. Learning, memory, and cognition PMID: 24820668

— — —

Picture:

By Financial Times (Patrón cocktail bar) [CC-BY-2.0 (http://creativecommons.org/licenses/by/2.0)], via Wikimedia Commons


How to increase children’s patience in 5 seconds

A single act increases adults’ compliance with researchers. The same act makes students more likely to volunteer to solve math problems in front of others. Moreover, it makes four-year-olds more patient. What sounds like a miracle cure to everyday problems is actually the oldest trick in the book: human touch.

How do researchers know this? Here is one experiment. In a recently published study (Leonard et al., 2014), four- and five-year-old children were asked to wait for ten minutes in front of candy. The experimenter told them to wait before eating the candy because there was paperwork to finish first. How long would children wait before calling the experimenter back because they wanted to eat the candy earlier? Four-year-olds waited for about six minutes, five-year-olds for about eight minutes. The task was similar to the classic marshmallow test.

The positive effect of touch

However, it all depended on whether the experimenter gave children a friendly touch on the back while asking them to wait. If she did, four-year-olds waited for seven minutes (versus five minutes without touch) and five-year-olds waited for nine minutes (versus seven minutes without touch). A simple, five-second-long touch made four-year-olds behave as patiently as five-year-olds. It is surprising how simple and fast this intervention is.

Touch across the ages

This result fits nicely into a wider literature on the benefits of a friendly touch. Back in the eighties, Patterson and colleagues (1986) found that adults spent more time helping with the tedious task of scoring personality tests if they were touched by the experimenter. Interestingly, the touch on the shoulder was hardly ever reported as noteworthy. In the early noughties, Guéguen picked this effect up and moved it to the real world. He showed that touch also increases adults’ willingness to help by looking after a large dog (Guéguen & Fischer-Lokou, 2002) as well as students’ willingness to volunteer to solve a math problem in front of a class (Guéguen, 2004).

The reason underlying these effects remains a bit mysterious. Does the touch on the back reduce the anxiety of being faced with a new, possibly difficult, task? Does it increase the rapport between experimenter and experimental participant? Does it make time fly by because being touched feels good? Well, time will tell.

Touch your child?

There are obvious sexual connotations related to touching people, and unfortunately this includes touching children. As a result, some schools in the UK have adopted a ‘no touch’ policy: teachers are never allowed to touch children. The research above suggests that such an approach comes at a cost: children behave less patiently when they are not touched. Should society deny itself the benefits of people innocently touching each other?

————————————————————————————————————————–

Guéguen N, & Fischer-Lokou J (2002). An evaluation of touch on a large request: a field setting. Psychological reports, 90 (1), 267-9 PMID: 11898995

Guéguen, N. (2004). Nonverbal Encouragement of Participation in a Course: the Effect of Touching. Social Psychology of Education, 7 (1), 89-98 DOI: 10.1023/B:SPOE.0000010691.30834.14

Leonard JA, Berkowitz T, & Shusterman A (2014). The effect of friendly touch on delay-of-gratification in preschool children. Quarterly journal of experimental psychology (2006), 1-11 PMID: 24666195

Patterson, M., Powell, J., & Lenihan, M. (1986). Touch, compliance, and interpersonal affect. Journal of Nonverbal Behavior, 10 (1), 41-50 DOI: 10.1007/BF00987204


Delaying dementia without pills

‘What’s this? A potato?’ asked my friend’s grandfather during lunch. As always, he wore his charming grin and spoke in his characteristically loud voice. Even though the entire conversation was in Argentine Spanish – which I had learned only a short while before – I understood the oddity of the situation at once. Instead of a potato, the grandfather was holding a kiwifruit in his hands.
After only a short time of living with this family I noticed that the grandfather no longer had the mental abilities he must once have possessed in order to lead a successful business and raise an adorable family. He was undiagnosed, but his behaviour reminded me of Mild Cognitive Impairment, which can progress to a more severe, general cognitive impairment – Alzheimer’s disease or, more broadly, dementia – which usually cannot be cured. ‘What can be done?’ my friend’s grandmother asked me, afraid of slowly losing the husband with whom she had shared most of her life. In broken Spanish I tried to explain what I would do: build up a cognitive reserve. This concept – related to the beneficial effects of, for example, high education or mentally demanding spare-time activities – is perhaps the most promising strategy for delaying dementia.
A large-scale analysis illustrates what a cognitive reserve can achieve. First of all, it can delay dementia. An Australian research team (Valenzuela & Sachdev, 2006) collected studies which recruited old people when they were still perfectly healthy and tested them again after a few years to find out by how much their cognitive abilities had declined. The trend across more than 47,000 people was for higher education and more demanding leisure activities to slow down the creeping loss of mental abilities that leads to dementia.

A German nun without experimental confounds.

The savvy reader may already have noticed a problem with this theory: high education is associated with a generally healthier lifestyle. Rather than cognitive reserve, we should perhaps simply be talking about healthy versus unhealthy lifestyles. A Bavarian study ruled this problem out (Bickel & Kurz, 2009). The authors gained access to the education and dementia records of older female members of a religious order who lived as similarly as one can imagine. The 442 participants had shared a roof for more than five decades, eaten their meals together, and had the same access to medical care. None smoked. None had any personal items. And still, 39% of the sisters with low education suffered from dementia, compared to only 14% of the remaining group. Whatever effect lifestyle may have, the benefits of a cognitive reserve cannot be reduced to it: education delays dementia all by itself.
This beneficial effect of a cognitive reserve led me to give my advice. However, this strategy cannot stave off dementia forever, nor slow it down once it kicks in. Nikolaos Scarmeas and colleagues from Columbia University (2006) found that more highly educated New Yorkers above 65 lose their memory faster around the time of an Alzheimer’s disease diagnosis than less educated city dwellers do. Apparently, the benefits of a high education are absent around the time of diagnosis.
This raises the obvious question of whether my advice came too late. Once on the road to Alzheimer’s there may be no turning back, and efforts to delay the inevitable could make things worse. Given what we know about how the cognitive reserve actually works, I do not believe this is true. First of all, a cognitive reserve is no cure for dementia but merely a way of delaying it. The theory goes that brain pathology progresses whether you have a cognitive reserve or not. What a high education level and demanding leisure activities actually do is postpone the usual consequences of brain pathology – e.g., easily noticeable memory problems of the kind I described above. This is supported by studies comparing the brains of people with equal mental function in old age: those with higher education have more deposits of amyloid – a peptide associated with Alzheimer’s disease – as if they were able to cope better with their declining brain function (Kemppainen et al., 2008; Rentz et al., 2010). At some point, though, the progressing brain pathology catches up and the resulting cognitive decline is faster.
Charles Hall and colleagues (2007; 2009) tested this overall model in the real world. Their analyses of the memory test scores of over 100 Bronx residents, followed over the years, show the predicted trend. At first, a high cognitive reserve – whether education or leisure activities – delays the point in time when mental abilities suddenly start to decline rapidly: each year of education delays this moment by two and a half months, each day of mentally stimulating leisure activities by two months. Once this moment is reached, though, the decline is faster with a higher cognitive reserve – as if the aforementioned brain pathology catches up. A cognitive reserve helps you delay dementia, not escape it.

The higher the education the shallower the decline before a break point, the later that break point, and the steeper the decline thereafter.
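To make the shape of this trajectory concrete, here is a minimal sketch in Python. It is not the authors’ statistical model – Hall and colleagues fitted change-point models to longitudinal test scores – and everything except the two delay figures quoted above (the base age, the slopes, the score scale) is an invented assumption for illustration:

```python
# Illustrative change-point trajectory of memory decline.
# Only the two delay figures come from the paragraph above (Hall et al.,
# 2007; 2009); base_age, both slopes, and the score scale are invented
# assumptions, not values from the papers.

def breakpoint_age(base_age=80.0, years_education=0.0, activity_days=0.0):
    """Age at which rapid memory decline sets in.

    Each year of education delays the break point by ~2.5 months,
    each day of stimulating leisure activity by ~2 months.
    """
    return base_age + (2.5 * years_education + 2.0 * activity_days) / 12.0

def memory_score(age, break_point, pre_slope=-0.2, post_slope=-1.5,
                 score_at_break=50.0):
    """Shallow linear decline before the break point, steep decline after."""
    slope = pre_slope if age <= break_point else post_slope
    return score_at_break + slope * (age - break_point)

no_reserve = breakpoint_age()
reserve = breakpoint_age(years_education=12, activity_days=3)
print(f"break point: {no_reserve:.1f} (no reserve) vs {reserve:.1f} (reserve)")
for age in range(70, 91, 5):
    print(age,
          round(memory_score(age, no_reserve), 1),
          round(memory_score(age, reserve), 1))
```

Note one nuance from the figure that this sketch leaves out: in the data, higher education also made the decline before the break point shallower and the decline after it steeper.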

My friend’s grandfather had long been out of education, but the second source of a cognitive reserve – mentally demanding leisure activities – was not beyond him. What sort of activities work? A French research team led by Tasnime Akbaraly (2009) took a closer look and found that only a certain kind of leisure activity delays dementia onset. Watching television and other passive behaviours won’t do. Neither do physical activities like going for a walk, nor social ones like having friends or family over. The crucial set of activities are the mentally demanding ones: doing crosswords, playing cards, attending organisations, going to the cinema or theatre, practising an artistic activity, and so on.
It is a mystery to me why this knowledge is not more widely spread. Dementia is one of the central challenges facing an ageing population as a whole and many old couples individually. Research shows that one does not need to be a passive spectator of mental decline: if a cognitive reserve has been built up, one can enjoy more years without showing signs of an incurable disease. That is what I tried to say, in broken Spanish, to my friend’s grandmother: make him use his mind.

————————————————————————————————————————-

Akbaraly, T., Portet, F., Fustinoni, S., Dartigues, J., Artero, S., Rouaud, O., Touchon, J., Ritchie, K., & Berr, C. (2009). Leisure activities and the risk of dementia in the elderly: Results from the Three-City Study. Neurology, 73 (11), 854-861 DOI: 10.1212/WNL.0b013e3181b7849b

Bickel H, & Kurz A (2009). Education, occupation, and dementia: the Bavarian school sisters study. Dementia and geriatric cognitive disorders, 27 (6), 548-56 PMID: 19590201

Hall CB, Derby C, LeValley A, Katz MJ, Verghese J, & Lipton RB (2007). Education delays accelerated decline on a memory test in persons who develop dementia. Neurology, 69 (17), 1657-64 PMID: 17954781

Hall CB, Lipton RB, Sliwinski M, Katz MJ, Derby CA, & Verghese J (2009). Cognitive activities delay onset of memory decline in persons who develop dementia. Neurology, 73 (5), 356-61 PMID: 19652139

Kemppainen NM, Aalto S, Karrasch M, Någren K, Savisto N, Oikonen V, Viitanen M, Parkkola R, & Rinne JO (2008). Cognitive reserve hypothesis: Pittsburgh Compound B and fluorodeoxyglucose positron emission tomography in relation to education in mild Alzheimer’s disease. Annals of neurology, 63 (1), 112-8 PMID: 18023012

Rentz DM, Locascio JJ, Becker JA, Moran EK, Eng E, Buckner RL, Sperling RA, & Johnson KA (2010). Cognition, reserve, and amyloid deposition in normal aging. Annals of neurology, 67 (3), 353-64 PMID: 20373347

Scarmeas, N., Albert, S.M., Manly, J.J., & Stern, Y. (2006). Education and rates of cognitive decline in incident Alzheimer’s disease. Journal of Neurology, Neurosurgery & Psychiatry, 77 (3), 308-316 DOI: 10.1136/jnnp.2005.072306

Valenzuela MJ, & Sachdev P (2006). Brain reserve and cognitive decline: a non-parametric systematic review. Psychological medicine, 36 (8), 1065-73 PMID: 16650343

————————————————————————————————————————-

Figures:

1) By André Karwath aka Aka (Own work) [CC-BY-SA-2.5 (http://creativecommons.org/licenses/by-sa/2.5)], via Wikimedia Commons

2) By Doris Ulmann, 1882–1934. [Public domain], via Wikimedia Commons

3) Hall et al., 2007, p. 1661

————————————————————————————————————————-

If you were not entirely indifferent to this post, please consider leaving a comment.

Is ADHD different around the globe? The role of research cultures

An illness is an illness wherever you are. Perhaps this is true for organic diseases, but for mental disorders the cultural background can play a tremendous role in progression and even diagnosis (see, e.g., David Dobbs’s recent post at Wired). What has been neglected, however, is an appreciation of how culture affects the research underlying the diagnosis and treatment of psychological disorders. As a consequence, our view of a disorder can change with the culture doing the research.

Attention deficit hyperactivity disorder (ADHD) shows how much culture can affect psychiatric research. In a 2007 meta-analysis by Polanczyk and colleagues, prevalence rates were found to differ markedly between geographic regions, but not in the way you might expect. Contrary to the myth of ADHD as an American social construct, European and North American ADHD rates were not significantly different. Both, however, differed significantly from the prevalence rates in Africa and the Middle East.

One case of ADHD. Or perhaps two. Depends where we are.

However, Polanczyk and colleagues state that this is most likely due to different criteria for diagnosis and study inclusion. For example, while the diagnostic system published by the World Health Organisation is quite strict, the one published by the American Psychiatric Association is more liberal. Depending on which one the researchers adopt, the same person could be part of the ADHD group in one study and the control group in another one.
These different inclusion criteria can bias international comparisons. Because only the most severe cases pass the stricter diagnostic thresholds used in Middle Eastern studies, the children included there will look more impaired on average, inflating the apparent ADHD severity and social problems. Don’t be surprised, then, if you read that Middle Eastern ADHD kids fare worse in life than their American counterparts.
Beyond different inclusion criteria, the focus of studies can differ by geographic region. In a recent review, Hodgkins et al. (2012) showed that about half of North American and European ADHD studies sampled adults. East Asian researchers, on the other hand, were mainly interested in adolescents and sampled adults in only about a third of studies. Will this turn ADHD into a potentially life-long disorder in the Western view, while the Eastern perspective sees it as part of the transition to adulthood? If so, researchers would be partly to blame for this difference.
Finally, the life consequences of an ADHD diagnosis are researched differently. While East Asians are mainly interested in effects on self-esteem, Europeans focus more on antisocial behaviour. North American researchers, in turn, measure drug abuse and addiction outcomes more than their European or East Asian counterparts. A single headline-grabbing result could forever associate inattentive kids with drug abuse. Don’t expect such a result to emerge in Asia; it is far more likely to be found in the US.
This is not to say that ADHD, its prevalence in different age groups, or its life consequences are entirely determined by research agendas. Evidence is still needed to support diagnosis or treatment. However, whether anyone ever looked for this evidence depends on culture. Across the world, research cultures – i.e. strategies for acquiring scientific evidence – differ. Don’t be surprised, then, if evidence-based psychiatry differs as well.

———————————————————————————————————–
Hodgkins P, Arnold LE, Shaw M, Caci H, Kahle J, Woods AG, & Young S (2012). A systematic review of global publication trends regarding long-term outcomes of ADHD. Frontiers in Psychiatry, 2 PMID: 22279437

Polanczyk G, de Lima MS, Horta BL, Biederman J, & Rohde LA (2007). The worldwide prevalence of ADHD: a systematic review and metaregression analysis. The American journal of psychiatry, 164 (6), 942-8 PMID: 17541055
————————————————————————————————————
Images:

1) By CDC (http://www.cdc.gov/ncbddd/adhd/facts.html) [Public domain], via Wikimedia Commons

—————————————————————————————————————————-

If you were not entirely indifferent to this post, please leave a comment.

Mimicking infants rather than adults – how infants choose their models

The Infant Academy by Joshua Reynolds

Parents are often afraid of what happens once their children hit puberty and stop emulating them. Recent research suggests that this fear should start a lot earlier: in infancy. Of course, infants need their parents to learn, but they prefer other infants when it comes to imitating things they already know.

Two recent articles by Zmyj, from the Ruhr University in Bochum, and colleagues present convincing evidence for infants’ occasional preference for peer imitation. First, when presented with videos of people playing with novel toys in familiar ways, fourteen-month-olds imitated a peer more than an older child aged 3.5 years or an adult. Second, when presented with similar videos of people performing simple gestures (banging on the table, waving, clapping…), they again imitated a 14-month-old more often than an older child or an adult.
These results are curious because, at this age, infants typically spend more time with their parents than with other infants. Furthermore, insofar as imitation is used to learn new things, infants should prefer adults, who are more knowledgeable. For novel actions the learning objective does indeed prevail: switching on a new lamp with one’s head or building a rattle is more likely to be copied from an adult model than from an infant model (Seehagen & Herbert, 2011; Zmyj, Daum et al., 2012).
Infant-infant imitation, by contrast, may come out of a desire to belong to the same social group as the model – a sort of precursor to Facebook’s Like button. Infant-adult imitation, on the other hand, may be more like a student-teacher relationship.
This set of studies powerfully shows that age matters to infants: they copy the behaviour of others depending on how old the model is and what sort of behaviour is shown. This sort of reasoning was long thought to be beyond 1½-year-olds. Recent evidence, however, shows that infants play a more active part in choosing whom to emulate than you may think.
————————————————————–

Seehagen, S., & Herbert, J.S. (2011). Infant Imitation From Televised Peer and Adult Models. Infancy, 16 (2), 113-136 DOI: 10.1111/j.1532-7078.2010.00045.x

Zmyj, N., Aschersleben, G., Prinz, W., & Daum, M. (2012). The Peer Model Advantage in Infants’ Imitation of Familiar Gestures Performed by Differently Aged Models. Frontiers in psychology, 3 PMID: 22833732

Zmyj, N., Daum, M.M., Prinz, W., Nielsen, M., & Aschersleben, G. (2012). Fourteen-month-olds’ imitation of differently aged models. Infant and Child Development, 21 (3), 250-266 DOI: 10.1002/icd.750

 ————————————————————-
Image: By Joshua Reynolds (jundurrahman.files.wordpress.com) [Public domain], via Wikimedia Commons
If you liked this post you may also like:
Infants choose their teachers
