Is ADHD different around the globe? The role of research cultures

An illness is an illness wherever you are. Perhaps this is true for organic diseases, but cultural background can play a tremendous role in the progression and even the diagnosis of mental disorders (see, e.g., David Dobbs's recent post at Wired). What has been neglected, however, is an appreciation of how culture affects the research underlying the diagnosis and treatment of psychological disorders. As a consequence, our view of a disorder can change.

Attention deficit hyperactivity disorder (ADHD) shows how strongly culture can shape psychiatric research. In a 2007 meta-analysis, Polanczyk and colleagues found that prevalence rates differ markedly between geographic regions, but not in the way you might expect. Contrary to the myth of ADHD as an American social construct, European and North American ADHD rates were not significantly different. Both, however, differed significantly from the prevalence rates in Africa and the Middle East.
[Image: ADHD in school. One case of ADHD – or perhaps two, depending on where we are.]

However, Polanczyk and colleagues state that this is most likely due to different criteria for diagnosis and study inclusion. For example, while the diagnostic system published by the World Health Organisation (the ICD) is quite strict, the one published by the American Psychiatric Association (the DSM) is more liberal. Depending on which one researchers adopt, the same person could end up in the ADHD group of one study and the control group of another.
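To see how that can happen, here is a toy sketch in Python. The thresholds, symptom counts, and rules below are invented caricatures of a strict versus a liberal system, not the actual ICD or DSM decision procedures:

```python
# Toy sketch of how stricter inclusion criteria can move the same child
# between groups. Domain names, symptom counts, and thresholds are
# invented for illustration; this is NOT the real ICD or DSM procedure.

def strict_diagnosis(inattention, hyperactivity):
    """Strict toy rule: marked symptoms required in BOTH domains."""
    return inattention >= 6 and hyperactivity >= 6

def liberal_diagnosis(inattention, hyperactivity):
    """Liberal toy rule: EITHER domain alone is sufficient."""
    return inattention >= 6 or hyperactivity >= 6

child = {"inattention": 7, "hyperactivity": 3}  # hypothetical child

print(strict_diagnosis(**child))   # False -> control group in one study
print(liberal_diagnosis(**child))  # True  -> ADHD group in another
```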
These different inclusion criteria appear to bias international comparisons. The severe restrictions on ADHD diagnosis in Middle Eastern studies can inflate the apparent severity and social problems, because only the most severe cases meet the threshold. Don't be surprised, then, if you read that Middle Eastern children with ADHD fare worse in life than their American counterparts.
Beyond different inclusion criteria, the focus of studies differs by geographic region. In a recent review, Hodgkins et al. (2011) showed that about half of North American and European ADHD studies sampled adults. East Asian researchers, on the other hand, were mainly interested in adolescents and sampled adults in only about a third of studies. Will this turn ADHD into a potentially lifelong disease in the Western view, while the Eastern perspective sees it as part of the transition to adulthood? If so, researchers would be partly to blame for the difference.
Finally, the life consequences an ADHD diagnosis entails are researched differently. While East Asian researchers are mainly interested in effects on self-esteem, Europeans focus more on antisocial behaviour. North American researchers, in turn, measure drug abuse and addiction outcomes more often than their European or East Asian counterparts. A single headline-grabbing result could forever associate inattentive kids with drug abuse. Don't expect such a result to emerge in Asia; it is far more likely to be found in the US.
This is not to say that ADHD, its prevalence in different age groups, or its life consequences are entirely determined by research agendas. Evidence is still needed to support any diagnosis or treatment. However, whether anyone ever looked for that evidence depends on culture. Research cultures – i.e., the strategies used to gather scientific evidence – differ across the world. Don't be surprised, then, if evidence-based psychiatry differs as well.

———————————————————————————————————–
Hodgkins, P., Arnold, L.E., Shaw, M., Caci, H., Kahle, J., Woods, A.G., & Young, S. (2011). A systematic review of global publication trends regarding long-term outcomes of ADHD. Frontiers in Psychiatry, 2. PMID: 22279437

Polanczyk, G., de Lima, M.S., Horta, B.L., Biederman, J., & Rohde, L.A. (2007). The worldwide prevalence of ADHD: a systematic review and metaregression analysis. The American Journal of Psychiatry, 164(6), 942-948. PMID: 17541055
————————————————————————————————————
images:

1) By CDC (http://www.cdc.gov/ncbddd/adhd/facts.html) [Public domain], via Wikimedia Commons

—————————————————————————————————————————-

If you were not entirely indifferent to this post, please leave a comment.


Risk vs. Opportunity across the life-span: Risky choices decline with age

Risk taking is somewhat enigmatic. On the one hand, risky choices in everyday life – like drug abuse or drink driving – peak in adolescence. Never again in life is the threat of dying from easily preventable causes as great. On the other hand, this adolescent peak in risky choice is absent in laboratory experiments. Instead, the readiness to take a gamble simply declines with age. How can we explain this paradox? Perhaps we should look to a tribe in the Amazon rainforest for answers.

A group of psychologists from Duke University led by David Paulsen looked at risk taking in the laboratory. Participants had the choice between a guaranteed mediocre reward (say, four coins) and a gamble with a 50/50 chance of a low (e.g., two coins) or a high (e.g., six coins) reward. This is reminiscent of many choices we face in life: do you prefer 'better safe than sorry' or 'high risk, high gain'? As you can see in their figure below, Paulsen and colleagues found adolescents to be greater risk seekers than adults: no matter how risky the gamble, adolescents chose it more often than adults did.
[Figure from Paulsen et al. (2012): risk taking across age groups. 'Better safe than sorry' vs. 'high risk – high gain'.]
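
For the curious, here is a minimal sketch of that choice structure in Python, using the coin values from the example above (the actual task parameters differed). Note that in this example the two options have the same expected value, so preferring the gamble reflects an appetite for risk, not for a larger average payoff:

```python
import random

# Coin values from the post's example; the real task parameters varied.
SURE_REWARD = 4        # guaranteed mediocre reward
GAMBLE = (2, 6)        # 50/50 chance of a low or a high reward

# Equal expected values: any preference for the gamble is a pure
# measure of risk attitude, i.e. an appetite for variance.
assert SURE_REWARD == sum(GAMBLE) / len(GAMBLE)

def choose(prefers_risk):
    """Payoff for one trial, given a (toy) risk preference."""
    return random.choice(GAMBLE) if prefers_risk else SURE_REWARD

print([choose(prefers_risk=True) for _ in range(5)])   # e.g. [2, 6, 6, 2, 6]
print([choose(prefers_risk=False) for _ in range(5)])  # [4, 4, 4, 4, 4]
```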

Paradoxically, children turned out to be even more risk prone than adolescents, and the riskier the gamble, the greater the difference from older participants. Paulsen and colleagues have trouble explaining why risky choices in the laboratory do not show the adolescent peak that so many real-world behaviours show. Could it be that laboratory risk is clearly defined while real-world risk is unknown? Is it peer influence that drives real-world riskiness but is absent in the laboratory? Is there more thrill in real risk taking, while lab experiments are so boring that thrill seeking doesn't come into play?
Perhaps. However, one explanation – which I, personally, found totally obvious – is not even discussed. Risky choices decline with age, true. But the opportunity to make risky choices increases with age. In Western society, both explicit laws and implicit norms deny children the opportunity to take risks. Take alcohol as an example. Many people perceive a party without alcohol as mediocre. With alcohol, however, you take a gamble between doing something very regrettable (read: low reward) and having the time of your life (read: high reward).
[Image: the Amazon rainforest. Where to test an alternative explanation: the real world.]

How does this play out across the lifespan? It is inconceivable to serve beer at children's birthday parties. But the older you are, the more say you have over what is served at your own parties. When you are a young adolescent, this increased opportunity for risk taking meets a still high (but declining) readiness to take risks – and you get wasted.
So, with age, real-life risk taking eventually goes down because the opportunities to take risks stop increasing after a certain age while the readiness to take them keeps declining. The outcome would be a peak in real-life risk taking in adolescence despite a linear decline in risky choices – exactly the observed pattern.
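A back-of-the-envelope model makes the argument concrete. The functional forms below are made up for illustration, but any steadily declining readiness curve combined with an opportunity curve that saturates in late adolescence produces the same qualitative result:

```python
# Made-up functional forms: readiness to take risks declines linearly
# with age, while the opportunity to take risks rises and saturates in
# late adolescence. Their product is what we would observe in real life.

def readiness(age):
    return max(0.0, 1.0 - 0.02 * age)   # monotonic decline

def opportunity(age):
    return min(1.0, age / 18)           # rises, saturates around 18

risk_taking = {age: readiness(age) * opportunity(age) for age in range(5, 41)}
peak_age = max(risk_taking, key=risk_taking.get)
print(peak_age)  # 18: an adolescent peak despite ever-declining readiness
```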
This interaction between risk-taking opportunities and risk-taking readiness is nicely illustrated by an Amazonian tribe Daniel Everett describes in his very readable book Don't Sleep, There Are Snakes. The Pirahã do not have the Western notion of childhood. Everett writes that 'children are just human beings in Pirahã society, as worthy of respect as any fully grown human adult. They are not seen as in need of coddling or special protections.' (p. 89). As a consequence, 'there is no prohibition that applies to children that does not equally apply to adults and vice versa' (p. 97).
What does this mean for child alcohol consumption on the infrequent occasions when alcohol is available to the tribe? This episode gives the answer (p. 98):
Once a trader gave the tribe enough cachaça [alcohol] for everyone to get drunk. And that is what happened. Every man, woman and child in the village got falling-down wasted. Now, it doesn’t take much alcohol for Pirahãs to get drunk. But to see six-year-olds staggering with slurred speech was a novel experience for me.
So, perhaps this solves the paradox. The laboratory results were unrealistic by Western standards because they gave children a choice they usually do not have: sure reward or gamble? Once you look at societies that do give children this choice, the laboratory results line up better with real life.
There is much to be learned by going beyond the laboratory and looking at the real world. The entire real world.

—————————————————————————————————–

Everett, D. (2008). Don't Sleep, There Are Snakes. London: Profile Books.

Paulsen, D.J., Platt, M.L., Huettel, S.A., & Brannon, E.M. (2012). From risk-seeking to risk-averse: the development of economic risk preference from childhood to adulthood. Frontiers in Psychology, 3. PMID: 22973247

—————————————————————————————————–

images:

1) as found in Paulsen et al. (2012)

2) By Jorge.kike.medina (Own work) via Wikimedia Commons


—————————————————————————————————

If you were not entirely indifferent to this post, please leave a comment.