
Kaleidoscope

Published online by Cambridge University Press: 01 August 2023


Copyright
Copyright © The Author(s), 2023. Published by Cambridge University Press on behalf of the Royal College of Psychiatrists

It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness. Perhaps Dickens was right about the dawning of the French revolution, but surely we're now in an age of social-media-inspired incivility and broader, spiralling global decline? Fascists launch war against their neighbours in Europe, COVID-rule-breaking politicians award honours to their hairdressers, and stealers of state secrets seek re-election. Well, in a magnificent paper in Nature, Mastroianni and Gilbert say ‘twas ever thus, and we all share an illusion of moral decline.1 Taking data from over 12 million individuals across studies encompassing 60 countries and a 70-year time span, they show that people have consistently believed that morality is deteriorating over time. People tested now believed that folk were kinder, more honest, nicer and better a decade ago, and at the time when they themselves were aged 20. There has been a shared perception both of the decreasing morality of other people as they age and of the moral decline of those pesky young people who follow behind us. In other words, society itself is going down the drain, and individuals are also getting worse as they age. Interestingly, people tend not to believe that there is as much of a decline among their own contemporaries – it would appear that, like tastes in music, only your generation ever really nailed it. The problem is, most objective markers suggest that things have improved over the ages: from slavery, subjugation of others and murder through to warfare – without wishing to downplay the current horrific crisis in Ukraine. What's really clever about the piece is the unpicking of what drives this, and indeed how such perceptions can be increased, reversed or eliminated, via two well-known phenomena: biased exposure to information and biased memory for information. We humans seek out, attend to and remember negative information about others. Two and a half thousand years ago Socrates bemoaned that ‘The children now love luxury; they have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise’. This paper shows the ubiquity of this perception and how easily it is reproduced. But the key ‘so what’ factor is what we might do with this. Critically, the authors note the enormous social implications, often fostered by bad-faith politicians, of directing attention and resources towards reversing an imaginary trend, often against minority groups. In reality, well, as The Who said, the kids are alright.

‘Atypical’ or ‘typical’ is a rather crude and often unhelpful delineation of antipsychotic medications: are there more clinically useful ways to describe these drugs? We're all aware of the historical reasons behind the terminology, but Rob McCutcheon and colleagues2 call for a data-driven approach, based on differential receptor affinity, to guide clinicians and researchers. This might support better-informed decision-making when switching medications that are inadequately effective or are causing intolerable side-effects. Taking the affinities of 27 antipsychotics across 42 receptors, they applied a machine-learning model to over 3000 in vitro receptor-binding studies. Their data clustered into four groups, which they label ‘clinical effect fingerprints’: muscarinic (M2–M5) antagonism with cholinergic and metabolic side-effects; dopaminergic (D2) partial agonism with adrenergic antagonism and a generally low side-effect burden; serotonergic and dopaminergic antagonism with a moderate side-effect burden; and dopaminergic antagonism with extrapyramidal side-effects and hyperprolactinaemia. The first and last of these groups were generally the most efficacious. Their classification model was able to predict the effects of individual drugs in individuals not included in the original studies. What's useful here is that the ‘return’ to pharmacological make-up and receptor affinity isn't just for its own technical sake – it informs care. Helpfully, the authors have made their data openly and freely available for others, and these can be updated as new drugs are developed.
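
For readers who like to see the mechanics, a minimal sketch of this kind of receptor-affinity clustering is given below. It assumes Python with scikit-learn; the affinity values and the choice of k-means are illustrative placeholders, not the authors' data or pipeline.

```python
# Illustrative sketch only: grouping antipsychotics by receptor-affinity profile,
# in the spirit of the paper discussed above. The drug names are real, but the
# affinity values below are invented placeholders, and the use of k-means is an
# assumption -- the authors' actual pipeline may differ.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

drugs = ["aripiprazole", "clozapine", "haloperidol", "risperidone"]
receptors = ["D2", "5-HT2A", "M3", "H1", "alpha-1"]

# Hypothetical pKi-style affinity matrix (rows = drugs, columns = receptors).
affinity = np.array([
    [8.6, 7.5, 4.0, 7.3, 6.0],   # aripiprazole
    [6.0, 8.0, 7.8, 8.5, 7.5],   # clozapine
    [8.9, 6.0, 4.0, 5.0, 6.5],   # haloperidol
    [8.2, 9.0, 4.5, 6.5, 7.5],   # risperidone
])

# Standardise each receptor column so no single receptor dominates the distance
# metric, then partition the drugs into clusters ('clinical effect fingerprints').
# With only four example drugs we use k = 2; the paper's full matrix yielded four groups.
X = StandardScaler().fit_transform(affinity)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

for drug, label in zip(drugs, labels):
    print(f"{drug}: cluster {label}")
```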

May's Kaleidoscope3 reported longitudinal data showing that alcohol consumption was associated with reduced rates of depression: it felt too good to be true, and we've an August hangover. Hammerton et al4 also take a prospective data-set (the UK ALSPAC cohort), but they explore alcohol consumption during adolescence, with a focus specifically on dependency. It's a critical life period for lots of reasons, with rates of depression rising from ages 13 to 19 years and emerging evidence that these figures are increasing over time. Interestingly, adolescent alcohol consumption has fallen in the same cohort over the past few decades, although rates of those suffering harm from it have remained stubbornly stable. It's always a challenge to test the directionality of any links between these two problems. One can anticipate how alcohol consumption might drive depression through biological, psychological and/or social adverse outcomes and, conversely, how individuals might ‘self-medicate’ low mood by drinking.

In this study, data from just under 4000 individuals were explored, with measurements taken approximately annually between the ages of 16 and 23. It's noteworthy how few studies have been done in such populations, despite this being the most common time of life for alcohol consumption to commence. The primary outcome measure was depression at age 24. The authors found an association between alcohol dependency at age 18 and depression at this later time point. This held when various socioeconomic and other confounders – including sex, parental alcohol use and housing tenure – were controlled for. Their data are not necessarily in conflict with the study discussed in May's BJPsych. Indeed, they found no evidence of an association of consumption quantity or frequency with depression in the absence of dependency. Nevertheless, dependency is commonly preceded by higher rates and quantities of use, and we'd all support a message of caution and moderation with alcohol, whichever study is your preferred tipple.
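
To make the analytic idea concrete, here is a minimal sketch of a confounder-adjusted logistic regression of the kind described, assuming Python with statsmodels; the variable names and simulated data are invented for illustration and are not the ALSPAC analysis.

```python
# Minimal sketch of a confounder-adjusted logistic regression of the kind
# described above -- not the ALSPAC analysis itself. Variable names and the
# simulated data are invented purely to show the model structure.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 4000  # roughly the analytic sample size quoted above

df = pd.DataFrame({
    "dependency_18": rng.binomial(1, 0.05, n),            # alcohol dependency at age 18
    "sex": rng.choice(["female", "male"], n),
    "parental_alcohol": rng.binomial(1, 0.20, n),
    "housing_tenure": rng.choice(["owned", "rented"], n),
})

# Simulated outcome with a built-in association, for illustration only.
linear_predictor = -2.0 + 1.0 * df["dependency_18"] + 0.3 * df["parental_alcohol"]
df["depression_24"] = rng.binomial(1, 1 / (1 + np.exp(-linear_predictor)))

# Odds of depression at age 24 given dependency at age 18, adjusted for a few of
# the confounders mentioned in the text.
model = smf.logit(
    "depression_24 ~ dependency_18 + C(sex) + parental_alcohol + C(housing_tenure)",
    data=df,
).fit()
print(model.summary())
```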

Much has been written on the challenges of epidemiological risk factors for suicide that are valid but of limited practical clinical use: what about short-term, acute ‘warning signs’? Bagge et al5 define warning signs as within-individual factors that can delineate periods of higher and lower risk of suicidal behaviour. They report on what they believe to be the first controlled study exploring these in psychiatric in-patients who had attempted suicide. A within-person case-crossover methodology allowed the 349 participants at a single site to act as their own controls. Warning signs were retrospectively measured through a structured interview that asked whether each factor was present or absent, or increased in frequency or intensity, in the 6 h prior to a suicide attempt, compared with the control period of the same 6 h the day before. The warning signs identified were: preparing personal affairs, suicide-related communication, alcohol consumption, the occurrence of a negative life event, and increases in particular affective and cognitive responses (including a sense of emptiness and burdensomeness). They found no differences between genders. Notably, ‘any preparation of personal affairs’ – paying off bills, arranging for the care of loved ones, giving away possessions, writing or revising a will – was associated with a tenfold greater risk than any other warning sign. However, the authors emphasise the practical challenges: even here, only 10% of included individuals made such preparations before their suicide attempt. We are familiar with the concepts of dynamic and static risk factors, and there are overlaps here with the former; the authors call this the critical question of ‘why today?’. This leads to the question of how active and specific we are, or might be, in monitoring, especially in in-patient units, and how effective targeted interventions might be.
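
A toy sketch of the case-crossover logic, assuming Python with pandas and statsmodels, is given below; the exposure data are simulated, and only the design (each person's attempt window compared with their own control window, with inference resting on discordant pairs) mirrors the study.

```python
# Toy sketch of the case-crossover logic: each participant contributes a 'case'
# window (the 6 h before the attempt) and a 'control' window (the same 6 h the
# day before), and only pairs discordant on the warning sign are informative.
# The exposure data below are simulated, not the study's.
import numpy as np
import pandas as pd
from statsmodels.stats.contingency_tables import mcnemar

rng = np.random.default_rng(0)
n = 349  # number of participants, as in the study

# Hypothetical indicator of one warning sign (e.g. preparing personal affairs):
exposed_case = rng.binomial(1, 0.10, n)     # present in the 6 h before the attempt
exposed_control = rng.binomial(1, 0.02, n)  # present in the same 6 h the day before

# Paired 2x2 table; the discordant cells b and c carry all the information.
table = (
    pd.crosstab(exposed_case, exposed_control)
    .reindex(index=[0, 1], columns=[0, 1], fill_value=0)
    .to_numpy()
)
b = table[1, 0]  # sign present before the attempt only
c = table[0, 1]  # sign present in the control window only

print("Conditional odds ratio (b/c):", b / c if c else float("inf"))
print(mcnemar(table, exact=True))  # paired test of case vs control windows
```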

There's been a lot on evolution and psychiatry in recent Kaleidoscopes. That's because I love the topic – go write your own column if you don't like it. Anyway, to a hotly debated preprint that could rewrite our understanding of what it means to be human. The ‘linear march of progress’ model of human evolution – the one on t-shirts and posters of ever more erect apes leading to us – is wrong. For most of human existence there have been multiple hominin species alive at the same time, and we may be an accidental survivor rather than an imperious evolutionary ‘winner’. Nevertheless, the general pattern has been one of enlarging cranial and brain volumes over time, linked with our ever-growing cognitive abilities. The past 20 years have thrown some real curve-balls into the story, not least Homo floresiensis (the ‘hobbit’ micro-species found in Indonesia) and, more recently, southern Africa's Homo naledi. H. naledi is fascinating for its combination of a relatively modern post-cranial (i.e. below the neck) anatomy, largely fitting with more recent human species, and an archaic brain capacity one-third the size of ours. Adding to the confusion, it was alive just 300 000 years ago and was a contemporary of emerging Homo sapiens. And we haven't even reached the astonishing news yet. Lee Berger, the sometimes-controversial discoverer of H. naledi, and colleagues have described6 new findings suggesting that this species both deliberately buried its dead and left symbolic carvings on nearby cave walls. It's really difficult to overstate the implications. If true, a human species with a brain comparable in size with a chimpanzee's had the cognitive capacity to contemplate and act on the deaths of members of its community, and to mark this with symbolic carvings. It redefines what it means to be human, overthrowing assumptions about when such behaviour first occurred and asking why we have, and need, the large brains we possess. Peer review is yet to occur: expect much academic ink and heat to be spilled on this in the coming years.

Finally, I started this month with talk of ‘age’ and ‘wisdom’; in a society fixated on youth, a stimulating ethics paper asks7: ‘is ageing undesirable?’ I also quoted The Who earlier, and Roger Daltrey once sang that he hoped he would die before he got old; he would appear to have revised his opinion on this. It's a question as old as civilisation (and perhaps older, depending on what H. naledi might have pondered). It is perhaps heightened in an era of medicine pushing back not just life expectancy but also the number of years lived in good health, with emerging ‘geroscience’ hinting at cellular, genetic and molecular keys to the underlying processes of ageing itself. The paper asks what might be lost if ageing – even if not death – were actually avoidable. Three core arguments against preventing ageing are introduced: we are reminded that ageing is not a disease or something to be ‘fixed’; that extending life indefinitely can bring its own adverse consequences; and that there are valuable and rewarding experiences to be had within this life stage. The authors propose that the seeming paradox of ageing – that it carries both desirable and undesirable components – arises from a failure to distinguish its two distinct dimensions, the chronological and the biological, with much that is positive coming from the former. The technical advances of medicine are held to contain ethical challenges that are not currently being adequately addressed: just because we can, does it mean we should? I've had a whimsical sense of linking the human and the eternal this month, so I'll leave the last words to Seneca8: ‘We should cherish old age and enjoy it. It is full of pleasure if you know how to use it. Fruit tastes most delicious just when its season is ending. The charms of youth are at their greatest at the time of its passing.’

References

1. Mastroianni, AM, Gilbert, DT. The illusion of moral decline. Nature 2023; 618: 782–9.
2. McCutcheon, RA, Harrison, PJ, Howes, OD, McGuire, PK, Taylor, DM, Pillinger, T. Data-driven taxonomy for antipsychotic medication: a new classification system. Biol Psychiatry [Epub ahead of print] 14 Apr 2023. Available from: https://doi.org/10.1016/j.biopsych.2023.04.004.
3. Tracy, DK, Albertson, DN, Cordoba, EL, Shergill, SS. Kaleidoscope. Br J Psychiatry 2023; 222(5): 224–5.
4. Hammerton, G, Lewis, G, Heron, J, Fernandes, G, Hickman, M, Lewis, G. The association of alcohol dependence and consumption during adolescence with depression in young adulthood, in England: a prospective cohort study. Lancet Psychiatry 2023; 10(7): 490–8.
5. Bagge, CL, Littlefield, AK, Wiegand, TJ, Hawkins, E, Trim, RS, Schumacher, JA, et al. A controlled examination of acute warning signs for suicide attempts among hospitalized patients. Psychol Med 2022; 53(7): 19.
6. Fuentes, A, Kissel, M, Spikins, P, Molopyane, K, Hawks, J, Berger, LR. Burials and engravings in a small-brained hominin, Homo naledi, from the late Pleistocene: contexts and evolutionary implications. BioRxiv [Preprint] 2023. Available from: https://www.biorxiv.org/content/10.1101/2023.06.01.543135.
7. Garcia-Barranquero, P, Llorca Albareda, J, Diaz-Cobacho, G. Is ageing undesirable? An ethical analysis. J Med Ethics [Epub ahead of print] 7 Jun 2023. Available from: https://doi.org/10.1136/jme-2022-108823.
8. Seneca. Epistulae Morales Ad Lucilium. Penguin, 2004.