Review of Heaven and Hell: The Psychology of the Emotions by Prof Peter Toohey, author of Jealousy and Boredom: A Lively History
22 Oct 2015
Is the medicalization of human suffering doing more harm than good?
‘Mental disorder’ is difficult to define.
Generally speaking, mental disorders are conditions that involve either loss of contact with reality or distress and impairment. These experiences lie on a continuum of normal human experience, and so it is impossible to define the precise point at which they become pathological.
What’s more, concepts such as borderline personality disorder, schizophrenia, and depression listed in classifications of mental disorders may not map onto any real or distinct disease entities. Even if they do, the symptoms and clinical manifestations that define them are open to subjective judgement and interpretation.
In an attempt to address these problems, classifications of mental disorders such as DSM-5 and ICD-10 adopt a ‘menu of symptoms’ approach, and rigidly define each symptom in technical terms that are often far removed from a person’s felt experience. This encourages mental health professionals to focus too narrowly on validating and treating an abstract diagnosis, and not enough on the person’s distress, its context, and its significance or meaning.
Despite using complex aetiological models, mental health professionals tend to overlook that a person’s felt experience often has a meaning in and of itself, even if it is broad, complex, or hard to fathom. By being helped to discover this meaning, the person may be able to identify and address the source of his distress, and so to make a faster, more complete, and more durable recovery. Beyond even this, he may gain important insights into himself, and a more refined and nuanced perspective on his life and on life in general. These are rare and precious opportunities, and not to be squandered.
A more fundamental problem with labelling human distress and deviance as mental disorder is that it reduces a complex, important, and distinct part of human life to nothing more than a biological illness or defect, not to be processed or understood, or in some cases even embraced, but to be ‘treated’ and ‘cured’ by any means possible—often with drugs that may be doing much more harm than good. This biological reductiveness, along with the stigma that it attracts, shapes the person’s interpretation and experience of his distress or deviance, and, ultimately, his relation to himself, to others, and to the world.
Moreover, to call out every difference and deviance as mental disorder is also to circumscribe normality and define sanity, not as tranquillity or possibility, which are the products of the wisdom that is being denied, but as conformity, placidity, and a kind of mediocrity.
The evolution of the status of homosexuality in the classifications of mental disorders highlights that concepts of mental disorder can be little more than social constructs that change as society changes. PTSD, anorexia nervosa, bulimia nervosa, depression, and deliberate self-harm (non-suicidal self-injury) can all be understood as cultural syndromes. Yet, by virtue of their inclusion in the DSM and ICD, they are usually seen, and thereby largely legitimized, as biological and therefore universal expressions of human distress.
Another pressing problem with the prevalent medical model is that it encourages false epidemics, most glaringly in depression, bipolar disorder, and ADHD. Data from the US National Health Interview Survey indicate that, in 2012, 13.5% (about 1 in 7) of boys aged 3-17 had been diagnosed with ADHD, up from 8.3% in 1997. The medical model also encourages the wholesale exportation of Western mental disorders and Western accounts of mental disorder. Taken together, this is leading to a pandemic of Western disease categories and treatments, while undermining the variety and richness of the human experience.
For example, in her recent book, Depression in Japan, anthropologist Junko Kitanaka writes that, until relatively recently, depression (utsubyō) had remained largely unknown to the lay population of Japan. Between 1999 and 2008, the number of people diagnosed with depression more than doubled as psychiatrists and pharmaceutical companies urged people to re-interpret their distress in terms of depression. Depression, says Kitanaka, is now one of the most frequently cited reasons for taking sick leave, and has been ‘transformed from a rare disease to one of the most talked about illnesses in recent Japanese history’.
Many critics question the scientific evidence underpinning such a robust biological paradigm and call for a radical rethink of mental disorders, not as detached disease processes that can be cut up into diagnostic labels, but as subjective and meaningful experiences grounded in personal and larger sociocultural narratives.
Unlike ‘mere’ medical or physical disorders, mental disorders are not just problems. If successfully navigated, they can also present opportunities. Simply acknowledging this can empower people to heal themselves and, much more than that, to grow from their experiences.
17 Oct 2015
Generally speaking, culture-specific, or culture-bound, syndromes are mental disturbances that only find expression in certain cultures or ethnic groups, and that are not comfortably accommodated by Western psychiatric classifications such as the DSM and ICD. DSM-IV defined them as ‘recurrent, locality-specific patterns of aberrant behavior and troubling experience…’
One example of a culture-bound syndrome is dhat, which is seen in men from South Asia, and involves sudden anxiety about loss of semen in the urine, whitish discoloration of the urine, and sexual dysfunction, combined with feelings of weakness and exhaustion. The syndrome may originate in the Hindu belief that it takes forty drops of blood to create a drop of bone marrow, and forty drops of bone marrow to create a drop of semen, and thus that semen is a concentrated essence of life.
DSM-5, published in 2013, replaces the notion of culture-bound syndromes with three ‘cultural concepts of distress’: cultural syndromes, cultural idioms of distress, and cultural explanations for distress. Rather than merely listing specific cultural syndromes, DSM-5 adopts a broader approach to cultural issues, and acknowledges that all mental disorders, including DSM disorders, can be culturally shaped.
However, some DSM disorders are, it seems, much more culturally shaped than others. For instance, PTSD, anorexia nervosa, bulimia nervosa, depression, and deliberate self-harm (non-suicidal self-injury) can all be understood as cultural syndromes. Yet, by virtue of their inclusion in the DSM, they are usually seen, and thereby largely legitimized, as biological and therefore universal expressions of human distress.
Thus, one criticism of classifications of mental disorders such as DSM and ICD is that, arm in arm with pharmaceutical companies, they encourage the wholesale exportation of Western mental disorders, and, more than that, the wholesale exportation of Western accounts of mental disorder, Western approaches to mental disorder, and, ultimately, Western values such as biologism, individualism, and the medicalization of distress and deviance.
In her recent book, Depression in Japan, anthropologist Junko Kitanaka writes that, until relatively recently, depression (utsubyō) had remained largely unknown to the lay population of Japan. Between 1999 and 2008, the number of people diagnosed with depression more than doubled as psychiatrists and pharmaceutical companies urged people to re-interpret their distress in terms of depression. Depression, says Kitanaka, is now one of the most frequently cited reasons for taking sick leave, and has been ‘transformed from a rare disease to one of the most talked about illnesses in recent Japanese history’.
In Crazy Like Us: The Globalization of the American Psyche, journalist Ethan Watters shows how psychiatric imperialism is leading to a pandemic of Western disease categories and treatments. Watters argues that changing a culture’s ideas about mental disorder actually changes that culture’s disorders, and depletes the store of local beliefs and customs which, in many cases, provided better answers to people’s problems than antidepressants and anti-psychotics. For Watters, the most devastating consequence of our impact on other cultures is not our golden arches, but the bulldozing of the human psyche itself.
Looking at ourselves through the eyes of those living in places where human tragedy is still embedded in complex religious and cultural narratives, we get a glimpse of our modern selves as a deeply insecure and fearful people. We are investing our great wealth in researching and treating this disorder because we have rather suddenly lost other belief systems that once gave meaning and context to our suffering.
Distressed people are subconsciously driven to externalize their suffering, partly to make it more manageable, and partly so that it can be recognized and legitimized. According to medical historian Edward Shorter, our culture’s beliefs and narratives about illness provide us with a limited number of templates or models of illness by which to externalize our distress. If authorities such as psychiatrists and celebrities appear to endorse or condone a new template such as ADHD or deliberate self-harm, the template enters into our culture’s ‘symptom pool’ and the condition starts to spread. At the same time, tired templates seep out of the symptom pool, which may explain why conditions such as ‘hysteria’ and catatonic schizophrenia (schizophrenia dominated by extreme agitation or immobility and odd mannerisms and posturing) have become so rare.
The incidence of bulimia nervosa rose in 1992, the year in which journalist Andrew Morton exposed Princess Diana’s ‘secret disease’, and peaked in 1995, when she revealed her eating disorder to the public. It began to decline in 1997, the year of her tragic death. This chronology suggests that Princess Diana’s status and glamour, combined with intense press coverage of her bulimia and of bulimia in general, led to an increase in the incidence of the disorder.
An alternative explanation is that Princess Diana’s example encouraged people to come forward and admit to their eating disorder. By the same token, it could have been that the Japanese had always suffered from depression, but had been hiding it, or had not had a template by which to recognize or externalize it. The danger for us psychiatrists and health professionals when treating people with mental disorder is to treat the template without addressing or even acknowledging the very real distress that lies beneath.
Adapted from the new edition of The Meaning of Madness.
01 Oct 2015
In his 1943 paper, ‘A Theory of Human Motivation’, psychologist Abraham Maslow proposed that healthy human beings have a certain number of needs, and that these needs are arranged in a hierarchy, with some needs (such as physiological and safety needs) being more primitive or basic than others (such as social and ego needs). Maslow’s so-called ‘hierarchy of needs’ is often presented as a five-level pyramid, with higher needs coming into focus only once lower, more basic needs have been met.
Maslow called the bottom four levels of the pyramid ‘deficiency needs’ because we do not feel anything if they are met, but become anxious or distressed if they are not. Thus, physiological needs such as eating, drinking, and sleeping are deficiency needs, as are safety needs, social needs such as friendship and sexual intimacy, and ego needs such as self-esteem and recognition. On the other hand, he called the fifth, top level of the pyramid a ‘growth need’ because our need to self-actualize enables us to fulfill our true and highest potential as human beings.
Once we have met our deficiency needs, the focus of our anxiety shifts to self-actualization, and we begin, even if only at a sub- or semi-conscious level, to contemplate our bigger picture. However, only a small minority of people is able to self-actualize, because self-actualization requires uncommon qualities such as honesty, independence, awareness, objectivity, creativity, and originality.
Maslow’s hierarchy of needs has been criticized for being overly schematic and lacking in scientific grounding, but it presents an intuitive and potentially useful theory of human motivation. After all, there is surely some truth in the popular saying that one cannot philosophize on an empty stomach, or in Aristotle’s observation that ‘all paid work absorbs and degrades the mind’.
Many people who have met all their deficiency needs do not self-actualize, instead inventing more deficiency needs for themselves, because to contemplate the meaning of their life and of life in general would lead them to entertain the possibility of their meaninglessness and the prospect of their own death and annihilation.
A person who begins to contemplate his bigger picture may come to fear that life is meaningless and death inevitable, but at the same time cling on to the cherished belief that his life is eternal or important or at least significant. This gives rise to an inner conflict that is sometimes referred to as ‘existential anxiety’ or, more colourfully, ‘the trauma of non-being’.
While fear and anxiety and their pathological forms (such as agoraphobia, panic disorder, or PTSD) are grounded in threats to life, existential anxiety is rooted in the brevity and apparent meaninglessness or absurdity of life. Existential anxiety is so disturbing and unsettling that most people avoid it at all costs, constructing a false reality out of goals, ambitions, habits, customs, values, culture, and religion so as to deceive themselves that their lives are special and meaningful and that death is distant or delusory.
However, such self-deception comes at a heavy price. According to Jean-Paul Sartre, people who refuse to face up to ‘non-being’ are acting in ‘bad faith’, and living out a life that is inauthentic and unfulfilling. Facing up to non-being can bring insecurity, loneliness, responsibility, and consequently anxiety, but it can also bring a sense of calm, freedom, and even nobility. Far from being pathological, existential anxiety is a sign of health, strength, and courage, and a harbinger of bigger and better things to come.
For theologian Paul Tillich (1886-1965), refusing to face up to non-being leads not only to a life that is inauthentic but also to pathological (or neurotic) anxiety.
In The Courage to Be, Tillich asserts:
He who does not succeed in taking his anxiety courageously upon himself can succeed in avoiding the extreme situation of despair by escaping into neurosis. He still affirms himself but on a limited scale. Neurosis is the way of avoiding nonbeing by avoiding being.
According to this outlook, pathological anxiety, though seemingly grounded in threats to life, in fact arises from repressed existential anxiety, which itself arises from our uniquely human capacity for self-consciousness.
Facing up to non-being enables us to put our life into perspective, see it in its entirety, and thereby lend it a sense of direction and unity. If the ultimate source of anxiety is fear of the future, the future ends in death; and if the ultimate source of anxiety is uncertainty, death is the only certainty. It is only by facing up to death, accepting its inevitability, and integrating it into life that we can escape from the pettiness and paralysis of anxiety, and, in so doing, free ourselves to make the most out of our lives and out of ourselves.
Adapted from the new edition of The Meaning of Madness
19 Aug 2015
Is the medical model still helping?
In the UK, mental ill health is recognized as the single largest cause of disability, contributing almost 23 per cent of the disease burden and costing over £100 billion ($157 billion) a year in services, lost productivity, and reduced quality of life. Every year in the EU, about 27 per cent of adults are affected by mental disorder of some kind. In the US, almost one in two people will meet the criteria for a mental disorder in the course of their lifetime. Data from the US National Health Interview Survey indicate that, in 2012, 13.5% of boys aged 3-17 had been diagnosed with attention deficit hyperactivity disorder (ADHD), up from just 8.3% in 1997.
There is no denying that a lot of people are suffering. But are they all really suffering from a mental disorder, that is, a medical illness, a biological disorder of the brain? And if not, are doctors, diagnoses, and drugs necessarily the best response to their problems?
Since 1952, the number of diagnosable mental disorders has burgeoned from 106 to over 300, and now includes such constructs as ‘gambling disorder’, ‘minor neurocognitive disorder’, ‘disruptive mood dysregulation disorder’, ‘premenstrual dysphoric disorder’, and ‘binge-eating disorder’.
According to a recent report, antidepressant prescriptions in England rose from 15 million items in 1998 to 40 million in 2012, this despite the mounting evidence for their ineffectiveness. Selective serotonin reuptake inhibitors (SSRIs) in particular have become something of a panacea, used not only to treat depression, but also to treat anxiety disorders, obsessive-compulsive disorder, and bulimia nervosa, and even some physical disorders such as premature ejaculation in young men and hot flushes in menopausal women. In the UK, the SSRI fluoxetine is so commonly prescribed that trace quantities have been detected in the water supply.
But despite all this apparent progress in diagnosis and treatment, people who meet the diagnostic criteria for such a paradigmatic mental disorder as schizophrenia tend to fare better in resource-poor countries, where human distress can take on very different forms and interpretations to those outlined in our scientific classifications.
Psychiatry is in a crisis precipitated by its own success, and the medical or biological model, assuming that it ever helped, is no longer helping. The specialty of the heart is cardiology, the specialty of the digestive tract is gastroenterology, and the specialties of the brain are neurology and psychiatry. But neurology is not psychiatry, which literally means ‘healing of the soul’.
Some mental disorders undeniably have a strong biological basis, but even these have many more aspects and dimensions than ‘mere’ physical disorders.
It is high time to fundamentally rethink our approach to mental disorders and mental ‘dis-ease’.
The Second Edition of The Meaning of Madness, due out in September, is available for pre-order.
31 May 2015
We are being lazy if we are able to carry out some activity that we ought to carry out, but are disinclined to do so on account of the effort involved. Instead, we remain idle, carry out the activity perfunctorily, or engage in some other less strenuous or boring activity. In short, we are being lazy if our motivation to spare ourselves effort trumps our motivation to do the right or best or expected thing—assuming, of course, that we know, or think that we know, what that is.
Synonyms for laziness include indolence and sloth. Indolence derives from the Latin indolentia, ‘without pain’ or ‘without taking trouble’. Sloth has more moral and spiritual overtones than either laziness or indolence. In the Christian tradition, sloth is one of the seven deadly sins (the other six being lust, gluttony, greed, wrath, envy, and pride) because it undermines society and God’s plan and invites all manner of sin. The Bible inveighs against slothfulness, notably in the Book of Ecclesiastes: ‘By much slothfulness the building decayeth; and through idleness of the hands the house droppeth through. A feast is made for laughter, and wine maketh merry: but money answereth all things.’
Laziness should not be confused with either procrastination or idleness. To procrastinate—from the Latin cras, ‘tomorrow’—is to postpone one task in favour of another or others which are perceived as being easier or more pleasurable but which are typically less important or urgent. To postpone a task for constructive or strategic purposes does not amount to procrastination. For a postponement to amount to procrastination, it has to represent poor or ineffective planning and result in a higher overall cost to the procrastinator, for example, in the form of stress, guilt, lost productivity, or lost opportunities. It is one thing to delay a tax return until all the numbers are in, but quite another to delay it so that it upsets our holiday plans and lands us with a fine. Both the lazybones and the procrastinator lack motivation, but unlike the lazybones the procrastinator aspires and intends to complete the task under consideration, and, moreover, eventually does complete it, albeit at a higher cost to himself.
To be idle is not to be doing anything. Idleness is often romanticized, as epitomized by the Italian expression dolce far niente (‘it is sweet to do nothing’). Many people tell themselves that they work hard from a desire for idleness. But although our natural instinct is for idleness, most of us find prolonged idleness difficult to bear. Queuing for half an hour in a traffic jam can leave us feeling bored, restless, and irritable, and many motorists prefer to make a detour even if the alternative route is likely to take longer than sitting through the traffic. Recent research suggests that people will find the flimsiest excuse to keep busy, and that they feel happier for keeping busy even when their busyness is imposed upon them. In their paper, ‘Idleness Aversion and the Need for Justifiable Busyness’ (Psychological Science 21(7): 926–930), Christopher Hsee and his colleagues surmise that many of our purported goals may be little more than justifications for keeping busy.
We could be idle because we have nothing to do—or rather, because we lack the imagination to think of something to do. If we do evidently have something to do, we could be idle because we are lazy, but also because we are unable to do that thing, or because we have already done it and are resting and recuperating. Lastly, we could be idle because we value idleness or its products above whatever it is we have to do, which is not the same thing as being lazy. Lord Melbourne, Queen Victoria’s favourite prime minister, extolled the virtues of ‘masterful inactivity’. As chairman and CEO of General Electric, Jack Welch spent an hour each day in what he called ‘looking out of the window time’. Adepts of such strategic idleness use their ‘idle’ moments, among other things, to gather inspiration, develop and maintain perspective, sidestep nonsense and pettiness, reduce inefficiency and half-living, and conserve health and stamina for truly important tasks and problems. ‘To do nothing at all,’ said Oscar Wilde, ‘is the most difficult thing in the world, the most difficult and the most intellectual.’
Adapted from Heaven and Hell: The Psychology of the Emotions.
22 May 2015
Patience can be regarded as a decision-making problem: eat up all the grain today or plant it in the earth and wait for it to multiply. Unfortunately, human beings evolved not as farmers but as hunter-gatherers, and have a strong tendency to discount long-term rewards. Our ancestral shortsightedness is borne out by the Stanford marshmallow experiment, a series of studies on delayed gratification led by Walter Mischel in the late 1960s and early 1970s. These studies, conducted on hundreds of mostly four- and five-year-old children, involved a simple binary choice: either eat this marshmallow, or hold back for 15 minutes and be given a second marshmallow. Having explained this choice to a child, the experimenter left him alone with the marshmallow for 15 minutes. Follow-up studies carried out over 40 years found that the minority of children who had been able to hold out for a second marshmallow went on to enjoy significantly better life outcomes, including higher test scores, better social skills, and less substance misuse.
Even so, patience involves much more than the mere ability to hold back for some future gain. Exercising patience (note the use of the verb ‘to exercise’) can be compared to dieting or growing a garden. Yes, waiting is involved, but one also needs to have a plan in place, and, moreover, to work at that plan. Thus, when it comes to others, patience does not amount to mere restraint or toleration, but to a complicit engagement in their struggle and welfare. In that much, patience is a form of compassion, which, rather than disregarding and alienating people, turns them into friends and allies.
If impatience implies impotence, patience implies power, power born out of understanding. Rather than make us into a hostage to fortune, patience frees us from frustration and its ills, delivers us to the present moment, and affords us the calm and perspective to think, say, and do the right thing in the right way at the right time—which is why, with psychotherapy, both patient and therapist can require several years together. Last but not least, patience enables us to achieve things that would otherwise have been impossible to achieve. As La Bruyère put it, ‘There is no road too long to the man who advances deliberately and without undue haste; there are no honours too distant to the man who prepares himself for them with patience.’ Exercising patience does not mean never protesting or giving up, but only ever doing so in a considered fashion: never impetuously, never pettily, and never pointlessly. Neither does it mean withholding, just as ageing a case of fine wine for several years does not mean abstaining from wine during all that time. Life is too short to wait, but it is not too short for patience.
Patience is much easier, perhaps even pleasant, to exercise if one truly understands that it can and does deliver much better outcomes, not just for ourselves but for others too. In 2012, researchers at the University of Rochester replicated the marshmallow experiment. However, before doing so, they split the participating children into two groups, exposing one group to unreliable experiences in the form of broken promises, and the other to reliable experiences in the form of honoured promises. They subsequently found that the children exposed to honoured promises waited an average of four times longer than the children exposed to broken promises.
In other words, patience is largely a matter of trust, or, some might say, faith.
Mischel W et al. (1972), Cognitive and attentional mechanisms in delay of gratification. Journal of Personality and Social Psychology 21(2): 204–218.
La Bruyère J (1688), Les Caractères, ‘Des jugements’, aphorism 108.
Kidd C et al. (2013), Rational snacking: Young children’s decision-making on the marshmallow task is moderated by beliefs about environmental reliability. Cognition 126(1): 109–114.
Adapted from Heaven and Hell: The Psychology of the Emotions.