The Problems of Psychiatry

Is the medicalization of human suffering doing more harm than good?

‘Mental disorder’ is difficult to define.

Generally speaking, mental disorders are conditions that involve either loss of contact with reality or distress and impairment. These experiences lie on a continuum of normal human experience, and so it is impossible to define the precise point at which they become pathological.

What’s more, concepts such as borderline personality disorder, schizophrenia, and depression listed in classifications of mental disorders may not map onto any real or distinct disease entities. Even if they do, the symptoms and clinical manifestations that define them are open to subjective judgement and interpretation.

In an attempt to address these problems, classifications of mental disorders such as DSM-5 and ICD-10 adopt a ‘menu of symptoms’ approach, and rigidly define each symptom in technical terms that are often far removed from a person’s felt experience. This encourages mental health professionals to focus too narrowly on validating and treating an abstract diagnosis, and not enough on the person’s distress, its context, and its significance or meaning.

Despite using complex aetiological models, mental health professionals tend to overlook that a person’s felt experience often has a meaning in and of itself, even if it is broad, complex, or hard to fathom. By being helped to discover this meaning, the person may be able to identify and address the source of his distress, and so to make a faster, more complete, and more durable recovery. Beyond even this, he may gain important insights into himself, and a more refined and nuanced perspective over his life and life in general. These are rare and precious opportunities, and not to be squandered.

A more fundamental problem with labelling human distress and deviance as mental disorder is that it reduces a complex, important, and distinct part of human life to nothing more than a biological illness or defect, not to be processed or understood, or in some cases even embraced, but to be ‘treated’ and ‘cured’ by any means possible—often with drugs that may be doing much more harm than good. This biological reductiveness, along with the stigma that it attracts, shapes the person’s interpretation and experience of his distress or deviance, and, ultimately, his relation to himself, to others, and to the world.

Moreover, to call out every difference and deviance as mental disorder is also to circumscribe normality and define sanity, not as tranquillity or possibility, which are the products of the wisdom that is being denied, but as conformity, placidity, and a kind of mediocrity.

The evolution of the status of homosexuality in the classifications of mental disorders highlights that concepts of mental disorder can be little more than social constructs that change as society changes. PTSD, anorexia nervosa, bulimia nervosa, depression, and deliberate self-harm (non-suicidal self-injury) can all be understood as cultural syndromes. Yet, for being in the DSM and ICD, they are usually seen, and largely legitimized, as biological and therefore universal expressions of human distress.

Another pressing problem with the prevalent medical model is that it encourages false epidemics, most glaringly in depression, bipolar disorder, and ADHD. Data from the US National Health Interview Survey indicate that, in 2012, 13.5% (about 1 in 7) of boys aged 3-17 had been diagnosed with ADHD, up from 8.3% in 1997. The medical model also encourages the wholesale exportation of Western mental disorders and Western accounts of mental disorder. Taken together, this is leading to a pandemic of Western disease categories and treatments, while undermining the variety and richness of the human experience.

For example, in her recent book, Depression in Japan, anthropologist Junko Kitanaka writes that, until relatively recently, depression (utsubyō) had remained largely unknown to the lay population of Japan. Between 1999 and 2008, the number of people diagnosed with depression more than doubled as psychiatrists and pharmaceutical companies urged people to re-interpret their distress in terms of depression. Depression, says Kitanaka, is now one of the most frequently cited reasons for taking sick leave, and has been ‘transformed from a rare disease to one of the most talked about illnesses in recent Japanese history’.

Many critics question the scientific evidence underpinning this dominant biological paradigm and call for a radical rethink of mental disorders, not as detached disease processes that can be cut up into diagnostic labels, but as subjective and meaningful experiences grounded in personal and larger sociocultural narratives.

Unlike ‘mere’ medical or physical disorders, mental disorders are not just problems. If successfully navigated, they can also present opportunities. Simply acknowledging this can empower people to heal themselves and, much more than that, to grow from their experiences.

Psychiatric Imperialism: Exporting Western Mental Disorders

Generally speaking, culture-specific, or culture-bound, syndromes are mental disturbances that only find expression in certain cultures or ethnic groups, and that are not comfortably accommodated by Western psychiatric classifications such as the DSM and ICD. DSM-IV defined them as ‘recurrent, locality-specific patterns of aberrant behavior and troubling experience…’

One example of a culture-bound syndrome is dhat, which is seen in men from South Asia, and involves sudden anxiety about loss of semen in the urine, whitish discoloration of the urine, and sexual dysfunction, combined with feelings of weakness and exhaustion. The syndrome may originate in the Hindu belief that it takes forty drops of blood to create a drop of bone marrow, and forty drops of bone marrow to create a drop of semen, and thus that semen is a concentrated essence of life.

DSM-5, published in 2013, replaces the notion of culture-bound syndromes with three ‘cultural concepts of distress’: cultural syndromes, cultural idioms of distress, and cultural explanations for distress. Rather than merely listing specific cultural syndromes, DSM-5 adopts a broader approach to cultural issues, and acknowledges that all mental disorders, including DSM disorders, can be culturally shaped.

However, some DSM disorders are, it seems, much more culturally shaped than others. For instance, PTSD, anorexia nervosa, bulimia nervosa, depression, and deliberate self-harm (non-suicidal self-injury) can all be understood as cultural syndromes. Yet, for being in the DSM, they are usually seen, and largely legitimized, as biological and therefore universal expressions of human distress.

Thus, one criticism of classifications of mental disorders such as DSM and ICD is that, arm in arm with pharmaceutical companies, they encourage the wholesale exportation of Western mental disorders, and, more than that, the wholesale exportation of Western accounts of mental disorder, Western approaches to mental disorder, and, ultimately, Western values such as biologism, individualism, and the medicalization of distress and deviance.

In her recent book, Depression in Japan, anthropologist Junko Kitanaka writes that, until relatively recently, depression (utsubyō) had remained largely unknown to the lay population of Japan. Between 1999 and 2008, the number of people diagnosed with depression more than doubled as psychiatrists and pharmaceutical companies urged people to re-interpret their distress in terms of depression. Depression, says Kitanaka, is now one of the most frequently cited reasons for taking sick leave, and has been ‘transformed from a rare disease to one of the most talked about illnesses in recent Japanese history’.

In Crazy Like Us: The Globalization of the American Psyche, journalist Ethan Watters shows how psychiatric imperialism is leading to a pandemic of Western disease categories and treatments. Watters argues that changing a culture’s ideas about mental disorder actually changes that culture’s disorders, and depletes the store of local beliefs and customs which, in many cases, provided better answers to people’s problems than antidepressants and antipsychotics. For Watters, the most devastating consequence of our impact on other cultures is not our golden arches, but the bulldozing of the human psyche itself.

He writes:

Looking at ourselves through the eyes of those living in places where human tragedy is still embedded in complex religious and cultural narratives, we get a glimpse of our modern selves as a deeply insecure and fearful people. We are investing our great wealth in researching and treating this disorder because we have rather suddenly lost other belief systems that once gave meaning and context to our suffering.

Distressed people are subconsciously driven to externalize their suffering, partly to make it more manageable, and partly so that it can be recognized and legitimized. According to medical historian Edward Shorter, our culture’s beliefs and narratives about illness provide us with a limited number of templates or models of illness by which to externalize our distress. If authorities such as psychiatrists and celebrities appear to endorse or condone a new template such as ADHD or deliberate self-harm, the template enters into our culture’s ‘symptom pool’ and the condition starts to spread. At the same time, tired templates seep out of the symptom pool, which may explain why conditions such as ‘hysteria’ and catatonic schizophrenia (schizophrenia dominated by extreme agitation or immobility and odd mannerisms and posturing) have become so rare.

The incidence of bulimia nervosa rose in 1992, the year in which journalist Andrew Morton exposed Princess Diana’s ‘secret disease’, and peaked in 1995, when she revealed her eating disorder to the public. It began to decline in 1997, the year of her tragic death. This chronology suggests that Princess Diana’s status and glamour, combined with intense press coverage of her bulimia and of bulimia in general, led to an increase in the incidence of the disorder.

An alternative explanation is that Princess Diana’s example encouraged people to come forward and admit to their eating disorder. By the same token, it could be that the Japanese had always suffered from depression, but had been hiding it, or had lacked a template by which to recognize or externalize it. The danger for psychiatrists and other health professionals is to treat the template without addressing, or even acknowledging, the very real distress that lies beneath.

Adapted from the new edition of The Meaning of Madness.


A Philosophical Cure for Anxiety

In his 1943 paper, A Theory of Human Motivation, psychologist Abraham Maslow proposed that healthy human beings have a certain number of needs, and that these needs are arranged in a hierarchy, with some needs (such as physiological and safety needs) more primitive or basic than others (such as social and ego needs). Maslow’s so-called ‘hierarchy of needs’ is often presented as a five-level pyramid, with higher needs coming into focus only once lower, more basic needs have been met.

Maslow’s Hierarchy of Needs

Maslow called the bottom four levels of the pyramid ‘deficiency needs’ because we do not feel anything if they are met, but become anxious or distressed if they are not. Thus, physiological needs such as eating, drinking, and sleeping are deficiency needs, as are safety needs, social needs such as friendship and sexual intimacy, and ego needs such as self-esteem and recognition. On the other hand, he called the fifth, top level of the pyramid a ‘growth need’ because our need to self-actualize enables us to fulfill our true and highest potential as human beings.

Once we have met our deficiency needs, the focus of our anxiety shifts to self-actualization, and we begin, even if only at a sub- or semi-conscious level, to contemplate our bigger picture. However, only a small minority of people are able to self-actualize, because self-actualization requires uncommon qualities such as honesty, independence, awareness, objectivity, creativity, and originality.

Maslow’s hierarchy of needs has been criticized for being overly schematic and lacking in scientific grounding, but it presents an intuitive and potentially useful theory of human motivation. After all, there is surely some truth in the popular saying that one cannot philosophize on an empty stomach, or in Aristotle’s observation that, ‘all paid work absorbs and degrades the mind’.

Many people who have met all their deficiency needs do not self-actualize, instead inventing more deficiency needs for themselves, because to contemplate the meaning of their life and of life in general would lead them to entertain the possibility of their meaninglessness and the prospect of their own death and annihilation.

A person who begins to contemplate his bigger picture may come to fear that life is meaningless and death inevitable, but at the same time cling on to the cherished belief that his life is eternal or important or at least significant. This gives rise to an inner conflict that is sometimes referred to as ‘existential anxiety’ or, more colourfully, ‘the trauma of non-being’.

While fear and anxiety and their pathological forms (such as agoraphobia, panic disorder, or PTSD) are grounded in threats to life, existential anxiety is rooted in the brevity and apparent meaninglessness or absurdity of life. Existential anxiety is so disturbing and unsettling that most people avoid it at all costs, constructing a false reality out of goals, ambitions, habits, customs, values, culture, and religion so as to deceive themselves that their lives are special and meaningful and that death is distant or delusory.

However, such self-deception comes at a heavy price. According to Jean-Paul Sartre, people who refuse to face up to ‘non-being’ are acting in ‘bad faith’, and living out a life that is inauthentic and unfulfilling. Facing up to non-being can bring insecurity, loneliness, responsibility, and consequently anxiety, but it can also bring a sense of calm, freedom, and even nobility. Far from being pathological, existential anxiety is a sign of health, strength, and courage, and a harbinger of bigger and better things to come.

For theologian Paul Tillich (1886-1965), refusing to face up to non-being leads not only to a life that is inauthentic but also to pathological (or neurotic) anxiety.

In The Courage to Be, Tillich asserts:

He who does not succeed in taking his anxiety courageously upon himself can succeed in avoiding the extreme situation of despair by escaping into neurosis. He still affirms himself but on a limited scale. Neurosis is the way of avoiding nonbeing by avoiding being.

According to this outlook, pathological anxiety, though seemingly grounded in threats to life, in fact arises from repressed existential anxiety, which itself arises from our uniquely human capacity for self-consciousness.

Facing up to non-being enables us to put our life into perspective, see it in its entirety, and thereby lend it a sense of direction and unity. If the ultimate source of anxiety is fear of the future, the future ends in death; and if the ultimate source of anxiety is uncertainty, death is the only certainty. It is only by facing up to death, accepting its inevitability, and integrating it into life that we can escape from the pettiness and paralysis of anxiety, and, in so doing, free ourselves to make the most out of our lives and out of ourselves.

The Death of Socrates, by Jacques-Louis David (detail).

Some philosophers have gone even further by asserting that the very purpose of life is none other than to prepare for death. In Plato’s Phaedo, Socrates, who is soon to die, tells the philosophers Simmias and Cebes that absolute justice, absolute beauty, or absolute good cannot be apprehended with the eyes or any other bodily organ, but only by the mind or soul. Therefore, the philosopher seeks in as far as possible to separate body from soul and become pure soul. As death is the complete separation of body and soul, the philosopher aims at death, and indeed can be said to be almost dead.

Adapted from the new edition of The Meaning of Madness.

The Psychology of Self-Deception

A short, sharp look into some of our most important ego defenses.

In psychoanalytic theory, ego defenses are unconscious processes that we deploy to defuse the fear and anxiety that arise when who we think we are or who we think we should be (our conscious ‘superego’) comes into conflict with who we really are (our unconscious ‘id’).

For instance, at an unconscious level a man may find himself attracted to another man, but at a conscious level he may find this attraction flatly unacceptable. To defuse the anxiety that arises from this conflict, he may deploy one or several ego defenses. For example, (1) he might refuse to admit to himself that he is attracted to this man. Or (2) he might superficially adopt ideas and behaviours that are diametrically opposed to those of a stereotypical homosexual, such as going out for several pints with the lads, banging his fists on the counter, and peppering his speech with loud profanities. Or (3) he might transfer his attraction onto someone else and then berate him for being gay (young children can teach us much through playground retorts such as ‘mirror, mirror’ and ‘what you say is what you are’). In each case, the man has used a common ego defense, respectively, repression, reaction formation, and projection.

Repression can be thought of as ‘motivated forgetting’: the active, albeit unconscious, ‘forgetting’ of unacceptable drives, emotions, ideas, or memories. Repression is often confused with denial, which is the refusal to admit to certain unacceptable or unmanageable aspects of reality. Whereas repression relates to mental or internal stimuli, denial relates to external stimuli. That said, repression and denial often work together, and can be difficult to disentangle.

Repression can also be confused with distortion, which is the reshaping of reality to suit one’s inner needs. For instance, a person who has been beaten black and blue by his father no longer recalls these traumatic events (repression), and instead sees his father as a gentle and loving man (distortion). In this example, there is a clear sense of the distortion not only building upon but also reinforcing the repression.

Reaction formation is the superficial adoption—and, often, exaggeration—of emotions and impulses that are diametrically opposed to one’s own. A possible high-profile case of reaction formation is that of a particular US congressman, who, as chairman of the Missing and Exploited Children’s Caucus, introduced legislation to protect children from exploitation by adults over the Internet. The congressman resigned when it later emerged that he had been exchanging sexually explicit electronic messages with a teenage boy. Other, classic, examples of reaction formation include the alcoholic who extolls the virtues of abstinence and the rich student who attends and even organizes anti-capitalist rallies.

Projection is the attribution of one’s unacceptable thoughts and feelings to others. Like distortion, projection necessarily involves repression as a first step, since unacceptable thoughts and feelings need to be repudiated before they can be attributed to others. Classic examples of projection include the envious person who believes that everyone envies him, the covetous person who lives in constant fear of being dispossessed, and the person with fantasies of infidelity who suspects that his partner is cheating on him.

Just as common is splitting, which can be defined as the division or polarization of beliefs, actions, objects, or people into good and bad by selectively focusing on either their positive or negative attributes. This is often seen in politics, for instance, when left-wingers caricature right-wingers as selfish and narrow-minded, and right-wingers caricature left-wingers as irresponsible and self-serving hypocrites. Other classic examples of splitting are the religious zealot who divides people into blessed and damned, and the child of divorcees who idolizes one parent while shunning the other. Splitting defuses the anxiety that arises from our inability to grasp a complex and nuanced state of affairs by simplifying and schematizing it so that it can more readily be processed or accepted.

Splitting also arises in groups, with people inside the group being seen in a positive light, and people outside the group in a negative light. Another phenomenon that occurs in groups is groupthink, which is not strictly speaking an ego defense, but which is so important as to be worthy of mention. Groupthink arises when members of a group unconsciously seek to minimize conflict by failing to critically test, analyse, and evaluate ideas. As a result, decisions reached by the group tend to be more irrational than those that would have been reached by any one member of the group acting alone. Even married couples can fall into groupthink, for instance, when they decide to take their holidays in places that neither wanted, but thought that the other wanted. Groupthink arises because members of a group are afraid both of criticizing and of being criticized, and also because of the hubristic sense of confidence and invulnerability that arises from being in a group. Philosopher Ludwig Wittgenstein once remarked, ‘It is a good thing that I did not let myself be influenced.’ In a similar vein, historian Edward Gibbon wrote that ‘…solitude is the school of genius … and the uniformity of a work denotes the hand of a single artist’. In short, a camel is a horse designed by a committee.

An ego defense similar to splitting is idealization. Like the positive end of splitting, idealization involves overestimating the positive attributes of a person, object, or idea while underestimating its negative attributes. More fundamentally, it involves the projection of our needs and desires onto that person, object, or idea. A paradigm of idealization is infatuation, when love is confused with the need to love, and the idealized person’s negative attributes are glossed over or even imagined as positive. Although this can make for a rude awakening, there are few better ways of relieving our existential anxiety than by manufacturing something that is ‘perfect’ for us, be it a piece of equipment, a place, country, person, or god.

If we are in love with someone inaccessible, it might be more convenient to intellectualize our love, perhaps by thinking of it in terms of idealization! In intellectualization, uncomfortable feelings associated with a problem are repressed by thinking about the problem in cold and abstract terms. I once received a phone call from a junior doctor in psychiatry in which he described a recent in-patient admission as ‘a 47-year-old mother of two who attempted to cessate her life as a result of being diagnosed with a metastatic mitotic lesion’. A formulation such as ‘…who tried to kill herself after being told that she is dying of cancer’ would have been better English, but all too effective at evoking the full horror of this poor lady’s predicament.

Intellectualization should not be confused with rationalization, which is the use of feeble but seemingly plausible arguments either to justify something that is painful to accept (‘sour grapes’) or to make it seem ‘not so bad after all’ (‘sweet lemons’). For instance, a person who has been rejected by a love interest convinces himself that she rejected him because she did not share in his ideal of happiness (sour grapes), and also that her rejection is a blessing in disguise in that it has freed him to find a more suitable partner (sweet lemons).

While no one can altogether avoid deploying ego defenses, some ego defenses are thought to be more ‘mature’ than others, not only because they involve some degree of insight, but also because they can be adaptive or useful. If a person is angry at his boss, he may go home and kick the dog, or he may instead go out and play a good game of tennis. The first instance (kicking the dog) is an example of displacement, the redirection of uncomfortable feelings towards someone or something less important, which is an immature ego defense. The second instance (playing a good game of tennis) is an example of sublimation, the channelling of uncomfortable feelings into socially condoned and often productive activities, which is a much more mature ego defense.

There are a number of mature ego defenses like sublimation that can be substituted for the more primitive ones. Altruism, for instance, can in some cases be a form of sublimation in which a person copes with his anxiety by stepping outside himself and helping others. By concentrating on the needs of others, people in altruistic vocations such as medicine or teaching may be able to permanently push their own needs into the background. Conversely, people who care for a disabled or elderly person may experience profound anxiety and distress when this role is suddenly removed from them.

Another mature ego defense is humour. By seeing the absurd or ridiculous aspect of an emotion, event, or situation, a person is able to put it into a less threatening context and thereby defuse the anxiety that it gives rise to. In addition, he is able to share, and test, his insight with others in the benign and gratifying form of a joke. If man laughs so much, it is no doubt because he has the most developed unconscious in the animal kingdom. The things that people laugh about most are their errors and inadequacies; the difficult challenges that they face around personal identity, social standing, sexual relationships, and death; and incongruity, absurdity, and meaninglessness. These are all deeply human concerns: just as no one has ever seen a laughing dog, so no one has ever heard of a laughing god.

Further up the maturity scale is asceticism, which is the denial of the importance of that which most people fear or strive for, and so of the very grounds for anxiety and disappointment. If fear is, ultimately, for oneself, then the denial of the self removes the very grounds for fear. People in modern societies are more anxious than people in traditional or historical societies, no doubt because of the strong emphasis that modern societies place on the self as an independent and autonomous agent.

In the Hindu Bhagavad Gita, the god Krishna appears to Arjuna in the midst of the Battle of Kurukshetra, and advises him not to succumb to his scruples but to do his duty and fight on. In any case, all the men on the battlefield are one day condemned to die, as are all men. Their deaths are trivial, because the spirit in them, their human essence, does not depend on their particular incarnations for its continued existence. Krishna says, ‘When one sees eternity in things that pass away and infinity in finite things, then one has pure knowledge.’

There has never been a time when you and I have not existed, nor will there be a time when we will cease to exist … the wise are not deluded by these changes.

There are a great number of ego defenses, and the combinations and circumstances in which we use them reflect on our personality. Indeed, one could go so far as to argue that the self is nothing but the sum of its ego defenses, which are constantly shaping, upholding, protecting, and repairing it.

The self is like a cracked mask that is in constant need of being pieced together. But behind the mask there is nobody at home.

While we cannot entirely escape from ego defenses, we can gain some insight into how we use them. This self-knowledge, if we have the courage for it, can awaken us to ourselves, to others, and to the world around us, and free us to express our full potential as human beings.

The greatest oracle of the ancient world was the oracle at Delphi, and inscribed on the forecourt of the temple of Apollo at Delphi was a simple two-word command:

Γνῶθι σεαυτόν

Know thyself.

Adapted from Hide and Seek: The Psychology of Self-Deception.



Can Personality Disorder be Good for You?

While personality disorders may lead to ‘severe impairment’, they may also lead to extraordinary achievement. A 2005 study by Board and Fritzon found that histrionic, narcissistic, and anankastic personality disorders are more common in high-level executives than in mentally disordered criminal offenders at the high-security Broadmoor Hospital.

This suggests that people often benefit from non-normative and potentially maladaptive personality traits. For instance, people with histrionic personality disorder may be more adept at charming and cajoling others, and therefore at building and exercising professional relationships. People with narcissistic personality disorder may be highly ambitious, confident, and self-motivated, and able to employ people and situations to maximum advantage. And people with anankastic personality disorder may get quite far up their career ladder simply by being so devoted to work and productivity. Even people with borderline personality disorder may at times be bright, witty, and the life and soul of the party.

In their study, Board and Fritzon described the executives with a personality disorder as ‘successful psychopaths’ and the criminal offenders as ‘unsuccessful psychopaths’, and it may be that highly successful people and disturbed psychopaths have more in common than first meets the eye. As psychologist and philosopher William James put it, ‘When a superior intellect and a psychopathic temperament coalesce… in the same individual, we have the best possible condition for the kind of effective genius that gets into the biographical dictionaries.’

More recently, in 2010, Mullins-Sweatt and her colleagues carried out a study to uncover how successful psychopaths differ from unsuccessful ones. They asked a number of members of Division 41 (psychology and law) of the American Psychological Association, professors of clinical psychology, and criminal attorneys to first identify, and then to rate and describe, one of their acquaintances (if any) who could be counted as successful and who also conformed to psychologist Robert Hare’s definition of a psychopath:

…social predators who charm, manipulate and ruthlessly plow their way through life … Completely lacking in conscience and feeling for others, they selfishly take what they want and do as they please, violating social norms and expectations without the slightest sense of guilt or regret.

From the responses they collated, Mullins-Sweatt and her team found that successful psychopaths matched unsuccessful ones in all respects but one, namely, conscientiousness. So it seems that the key difference between unsuccessful and successful psychopaths is that the former behave impulsively and irresponsibly, whereas the latter are able to inhibit or restrain those destructive tendencies and build for the future.

Intelligence and conscientiousness are not enough to guarantee success, which also requires traits such as ambition, motivation, and people skills—traits that may be particularly pronounced when rooted in a personality disorder.

Personality disorders are generally thought to arise from a combination of genetic factors and traumatic early life experiences such as parental loss and emotional, physical, and sexual abuse. People who have suffered childhood trauma may be left with intense feelings of despair, helplessness, and worthlessness. Later in life, they may seek out achievement and success to help compensate for these feelings. For instance, they may wish to be recognized by strangers because they were not recognized by their own parents, or they may wish to have control over others because they had none when they needed it most. The drive for achievement and success combined with the character traits and resilience that arise from loss and trauma may in later life propel them to the highest echelons of society.

This is borne out by a large study that looked at almost 700 eminent personalities, and found that 45 per cent had lost a parent before the age of 21. This ‘orphanhood effect’ seems particularly marked in creative people. One study looking specifically at a sample of authors found that 55 per cent had lost a parent before the age of 15. This suggests that disturbed psychopaths and creative visionaries do indeed share many features. While the former suffer from them, the latter are (also) able to put them to good use.

Broadly speaking, anyone’s personality can be said to lead to distress and impairment. For instance, a gregarious student is unable to isolate himself in the library and ends up failing his exams. A zealous company director loses his temper and regrets the damage that he has done to himself, others, and his company. An upstanding whistleblower ends up losing his job.

Everyone suffers for who he is, and, very often, our greatest strength is also the germ of our deepest suffering. While it is impossible to avoid such suffering, it is at least possible to value it for the personal growth that it can bring.

Like many blind figures in classical mythology, the prophet Teiresias could ‘see’ into himself. This self-knowledge enabled him not only to understand himself, but also to understand others and so to ‘see into the future’. Similarly, our suffering prompts us to look into ourselves. The self-knowledge this brings enables us not only to better regulate ourselves, but also to better appreciate others, the world, and our place within it. Thus, our suffering transforms our lives into a journey, a journey without an end, perhaps, but one that can also be seen as an end-in-itself. It is in this way that our suffering, or ‘impairment’, can bring deep meaning to our lives.

Adapted from the new edition of The Meaning of Madness.


Board BJ and Fritzon KF (2005): Disordered personalities at work. Psychology, Crime and Law 11:17-23.

Hare RD (1998): Without Conscience: The Disturbing World of the Psychopaths Among Us, opening lines. Guilford Press.

James W (1902): The Varieties of Religious Experience, Lecture 1, 'Religion and Neurology', Footnote 6.

Mullins-Sweatt SN et al. (2010): The search for the successful psychopath. Journal of Research in Personality 44:554-558.

Psychiatry in Crisis

Is the medical model still helping?

In the UK, mental ill health is recognized as the single largest cause of disability, contributing almost 23 per cent of the disease burden and costing over £100 billion ($157 billion) a year in services, lost productivity, and reduced quality of life. Every year in the EU, about 27 per cent of adults are affected by a mental disorder of some kind. In the US, almost one in two people will meet the criteria for a mental disorder in the course of their lifetime. Data from the US National Health Interview Survey indicate that, in 2012, 13.5 per cent of boys aged 3-17 had been diagnosed with attention deficit hyperactivity disorder (ADHD), up from just 8.3 per cent in 1997.

There is no denying that a lot of people are suffering. But are they all really suffering from a mental disorder, that is, a medical illness, a biological disorder of the brain? And if not, are doctors, diagnoses, and drugs necessarily the best response to their problems?

Since 1952, the number of diagnosable mental disorders has burgeoned from 106 to over 300, and now includes such constructs as ‘gambling disorder’, ‘minor neurocognitive disorder’, ‘disruptive mood dysregulation disorder’, ‘premenstrual dysphoric disorder’, and ‘binge-eating disorder’.

According to a recent report, antidepressant prescriptions in England rose from 15 million items in 1998 to 40 million in 2012, this despite the mounting evidence for their ineffectiveness. Selective serotonin reuptake inhibitors (SSRIs) in particular have become something of a panacea, used not only to treat depression, but also to treat anxiety disorders, obsessive-compulsive disorder, and bulimia nervosa, and even some physical disorders such as premature ejaculation in young men and hot flushes in menopausal women. In the UK, the SSRI fluoxetine is so commonly prescribed that trace quantities have been detected in the water supply.

But despite all this apparent progress in diagnosis and treatment, people who meet the diagnostic criteria for such a paradigmatic mental disorder as schizophrenia tend to fare better in resource-poor countries, where human distress can take on very different forms and interpretations from those outlined in our scientific classifications.

Psychiatry is in a crisis precipitated by its own success: the medical or biological model, assuming it ever helped, is no longer helping. The specialty of the heart is cardiology, the specialty of the digestive tract is gastroenterology, and the specialties of the brain are neurology and psychiatry. But neurology is not psychiatry, which literally means 'healing of the soul'.

Some mental disorders undeniably have a strong biological basis, but even these have many more aspects and dimensions than ‘mere’ physical disorders.

It is high time to fundamentally rethink our approach to mental disorders and mental ‘dis-ease’.

The Second Edition of The Meaning of Madness, due out in September, is available for pre-order.


Laziness Vs Procrastination Vs Idleness

We are being lazy if we are able to carry out some activity that we ought to carry out, but are disinclined to do so on account of the effort involved. Instead, we remain idle, carry out the activity perfunctorily, or engage in some other less strenuous or boring activity. In short, we are being lazy if our motivation to spare ourselves effort trumps our motivation to do the right or best or expected thing—assuming, of course, that we know, or think that we know, what that is.

Synonyms for laziness include indolence and sloth. Indolence derives from the Latin indolentia, ‘without pain’ or ‘without taking trouble’. Sloth has more moral and spiritual overtones than either laziness or indolence. In the Christian tradition, sloth is one of the seven deadly sins (the other six being lust, gluttony, greed, wrath, envy, and pride) because it undermines society and God’s plan and invites all manner of sin. The Bible inveighs against slothfulness, notably in the Book of Ecclesiastes: ‘By much slothfulness the building decayeth; and through idleness of the hands the house droppeth through. A feast is made for laughter, and wine maketh merry: but money answereth all things.’

Laziness should not be confused with either procrastination or idleness. To procrastinate—from the Latin cras, ‘tomorrow’—is to postpone one task in favour of another or others which are perceived as being easier or more pleasurable but which are typically less important or urgent. To postpone a task for constructive or strategic purposes does not amount to procrastination. For a postponement to amount to procrastination, it has to represent poor or ineffective planning and result in a higher overall cost to the procrastinator, for example, in the form of stress, guilt, lost productivity, or lost opportunities. It is one thing to delay a tax return until all the numbers are in, but quite another to delay it so that it upsets our holiday plans and lands us with a fine. Both the lazybones and the procrastinator lack motivation, but unlike the lazybones the procrastinator aspires and intends to complete the task under consideration, and, moreover, eventually does complete it, albeit at a higher cost to himself.

To be idle is not to be doing anything. Idleness is often romanticized, as epitomized by the Italian expression dolce far niente ('it is sweet to do nothing'). Many people tell themselves that they work hard from a desire for idleness. But although our natural instinct is for idleness, most of us find prolonged idleness difficult to bear. Queuing for half an hour in a traffic jam can leave us feeling bored, restless, and irritable, and many motorists prefer to make a detour even if the alternative route is likely to take longer than sitting through the traffic. Recent research suggests that people will find the flimsiest excuse to keep busy, and that they feel happier for keeping busy even when their busyness is imposed upon them. In their research paper (Hsee CK et al. (2010), Idleness aversion and the need for justifiable busyness. Psychological Science 21(7): 926–930), Christopher Hsee and his colleagues surmise that many of our purported goals may be little more than justifications for keeping busy.

We could be idle because we have nothing to do—or rather, because we lack the imagination to think of something to do. If we do evidently have something to do, we could be idle because we are lazy, but also because we are unable to do that thing, or because we have already done it and are resting and recuperating. Lastly, we could be idle because we value idleness or its products above whatever it is we have to do, which is not the same thing as being lazy. Lord Melbourne, Queen Victoria's favourite prime minister, extolled the virtues of 'masterful inactivity'. As chairman and CEO of General Electric, Jack Welch spent an hour each day in what he called 'looking out of the window time'. Adepts of such strategic idleness use their 'idle' moments, among other things, to gather inspiration, develop and maintain perspective, sidestep nonsense and pettiness, reduce inefficiency and half-living, and conserve health and stamina for truly important tasks and problems. 'To do nothing at all,' said Oscar Wilde, 'is the most difficult thing in the world, the most difficult and the most intellectual.'

Adapted from Heaven and Hell: The Psychology of the Emotions.


