What Is Love?

[Image: The Sacrifice of Isaac, by Caravaggio (Uffizi)]

Love is a word with a meaning that has changed over time.

Today, we tend to think about love primarily in terms of romantic love.

But, if you consider it, the concept of romantic love barely features among the 66 books of the Bible. The two greatest “love” stories in the Bible are not of husband and wife, nor even of man and woman, but of man and man, and woman and woman: David and Jonathan, and Ruth and Naomi.

Instead, all love in the Bible is directed at God, and the love for the spouse, and more generally for the other, is subsumed under the love of God.

In the Sacrifice of Isaac (pictured), Abraham’s love for God trumps his love for his own son Isaac, whom he is willing to sacrifice for no other reason than that God commanded it.

In ancient and medieval times, people did of course fall in love, but they did not believe that their love might in some sense save them, as we tend to today. When, in Homer’s Iliad, Helen eloped with Paris, setting off the Trojan War, neither she nor he conceived of their attraction as pure or noble or exalting.

Over the centuries, the sacred seeped out of God and into romantic love, which came to take the place of the waning religion in lending purpose to our lives. People had once loved God, but now they loved love: more than with their beloved, they fell in love with love itself.

Abraham had surrendered himself and Isaac out of love for God. But in the Romantic era, around the time of the American and French Revolutions, love became quite the opposite: a means of finding and validating oneself, of lending weight and texture and solidity to one’s life—as encapsulated by Sylvester’s 1978 hit, You Make Me Feel (Mighty Real), the final kissing scene in Cinema Paradiso, and countless other popular songs and films.

In the time of God, “finding oneself”—or, more accurately, losing oneself in God—had demanded years of patient spiritual practice. But after the French Revolution, romantic love could come to the rescue of almost anyone, with very little effort or sacrifice on their part. Being saved became simply a matter of luck.

If love is a word with a meaning that has changed over time, it is also a word with several meanings, one that points at several quite distinct concepts with only a family resemblance between them.

Unlike us, the ancient Greeks had several words for love, enabling them to distinguish more clearly between the different types. Eros, for example, referred to sexual or passionate love; philia to friendship; storge to familial love; and agape to universal love, such as the love for strangers, nature, or God.

As I show in my new book, The Secret to Everything, having more words for “love” enables us to think and talk about love in new and different ways. For instance, people in the early stages of a romantic relationship often expect unconditional storge, but find only the need and dependency of eros, and, if they are lucky, the maturity and fertility of philia. Given enough time, eros tends to mutate into storge.

But if we are to understand the deep meaning of the word “love”, then we need to uncover what all these different types of love share in common. In other words, what is it that unites eros, philia, storge, and agape?

What all these instances of love have in common, I think, is a reaching out beyond our own being to things that are able to lend weight and meaning to our lives, and, at the same time, an incorporation of those things into our being—whence the hug, the love bite, and the sacramental bread and wine of the Eucharist.

Love is the force of nature that enables us to cross the boundary between ourselves and the world, to shed our shell, like the lobster, and grow beyond it—which is why people with little love end up being so small.

Positive Illusions Versus Depressive Realism

The Norwegian philosopher Peter Wessel Zapffe argued, essentially, that the human capacities for reason and self-awareness break with nature, giving us more than we, as a part of nature, can carry. So as not to go mad, ‘most people learn to save themselves by artificially limiting the content of consciousness.’

People not only limit the content of consciousness, but also fill it with less than the truth. In particular, most people think more highly of themselves than is warranted: they have an inflated sense of their qualities and abilities, an illusion of control over things that are mostly beyond them, and a misplaced optimism about their outcomes and prospects.

For example, most people claim to compare favourably to the average road user, citizen, parent… which, of course, cannot all be true, since not everyone can be above average. A couple on the verge of tying the knot is likely to overestimate the odds of having a carefree honeymoon or a gifted child, while underestimating the odds of having a miscarriage, falling ill, or getting divorced.

The concept of positive illusions first appeared in 1988, in a paper by Shelley Taylor and Jonathon Brown entitled Illusion and well-being: A social psychological perspective on mental health. Still today, it is commonly believed that mental health corresponds to accurate perceptions of the self, the other, and the world, but in their paper Taylor and Brown argued that the evidence suggests otherwise, and that positive illusions are characteristic of normal human thought.

Positive illusions are helpful in so far as they enable us to take risks, invest in the future, and fend off despair and depression. After all, how many people would get married if they had any real sense of what awaited them? But in the longer term, the poor perspective and judgement that come from undue self-regard and false hope are likely to lead to disappointment and failure, to say nothing of the inhibitions and emotional disturbances (such as anger, anxiety, and so on) that can derive or descend from a defended position.

Positive illusions tend to be more common, and more marked, in the West. In East Asian cultures, for example, people are less vested in themselves and more vested in their community and society, and tend, if anything, to self-effacement rather than self-enhancement.

Positive illusions are also more prevalent in unskilled people, possibly because highly skilled people tend to assume, albeit falsely, that those around them enjoy similar levels of insight and competence. This Dunning-Kruger effect, as it has been called, is neatly encapsulated in a short fragment from the introduction to Darwin’s Descent of Man: ‘…ignorance more frequently begets confidence than does knowledge…’ And, of course, it may also be that, compared to highly skilled people, unskilled people are more reliant on positive illusions for their self-esteem and broader mental health.

Depressive Realism

Just as it is commonly believed that mental health corresponds to accurate perceptions of the self, the other, and the world, so it is commonly believed that depression results in, or results from, distorted thinking.

‘Cognitive distortion’ is a concept from cognitive behavioural therapy (CBT), developed by the psychiatrist Aaron Beck in the 1960s and routinely used in the treatment of depression and other mental disorders. Cognitive distortion involves interpreting events and situations so that they conform to and reinforce our outlook or frame of mind, typically on the basis of very scant or partial evidence, or even no evidence at all.

Common cognitive distortions in depression include selective abstraction, personalization, and catastrophic thinking:

  • Selective abstraction is to focus on a single negative event or condition to the exclusion of other, more positive ones, for example, ‘My partner didn’t call me yesterday. He must hate me.’
  • Personalization is to relate independent events to oneself, for example, ‘The nurse is leaving her job because she’s fed up with me…’
  • Catastrophic thinking is to exaggerate the negative consequences of an event or situation, for example: ‘The pain in my knee is only going to get worse. When I’m reduced to a wheelchair, I won’t be able to go to work and pay the mortgage. So I’ll end up losing my house and dying in the street.’

However, the scientific literature suggests that, despite their propensity for such cognitive distortions, many people with depressed mood can also have more accurate judgement about the outcome of so-called contingent events (events that may or may not occur) and a more realistic perception of their role, abilities, and limitations—a phenomenon that is sometimes, and controversially, referred to as ‘depressive realism’.

The concept of depressive realism originated in 1979, in a paper entitled Judgment of contingency in depressed and nondepressed students: sadder but wiser? On the basis of their findings, the authors, Lauren Alloy and Lyn Abramson, argued that people with depression make more realistic inferences than ‘normal’ people, who are handicapped by their positive illusions. On the face of it, this suggests that people with depression are able to see the world more clearly for what it is, while normal people are only normal in so far as they are deceiving or deluding themselves.

This is a seductive proposition for someone like me, who has long been arguing that depression can be good for us—for example, in my book, The Meaning of Madness. But here’s the rub: people with depression are pessimistic even in situations in which pessimism is unwarranted, suggesting that, rather than being more realistic, their thinking is merely ‘differently biased’, and just as rigid and distorted as that of normal people with their positive illusions.

Wisdom, it seems, consists in being able to shed our positive illusions without also succumbing to depression, although, for many, depression may be a necessary step along the way.