Aha, Uh-oh and Doh: The Psychology of Insight

And how to improve cognitive flexibility.

‘Insight’ is sometimes used to mean something like ‘self-awareness’, including awareness of our thought processes, beliefs, desires, emotions, and so on, and of how they might relate to truth or usefulness. Of course, self-awareness comes by degrees. Owing to chemical receptors in their tendrils, vining plants know not to coil around themselves, and to that extent can be said to have an awareness of self and not-self. Children begin to develop reflective self-awareness at around 18 months of age, enabling them to recognize themselves in pictures and mirrors.

But ‘insight’ is also used to mean something like ‘penetrating discernment’, especially in cases where a solution to a previously intractable problem suddenly presents itself—and it is on this particular meaning of the word that I now want to focus.

Such ‘aha moments’, epitomized by Archimedes’ cry of Eureka! Eureka! (Gr., ‘I found it! I found it!’), involve seeing something familiar in a new light or context, particularly a brighter or broader one, leading to a novel perspective and to positive emotions such as joy, enthusiasm, and confidence. It is said that, after stepping into his bath, Archimedes noticed the water level rising, and suddenly understood that the volume of water displaced corresponded to the volume of the part of his body that had been submerged. Lesser examples of aha moments include suddenly understanding a joke, or suddenly perceiving the other aspect of a reversible figure such as the duck/rabbit illusion. Aha moments result primarily from unconscious and automatic processes, and, when working on insight problems, we tend to look away from sources of visual stimulation.

Aha moments ought to be distinguished from uh-oh moments, in which we suddenly become aware of an unforeseen problem, and from d’oh moments, popularized by Homer Simpson, in which an unforeseen problem hits us and/or we have a flash of insight into our lack of insight.

‘Thinking out of the box’ is a significant cognitive achievement. Once we have understood something in one way, it is very difficult to see it in any other way, even in the face of strong contradictory evidence. In When Prophecy Fails (1956), Leon Festinger discussed his experience of infiltrating a UFO doomsday cult whose leader had prophesied the end of the world. When the end of the world predictably failed to materialize, most of the cult members dealt with the dissonance that arose from the cognitions ‘the leader said the world would end’ and ‘the world did not end’ not by abandoning the cult or its leader, as you might expect, but by introducing the rationalization that the world had been saved by the strength of their faith!

Very often, to see something in a different light also means to see ourselves and the whole world in that new light, which can threaten and undermine our sense of self. It is more a matter of the emotions than of reason, which explains why even leading scientists can struggle with perceptual shifts. According to the physicist Max Planck, “a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” Or to put it more pithily, science advances one funeral at a time.

Even worse, strong contradictory evidence, or attempts to convince us otherwise, can, in fact, be counterproductive and entrench our existing beliefs—which is why as a psychiatrist I rarely challenge my patients or indeed anyone directly. You don’t have to take my word for it: in one recent study, supplying ‘corrective information’ to people with serious concerns about the adverse effects of the flu jab actually made them less willing to receive it.

So, short of dissolving our egos like a zen master, what can we do to improve our cognitive flexibility? Of course, it helps to have the tools of thought, including language fluency and multiple frames of reference as given by knowledge and experience. But much more important is to develop that first sense of ‘insight’, namely, insight as self-awareness.

On a more day-to-day basis, we need to create the time and conditions for allowing new connections to form. My own associative thinking is much more active when I’m both well-rested and at rest, for example, standing under the shower or ambling in the park. As Chairman and CEO of General Electric, Jack Welch spent an hour each day in what he called ‘looking out of the window time’. August Kekulé claimed to have discovered the ring structure of the benzene molecule while daydreaming about a snake biting its own tail.

Time is a very strange thing, and not at all linear: sometimes, the best way of using it is to waste it.

Nyhan B & Reifler J (2015): Does correcting myths about the flu vaccine work? An experimental evaluation of the effects of corrective information. Vaccine 33(3):459-464.

The Problems of Science

An overview of the philosophy of science

What is science? To call a thing ‘scientific’ or ‘scientifically proven’ is to lend that thing instant credibility. It is sometimes said that 90% of scientists who ever lived are alive today—despite a relative lack of scientific progress, and even regress as the planet comes under increasing strain. Especially in Northern Europe, more people believe in science than in religion, and attacking science can raise the same old, atavistic defences. In a bid to emulate or at least evoke the apparent success of physics, many areas of study have claimed the mantle of science: ‘economic science’, ‘political science’, ‘social science’, and so on. Whether or not these disciplines are true, bona fide sciences is a matter for debate, since there are no clear or reliable criteria for distinguishing a science from a non-science.

What might be said is that all sciences, unlike, say, magic or myth, share certain assumptions which underpin the scientific method, in particular, that there is an objective reality governed by uniform laws and that this reality can be discovered by systematic observation. A scientific experiment is basically a repeatable procedure designed to help support or refute a particular hypothesis about the nature of reality. Typically, it seeks to isolate the element under investigation by eliminating or ‘controlling for’ other variables that may be confused or ‘confounded’ with the element under investigation. Important assumptions or expectations include that all potential confounding factors can be identified and controlled for; that any measurements are appropriate and sensitive to the element under investigation; and that the results are analysed and interpreted rationally and impartially.

Still, many things can go wrong with the experiment. With, for example, drug trials, experiments that have not been adequately randomized (when subjects are randomly allocated to test and control groups) or adequately blinded (when information about the drug being administered/received is withheld from the investigator/subject) significantly exaggerate the benefits of treatment. Investigators may consciously or subconsciously withhold or ignore data that does not meet their desires or expectations (‘cherry picking’) or stray beyond their original hypothesis to look for chance or uncontrolled correlations (‘data dredging’). A promising result, which might have been obtained by chance, is much more likely to be published than an unfavourable one (‘publication bias’), creating the false impression that most studies have been positive and therefore that the drug is much more effective than it actually is. One damning systematic review found that, compared to independently funded drug trials, drug trials funded by pharmaceutical companies are less likely to be published, while those that are published are four timesmore likely to feature positive results for the products of their sponsors!

So much for the easy, superficial problems. But there are deeper, more intractable philosophical problems as well. For most of recorded history, ‘knowledge’ was based on authority, especially that of the Bible and whitebeards such as Aristotle, Ptolemy, and Galen. But today, or so we like to think, knowledge is much more secure because grounded in observation. Leaving aside that much of what counts as scientific knowledge cannot be directly observed, and that our species-specific senses are partial and limited, there is, in the phrase of Norwood Russell Hanson, ‘more to seeing than meets the eyeball’:

Seeing is an experience. A retinal reaction is only a physical state… People, not their eyes, see. Cameras and eyeballs are blind.

Observation involves both perception and cognition, with sensory information filtered, interpreted, and even distorted by factors such as beliefs, experience, expectations, desires, and emotions. The finished product of observation is then encoded into a statement of fact consisting of linguistic symbols and concepts, each one with its own particular history, connotations, and limitations. All this means that it is impossible to test a hypothesis in isolation of all the background theories, frameworks, and assumptions from which it issues.

This is important, because science principally proceeds by induction, that is, by the observation of large and representative samples. But even if observation could be objective, observations alone, no matter how accurate and exhaustive, cannot in themselves establish the validity of a hypothesis. How do we know that ‘flamingos are pink’? Well, we don’t know for sure. We merely suppose that they are because, so far, every flamingo that we have seen or heard about has been pink. But the existence of a non-pink flamingo is not beyond the bounds of possibility. A turkey that is fed every morning might infer by induction that it will be fed every morning, until on Christmas Eve the goodly farmer picks it up and wrings its neck. Induction only ever yields probabilistic truths, and yet is the basis of everything that we know, or think that we know, about the world we live in. Our only justification for induction is that it has worked before, which is, of course, an inductive proof, tantamount to saying that induction works because induction works! For just this reason, induction has been called ‘the glory of science and the scandal of philosophy’.

It may be that science proceeds, not by induction, but by abduction or finding the most likely explanation for the observations—as, for example, when a physician is faced with a constellation of symptoms and formulates a ‘working diagnosis’ that more or less fits the clinical picture. But ultimately abduction is no more than a type of induction. Both abduction and induction are types of ‘backward reasoning’, formally equivalent to the logical fallacy of ‘affirming the consequent’:

  • If A then B. B. Therefore A.
  • “If I have the flu, then I have a fever. I have a fever. Therefore, I have the flu.”
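If it helps to see why the pattern fails, here is a tiny truth-table check (a sketch of my own, not part of the original argument): there is an assignment of truth values on which both premises hold and the conclusion is false.

    # Hypothetical truth-table check: affirming the consequent is invalid because
    # there is a row where both premises are true but the conclusion is false.
    from itertools import product

    for A, B in product([True, False], repeat=2):
        premises = (not A or B) and B          # "If A then B" and "B"
        if premises and not A:                 # conclusion "A" fails
            print(f"Counterexample: A={A}, B={B} -> premises true, conclusion false")
    # Prints one counterexample (A=False, B=True). Modus ponens
    # ("If A then B. A. Therefore B.") has no such row, which is what makes it valid.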

But, of course, I could have meningitis or malaria or any number of other conditions. How to decide between them? At medical school, we were taught that ‘common things are common’. This is a formulation of Ockham’s razor, which involves choosing the simplest available explanation. Ockham’s razor, also called the law of parsimony, is often invoked as a principle of inductive reasoning, but, of course, the simplest available explanation is not necessarily the best or correct one. What’s more, we may be unable to decide which is the simplest explanation, or even what ‘simple’ might mean in context. Some people think that God is the simplest explanation for creation, while others think Him rather far-fetched. Still, there is some wisdom in Ockham’s razor: while the simplest explanation may not be the correct one, neither should we labour, or keep on ‘fixing’, a preferred hypothesis to save it from a simpler and better explanation. I should mention in passing that the psychological equivalent of Ockham’s razor is Hanlon’s razor: never attribute to malice that which can be adequately explained by neglect, incompetence, or stupidity.

Simpler hypotheses are also preferable in that they are easier to disprove, or falsify. To rescue it from the Problem of Induction, Karl Popper argued that science proceeds not inductively but deductively, by formulating a hypothesis and then seeking to falsify it.

  • ‘All flamingos are pink.’ Oh, but look, here’s a flamingo that’s not pink. Therefore, it is not the case that all flamingos are pink.

On this account, theories such as those of Freud and Marx are not scientific in so far as they cannot be falsified. But if Popper is correct that science proceeds by falsification, science could never tell us what is, but only ever what is not. Even if we did arrive at some truth, we could never know for sure that we had arrived. Another issue with falsification is that, when the hypothesis conflicts with the data, it could be the data rather than the hypothesis that is at fault—in which case it would be a mistake to reject the hypothesis. Scientists need to be dogmatic enough to persevere with a preferred hypothesis in the face of apparent falsifications, but not so dogmatic as to cling on to their preferred hypothesis in the face of robust and repeated falsifications. It’s a delicate balance to strike.

For Thomas Kuhn, scientific hypotheses are shaped and restricted by the worldview, or paradigm, within which the scientist operates. Most scientists are blind to the paradigm and unable to see across or beyond it. If data emerges that conflicts with the paradigm, it is usually ignored or explained away. But nothing lasts forever, and eventually the paradigm weakens and is overturned. Examples of such ‘paradigm shifts’ include the transition from Aristotelian mechanics to classical mechanics, the transition from miasma theory to the germ theory of disease, and the transition from ‘clinical judgement’ to evidence-based medicine. Of course, a paradigm does not die overnight. Reason is, for the most part, a tool that we use to justify what we are already inclined to believe, and a human life cannot easily accommodate more than one paradigm. In the words of Max Planck, ‘A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.’ Or to put it more pithily, science advances one funeral at a time.

In The Structure of Scientific Revolutions, Kuhn argued that rival paradigms offer competing and irreconcilable accounts of reality, suggesting that there are no independent standards by which they might be judged against one another. Imre Lakatos sought to reconcile Popper and Kuhn, and spoke of programmes rather than paradigms. A programme is based on a hard core of theoretical assumptions accompanied by more modest auxiliary hypotheses formulated to protect the hard core against any conflicting data. While the hard core cannot be abandoned without jeopardising the programme, auxiliary hypotheses can be adapted to protect it against evolving threats, rendering the hard core unfalsifiable. A progressive programme is one in which changes to auxiliary hypotheses lead to greater predictive power, whereas a degenerative programme is one in which these ad hoc elaborations become sterile and cumbersome. A degenerative programme, says Lakatos, is one which is ripe for replacement. Though very successful in its time, classical mechanics, with Newton’s three laws of motion at its core, was gradually superseded by the special theory of relativity.

For Paul Feyerabend, Lakatos’s theory makes a mockery of any pretence at scientific rationality. Feyerabend went so far as to call Lakatos a ‘fellow anarchist’, albeit one in disguise. For Feyerabend, there is no such thing as a or the scientific method: anything goes, and as a form of knowledge science is no more privileged than magic, myth, or religion. More than that, science has come to occupy the same place in the human psyche as religion once did. Although science began as a liberating movement, it grew dogmatic and repressive, more of an ideology than a rational method that leads to ineluctable progress. In the words of Feyerabend:

Knowledge is not a series of self-consistent theories that converges toward an ideal view; it is rather an ever increasing ocean of mutually incompatible (and perhaps even incommensurable) alternatives, each single theory, each fairy tale, each myth that is part of the collection forcing the others into greater articulation and all of them contributing, via this process of competition, to the development of our consciousness.

‘My life’, wrote Feyerabend, ‘has been the result of accidents, not of goals and principles. My intellectual work forms only an insignificant part of it. Love and personal understanding are much more important. Leading intellectuals with their zeal for objectivity kill these personal elements. They are criminals, not the leaders of mankind.’

Every paradigm that has come and gone is now deemed to have been false, inaccurate, or incomplete, and it would be ignorant or arrogant to assume that our current ones might amount to the truth, the whole truth, and nothing but the truth. If our aim in doing science is to make predictions, enable effective technology, and in general promote successful outcomes, then this may not matter all that much, and we continue to use outdated or discredited theories such as Newton’s laws of motion so long as we find them useful. But it would help if we could be more realistic about science, and, at the same time, more rigorous, critical, and imaginative in conducting it.

Originally published on Psychology Today

The Limits of Reason

For Aristotle, our unique capacity to reason is what defines us as human beings. Therefore, our happiness, or our flourishing, consists in leading a life that enables us to use and develop our reason, and that is in accordance with reason.

Article 1 of the Universal Declaration of Human Rights (1948) states that all human beings are ‘endowed with reason’, and it has long been held that reason is something that God gave us, that we share with God, and that is the divine, immortal element in us.

At the dawn of the Age of Reason, Descartes doubted everything except his ability to reason. ‘Because reason’, he wrote, ‘is the only thing that makes us men, and distinguishes us from the beasts, I would prefer to believe that it exists, in its entirety, in each of us…’

But what is reason? Reason is more than mere associative thinking, more than the mere ability to move from one idea (such as storm clouds) to another (such as imminent rain). Associative thinking can result from processes other than reason, such as instinct, learning, or intuition. Reason, in contrast, involves providing reasons—ideally good reasons—for an association. It involves using a system of representation such as thought or language to derive or arrive at an association.

Reason is often conflated with logic, also known as formal logic or deductive reasoning. At the very least, logic is seen as the purest form of reason. Yes, logic is basically an attempt to codify the most reliable or fail-safe forms of reasoning. But logic, or at any rate modern logic, is concerned merely with the validity of arguments, with the right relationship between premises and conclusion. It is not concerned with the actual truth or falsity of the premises or the applicability of the conclusion. Reason, in contrast, is a much broader psychological activity which also involves assessing evidence, creating and testing hypotheses, weighing competing arguments, evaluating means and ends, developing and applying heuristics (mental shortcuts), and so on. All this requires the use of judgement, which is why reason, unlike logic, cannot be delegated to a computer, and also why it so often fails to persuade. Logic is but a tool of reason, and, in fact, it can be reasonable to accept something that is or appears to be illogical.

It is often thought, not least in educational establishments, that ‘logic’ is able to provide immediate certainty and the authority or credibility that goes with it. But logic is a lot more limited than many people imagine. Logic essentially consists in a set of operations for deriving a truth from other truths. In a sense, it merely makes explicit that which was previously implicit. It brings nothing new to the table. The conclusion merely flows from the premises as their inevitable consequence, for example:

  1. All birds have feathers. (Premise 1)
  2. Woodpeckers are birds. (Premise 2)
  3. Therefore, woodpeckers have feathers. (Conclusion)

Another issue with logic is that it relies on premises that are founded, not on logic itself, but on inductive reasoning. How do we know that ‘all birds have feathers’? Well, we don’t know for sure. We merely suppose that they do because, so far, every bird that we have seen or heard about has had feathers. But the existence of birds without feathers, if only in the fossil record, is not beyond the bounds of possibility. Many avian species are hatched naked, and a featherless bird called Rhea recently took the Internet by storm.

Inductive reasoning only ever yields probabilistic ‘truths’, and yet it is the basis of everything that we know or think that we know about the world we live in. Our only justification for induction is that it has worked in the past, which is, of course, an inductive proof, tantamount to saying that induction works because induction works! To rescue it from this Problem of Induction, Karl Popper argued that science proceeds not inductively but deductively, by making bold claims and then seeking to falsify those claims. But if Popper is right, science could never tell us what is, but only ever what is not. Even if we did arrive at some truth, we could never know for sure that we had arrived. And while our current paradigms may represent some improvement on the ones that went before, it would be either ignorant or arrogant to presume that they amounted to the truth, the whole truth, and nothing but the truth.

Putting these inductive/deductive worries aside, reason is limited in reach, if not in theory then at least in practice. The movement of a simple pendulum is regular and easy to predict, but the movement of a double pendulum (a pendulum with another pendulum attached to its end) is, as can be seen on YouTube, extremely chaotic. Similarly, the interaction between two physical bodies such as the sun and the earth can be reduced to a simple formula, but the interaction between three physical bodies is much more complex—which is why the length of the lunar month is not a constant. But even this so-called Three-Body Problem is as nothing compared to the entanglement of human affairs. God, it is sometimes said, gave all the easy problems to the physicists.
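For a feel of why such systems defy prediction, here is a minimal sketch (my own illustration, using the logistic map as a much simpler stand-in for the double pendulum): two trajectories that begin almost identically soon bear no resemblance to one another.

    # Hypothetical illustration of sensitive dependence on initial conditions,
    # using the chaotic logistic map x -> r*x*(1-x) with r = 4.
    def logistic_trajectory(x0, r=4.0, steps=30):
        xs = [x0]
        for _ in range(steps):
            xs.append(r * xs[-1] * (1.0 - xs[-1]))
        return xs

    a = logistic_trajectory(0.200000)
    b = logistic_trajectory(0.200001)   # differs by one part in a million

    for n in (0, 10, 20, 30):
        print(f"step {n:2d}: {a[n]:.6f} vs {b[n]:.6f}  (gap {abs(a[n] - b[n]):.6f})")
    # Within a few dozen steps the gap is of order one: long-range prediction fails,
    # even though each individual step is perfectly deterministic.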

The intricacies of human affairs often lead to a paralysis of reason, and we are left undecided, sometimes for years or even into the grave. To cut through all this complexity, we rely heavily on forces such as emotions and desires—which is why Aristotle’s Rhetoric on the art of arguing includes a detailed dissection of what used to be called the passions. Our emotions and desires define the aims or goals of our reasoning. They determine the parameters of any particular deliberation and carry to conscious attention only a small selection of all the available facts and alternatives. Brain-injured people with a diminished capacity for emotion find it especially hard to make decisions, as do people with apathy, which is a symptom of severe depression and other mental disorders. Relying so heavily on the emotions comes at a cost, which is, of course, that emotions aren’t rational and can distort reasoning. Fear alone can open the gate to all manner of self-deception. On the other hand, that emotions aren’t rational need not make them irrational. Some emotions are appropriate or justified, while others are not. This is why, as well as coming to grips with science, it is so important to educate our emotions.

Another shortcoming of reason is that it sometimes leads to unreasonable conclusions, or even contradicts itself. In On Generation and Corruption, Aristotle says that, while the opinions of certain thinkers appear to follow logically in dialectical discussion, ‘to believe them seems next door to madness when one considers the facts’. In Plato’s Lesser Hippias, Socrates manages to argue that people who commit injustice voluntarily are better than those who do it involuntarily, but then confesses that he sometimes thinks the opposite, and sometimes goes back and forth:

My present state of mind is due to our previous argument, which inclines me to believe that in general those who do wrong involuntarily are worse than those who do wrong voluntarily, and therefore I hope that you will be good to me, and not refuse to heal me; for you will do me a much greater benefit if you cure my soul of ignorance, than you would if you were to cure my body of disease.

The sophists of Classical Greece taught rhetoric to wealthy young men with ambitions of holding public office. Prominent sophists included Protagoras, Gorgias, Prodicus, Hippias, Thrasymachus, Callicles, and Euthydemus, all of whom feature as characters in Plato’s dialogues. Protagoras charged extortionate fees for his services. He once took on a pupil, Euathlus, on the understanding that he would be paid once Euathlus had won his first court case. However, Euathlus never won a case, and eventually Protagoras sued him for non-payment. Protagoras argued that if he won the case he would be paid, and if Euathlus won the case, he still would be paid, because Euathlus would have won a case. Euathlus retorted that if he won the case he would not have to pay, and if Protagoras won the case, he still would not have to pay, because he still would not have won a case!

Whereas philosophers such as Plato use reason to arrive at the truth, sophists such as Protagoras abuse reason to move mobs and enrich themselves. But we are, after all, social animals, and reason evolved more as a means of solving practical problems and influencing people than as a ladder to abstract truths. What’s more, reason is not a solitary but a collective enterprise: premises are at least partially reliant on the achievements of others, and we ourselves make much better progress when prompted and challenged by our peers. The principal theme of Plato’s Protagoras is the teachability of virtue. At the end of the dialogue, Socrates remarks that he began by arguing that virtue cannot be taught, but ended by arguing that virtue is no other than knowledge, and therefore that it can be taught. In contrast, Protagoras began by arguing that virtue can be taught, but ended by arguing that some forms of virtue are not knowledge, and therefore that they cannot be taught! Had they not debated, both men would have stuck with their original, crude opinions and been no better off.

Why does reason say ridiculous things and contradict itself? Perhaps the biggest problem is with language. Words and sentences can be vague or ambiguous. If you remove a single grain from a heap of sand, it is still a heap of sand. But what happens if you keep on repeating the process? Is a single remaining grain still a heap? If not, at what point did the heap go from being a heap to a non-heap? When the wine critic Jancis Robinson asked on Twitter what qualifies someone to call themselves a sommelier, she received at least a dozen different responses. Another big problem is with the way we are. Our senses are crude and limited. More subtly, our minds come with built-in notions that may have served our species well but do not accurately or even approximately reflect reality. Zeno’s paradoxes, for example, flush out the limits of our understanding of something as rudimentary as movement. Some of Zeno’s paradoxes side with quantum theory in suggesting that space and time are discrete, while others side with the theory of relativity in suggesting that they are continuous. As far as I know (I am not a physicist), quantum theory and the theory of relativity remain unreconciled. Other concepts, such as infinity or what lies outside the universe, are simply beyond our ability to conceive. A final sticking point is with self-referential statements, such as “This statement is false.” If the statement is false, then it is true; but if it is true, then it is false.

In concluding, I want to make it very clear that I hold reason in the highest regard. It is, after all, the foundation of our peace and freedom, which are under constant threat from the forces of unreason. In highlighting its limits, I seek not to disparage or undermine it but to understand and use it better, even to exalt it.

‘The last function of reason’, said Blaise Pascal, ‘is to recognize that there is an infinity of things which are beyond it. It is but feeble if it does not see so far as to know this.’

Hide & Seek Out Today!


The new edition of Hide and Seek is now available on Kindle!

Above all, don’t lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others. And having no respect he ceases to love…  —Fyodor Dostoevsky

Self-deception is common and universal, and the cause of most human tragedies. Of course, the science of self-deception can help us to live better and get more out of life. But it can also cast a murky light on human nature and the human condition, for example, on such exclusively human phenomena as anger, depression, fear, pity, pride, dream making, love making, and god making, not to forget age-old philosophical problems such as selfhood, virtue, happiness, and the good life. Nothing, in the end, could possibly be more important.


The Psychology and Philosophy of Anger

Anger is a common and potentially destructive emotion that turns many a human life into a living hell. It’s hard to imagine a truly wise person like the Dalai Lama ever losing his temper. By meditating carefully on the nature of anger, we can learn to control it and maybe even banish it entirely from our lives.

The philosopher Aristotle discusses anger at great length. In the Nicomachean Ethics, he says that a good-tempered person can sometimes get angry, but only as he ought to. Such a person, he continues, might get angry too soon or not enough, yet still be praised for being good-tempered. It is only if he deviates more markedly from the mean with respect to anger that he becomes blameworthy, either ‘irascible’ at one extreme or ‘lacking in spirit’ at the other.

For in everything it is no easy task to find the middle … anyone can get angry—that is easy—or give or spend money; but to do this to the right person, to the right extent, at the right time, with the right motive, and in the right way, that is not for everyone, nor is it easy; wherefore goodness is both rare and laudable and noble.

In the Rhetoric, Aristotle defines anger as an impulse, accompanied by pain, to a conspicuous revenge for a conspicuous slight that has been directed either at the person himself or at his friends. He adds that the pain of anger can be accompanied by pleasure arising from the expectation of revenge. I’m not so sure. Even if anger does contain a part of pleasure, this is a very thin kind of pleasure, akin to whatever ‘pleasure’ I might derive from saying “if you ruin my day, I’ll ruin yours” or “look how big I think I am”.

A person, says Aristotle, can be slighted out of one of three things: contempt, spite, or insolence. In each case, the slight betrays the offender’s feeling that the slighted person is obviously of no importance. The slighted person may or may not get angry but is more likely to do so if he is in distress—for example, in poverty or in love—or if he feels insecure about the subject of the slight or about himself in general.

On the other hand, he is less likely to get angry if the slight is involuntary, unintentional, or itself provoked by anger, or if the offender apologises or humbles himself before him and behaves like his inferior. Even dogs, says Aristotle, do not bite sitting people. The slighted person is also less likely to get angry if the offender has done him more kindnesses than he has returned, or obviously respects him, or is feared or admired by him.

Once provoked, anger can be quelled by the feeling that the slight is deserved, by the passage of time, by the exaction of revenge, by the suffering of the offender, or by being redirected onto a third person. Thus, although angrier at Ergophilus than at Callisthenes, the people acquitted Ergophilus because they had already condemned Callisthenes to death. Writing two thousand years before the birth of psychoanalysis, Aristotle seems to have put his finger on the ego defence of displacement, with the people’s anger for Ergophilus ‘displaced’ onto Callisthenes.

There is a clear sense in which Aristotle is correct in speaking of such a thing as right or proper anger. Anger can serve a number of useful, even vital, functions. It can put an end to a bodily, emotional, or social threat, or, failing that, it can mobilize mental and physical resources for defensive or restitutive action. If judiciously exercised, it can enable a person to signal high social status, compete for rank and position, ensure that contracts and promises are fulfilled, and even inspire positive feelings such as respect and sympathy. A person who is able to exercise anger judiciously is likely to feel better about himself, more in control, more optimistic, and more prone to the sort of risk taking that promotes successful outcomes.

On the other hand, anger, and especially unconstrained anger, can lead to poor perspective and judgement, impulsive and destructive behaviour, and loss of standing and goodwill. So, it appears that the sort of anger that is justified, strategic, and adaptive ought to be distinguished from a second type of anger (let us call it ‘rage’) that is uncalled for, unprocessed, irrational, indiscriminate, and uncontrolled. The function of rage is simply to protect a threatened ego, replacing or masking one kind of pain with another.

But even right or proportionate anger is unhelpful in so far as it is anger, which is both painful and harmful, and harmful because it involves a loss of perspective and judgement. Here’s an example. Anger, and especially rage, strengthens correspondence bias, that is, the tendency to attribute observed behaviours to dispositional (or personality-related) factors rather than situational factors. For instance, if I forget to do the dishes, I am under the impression that this is because I have been busy and suddenly felt very tired (situational factors); but if Emma forgets to do the dishes, I am under the impression that this is because she is lazy or irresponsible or maybe even vindictive (dispositional factors).

More fundamentally, anger reinforces the illusion that people exercise a high degree of free will, whereas in actual fact most of a person’s actions and the brain activity that they correspond to are determined by past events and the cumulative effects of those past events on that person’s patterns of thinking and behaving. Emma is Emma because she is Emma, and, at least in the short-term, there is precious little that she can do about that. It follows that the only person who can truly deserve our anger is the one who acted freely, that is, the one who spited us freely and therefore probably rightly! Anger is a vicious circle: it arises from poor perspective and makes it much poorer still.

This does not mean that anger is never justified, as a display of anger—even if undeserved—can still serve a benevolent strategic purpose, as when we pretend to get angry at a child for the benefit of shaping his or her character. But if all that is ever required is a calculated display of anger, then true anger that involves real pain is entirely superfluous, its presence serving merely to betray… a certain lack of understanding.

The world is as it is and always has been: raging against it is hardly going to make anything better. And it is by truly and permanently understanding this that we can banish real, painful, and destructive anger from our lives. But this, of course, assumes that we can accept the world for what it is.

What Is Intelligence?

There is no agreed definition or model of intelligence. By the Collins English Dictionary, it is ‘the ability to think, reason, and understand instead of doing things automatically or by instinct’. By the Macmillan Dictionary, it is ‘the ability to understand and think about things, and to gain and use knowledge’.

In seeking to define intelligence, a good place to start might be with dementia. In Alzheimer’s disease, the most common form of dementia, there is disturbance of multiple higher cortical functions including memory, thinking, orientation, comprehension, calculation, learning capacity, language, and judgement. I think it significant that people with dementia or severe learning difficulties cope very poorly with changes in their environment, such as moving into a care home or even into an adjacent room. Taken together, this suggests that intelligence refers to the functioning of a number of related faculties and abilities that enable us to respond to environmental pressures to avoid danger and distress. Because this is not beyond animals and even plants, they too can be said to be possessed of intelligence.

We Westerners tend to think of intelligence primarily in terms of analytical skills. But in a close-knit hunter-gatherer society, intelligence might be defined more in terms of foraging skills, or social skills or responsibilities. Even within a single society, the skills that are most valued change over time. In the West, the emphasis has gradually shifted from language skills to analytical skills, and it was only in 1960, well within living memory, that the Universities of Oxford and Cambridge dropped Latin as an entry requirement. In 1990, Peter Salovey and John D. Mayer published the seminal paper on emotional intelligence, and E.I. soon became all the rage. In that same year, 1990, Tim Berners-Lee wrote the first web browser. Today, we cannot go very far without having some considerable I.T. skills (certainly by the standards of 1990), and computer scientists are among the most highly paid professionals. All this is to say that what constitutes intelligence varies according to the needs and values of our culture and society.

Our society holds analytical skills in such high regard that some of our leaders repeatedly mention their ‘high I.Q.’ to lend themselves credibility. This Western emphasis on reason and intelligence has its roots in Ancient Greece with Socrates, his pupil Plato, and Plato’s pupil Aristotle. Socrates held that ‘the unexamined life is not worth living’. He typically proceeded by questioning one or more people about a certain concept such as courage or justice, eventually exposing a contradiction in their initial assumptions and provoking a reappraisal of the concept. For Plato, reason could carry us far beyond the confines of common sense and everyday experience into a ‘hyper-heaven’ of ideal forms. He famously fantasized about putting a geniocracy of philosopher-kings in charge of his utopian Republic. Finally, Aristotle argued that our distinctive function as human beings is our unique capacity to reason, and therefore that our supreme good and happiness consists in leading a life of rational contemplation. To paraphrase Aristotle in Book X of the Nicomachean Ethics, ‘man more than anything is reason, and the life of reason is the most self-sufficient, the most pleasant, the happiest, the best, and the most divine of all.’ In later centuries, reason became a divine property, found in man because made in God’s image. If you struggled with your SATs, or thought they were pants, you now know who to blame.

Unfortunately, the West’s obsession with analytical intelligence has had, and continues to have, dire moral and political consequences. Immanuel Kant most memorably made the connection between reasoning and moral standing, arguing (in simple terms) that by virtue of their ability to reason human beings ought to be treated, not as means to an end, but as ends-in-themselves. From here, it is all too easy to conclude that, the better you are at reasoning, the worthier you are of personhood and its rights and privileges. For centuries, women were deemed to be ‘emotional’, that is, less rational, which justified treating them as chattel or, at best, second-class citizens. The same could be said of non-white people, over whom it was not just the right but the duty of the white man to rule. Kipling’s poem The White Man’s Burden (1899) begins with the lines: Take up the White Man’s burden/ Send forth the best ye breed/ Go bind your sons to exile/ To serve your captives’ need/ To wait in heavy harness/ On fluttered folk and wild/ Your new-caught, sullen peoples/ Half-devil and half-child. People deemed to be less rational—women, non-white people, the lower classes, the infirm—were not just disenfranchised but dominated, colonized, enslaved, and murdered with impunity. Only in 2015 did the U.S. Senate vote to compensate living victims of government-sponsored sterilization programs for the ‘feeble-minded’. Today, it is the white man who most fears artificial intelligence, imagining that it will usurp his status and privilege.

According to one oft-cited paper, I.Q. is the best predictor of job performance. But this is not entirely surprising given that ‘performance’ and I.Q. have been defined in similar terms, and that both depend, to some extent, on third factors such as compliance, motivation, and educational attainment. Rather than intelligence per se, genius is defined more by drive, vision, creativity, and opportunity, and it is notable that the minimum I.Q. necessary for genius—probably around 125—is not all that high.

William Shockley and Luis Walter Alvarez, who both went on to win the Nobel Prize for physics, were excluded from the Terman Study of the Gifted on account of… their modest I.Q. scores.

As a footnote, in later life Shockley developed controversial views on race and eugenics, setting off a national debate over the use and applicability of I.Q. tests.

References

  • Salovey P & Mayer JD (1990): Emotional intelligence. Imagination, Cognition and Personality 9(3):185–211.
  • Ree MJ & Earles JA (1992): Intelligence is the Best Predictor of Job Performance. Current Directions in Psychological Science 1(3):86–89.
  • Saxon W (1989): Obituary: William B. Shockley, 79, Creator of Transistor and Theory on Race. New York Times, August 14, 1989.

The Secrets of Inspiration

Think back to your favourite teacher: for me, a French teacher who wept as he read out from a novel by Marguerite Duras. The teachers whom we hold in our hearts are not those who taught us the most facts, but those who inspired us and opened us up to ourselves. But what is inspiration and can it be cultivated?

The word ‘inspiration’ ultimately derives from the Greek for ‘God-breathed’, or ‘divinely breathed into’. In Greek myth, inspiration is a gift of the muses, the nine daughters of Zeus and Mnemosyne (‘Memory’), though it can also come from Apollo (Apollon Mousagetēs, ‘Apollo Muse-leader’), Dionysus, or Aphrodite. Homer famously invokes the muses in the very first line of the Iliad: ‘Sing, O Muse, of the rage of Achilles, son of Peleus, that brought countless ills upon the Achaeans…’

Similarly, the Church maintains that inspiration is a gift from the Holy Ghost, including the inspiration for the Bible itself: ‘For the prophecy came not in old time by the will of man: but holy men of God spake as they were moved by the Holy Ghost’ (2 Peter 1:21).

The Oxford English Dictionary defines ‘inspiration’ as ‘a breathing in or infusion of some idea, purpose, etc. into the mind; the suggestion, awakening, or creation of some feeling or impulse, especially of an exalted kind’. Going with this, there appear to be two aspects to inspiration: some kind of vision, accompanied by some kind of positive energy with which to drive or at least sustain that vision.

‘Inspiration’ is often confused with ‘motivation’ and ‘creativity’. Motivation aims at some sort of external reward, whereas inspiration comes from within and is very much its own reward. Although inspiration is associated with creative insight, creativity also involves the realization of that insight—which requires opportunity, means, and, above all, effort. In the words of Thomas Edison, genius is one percent inspiration, ninety-nine percent perspiration—although you may not get started, or get very far, without the initial one percent.

Other than creativity, inspiration has been linked with enthusiasm, optimism, and self-esteem. Inspiration need not be all artistic and highfalutin: I often feel inspired to garden or cook, to plant out some bulbs for next spring or make use of some seasonal ingredients. Such inspired tasks feel very different from, say, writing a complaint or filing my accounts. If I could be paid to do what inspires me, and pay others to do what doesn’t, I should be a very happy man.

Despite its importance to both society and the individual, our system of education leaves very little place for inspiration—perhaps because, like wisdom and virtue, it cannot easily be taught but only… inspired. Unfortunately, if someone has never been inspired, he or she is unlikely to inspire others. That is a great shame. The best education consists not in being taught but in being inspired, and, if I could, I would rather inspire a single person than teach a thousand.

But where, in the first place, does inspiration come from? In Plato’s Ion, Socrates likens inspiration to a divine power, and this divine power to a magnetic stone that can not only move iron rings, but also magnetize the iron rings so that they can do the same. This leads to a long chain of iron rings, with each ring’s energy ultimately derived from that of the original magnetic stone. If a poet is any good, this is not because he has mastered his subject, but because he is divinely inspired, divinely possessed:

For the poet is a light and winged and holy thing, and there is no invention in him until he has been inspired and is out of his senses, and the mind is no longer in him: when he has not attained to this state, he is powerless and is unable to utter his oracles.

Socrates compares inspired poets to the Bacchic maidens, who are out of their minds when they draw honey and milk from the rivers. He asks Ion, a rhapsode (reciter of poetry), whether, when he recites Homer, he does not get beside himself, whether his soul does not believe that it is witnessing the actions of which he sings. Ion replies that, when he sings of something sad, his eyes are full of tears, and when he sings of something frightening, his hairs stand on end, such that he is no longer in his right mind. Socrates says that this is precisely the effect that a rhapsode has on his audience: the muse inspires the poet, the poet the rhapsode, and the rhapsode his audience, which is the last of the iron rings in the divine chain.

In Plato’s Phaedrus, Socrates argues that madness, as well as being an illness, can be the source of our greatest blessings. There are, he continues, four kinds of inspired madness: prophecy, from Apollo; holy prayers and mystic rites, from Dionysus; poetry, from the muses; and love, from Aphrodite and Eros.

But if a man comes to the door of poetry untouched by the madness of the muses, believing that technique alone will make him a good poet, he and his sane companions never reach perfection, but are utterly eclipsed by the performances of the inspired madman.

All human beings, says Socrates, are able to recollect universals such as perfect goodness and perfect beauty, and must therefore have seen them in some other life or other world. The souls that came closest to the universals, or that experienced them most deeply, are reincarnated into philosophers, artists, and true lovers. As the universals are still present in their minds, they are completely absorbed in ideas about them and forget all about earthly interests. Humdrum people think that they are mad, but the truth is that they are divinely inspired and in love with goodness and beauty. In the 20th century, the psychoanalyst Carl Jung echoed Plato, arguing that the artist is one who can reach beyond individual experience to access our genetic memory, that is, the memory, such as the memory for language, that is already present at birth. It is perhaps no coincidence that, in Greek myth, the mother of the muses is Mnemosyne/Memory.

The idea that ‘madness’ is closely allied with inspiration and revelation is an old and recurring one. In Of Peace of Mind, Seneca the Younger writes that ‘there is no great genius without a tincture of madness’ (nullum magnum ingenium sine mixtura dementiae fuit), a maxim which he attributes to Aristotle, and which is also echoed in Cicero. For Shakespeare, ‘the lunatic, the lover, and the poet are of imagination all compact’. And for Dryden, ‘great wits are sure to madness near allied, and thin partitions do their bounds divide’. As I argued in a book called The Meaning of Madness, our reservoir of madness is a precious resource that we can learn to tap into.

For the modern writer André Gide,

The most beautiful things are those that are whispered by madness and written down by reason. We must steer a course between the two, close to madness in our dreams, but close to reason in our writing.

7 simple strategies to encourage inspiration

So it seems that inspiration is some kind of alignment or channelling of primal energies, and that it cannot quite be summoned or relied upon.

Nonetheless, here are seven simple strategies that may make it more likely to alight upon us:

1. Wake up when your body tells you to. No one has ever been tired and inspired at the same time. To make matters worse, having our sleep disrupted by an alarm clock or other extraneous stimulus can leave us feeling groggy and grouchy, as though we had ‘woken up on the wrong side of the bed’.

2. Complete your dreams. REM sleep, which is associated with dreaming, is richest just before natural awakening. Dreaming serves a number of critical functions such as assimilating experiences, processing emotions, and enhancing problem solving and creativity. In fact, the brain can be more active during REM sleep than during wakefulness. Many great works of art have been inspired by dreams, including Dalí’s The Persistence of Memory, several of Edgar Allan Poe’s poems and short stories, and Paul McCartney’s Let It Be.

3. Eliminate distractions, especially the tedious ones. Clear your diary, remove yourself from people, take plenty of time over every small thing. You want to give your mind plenty of spare capacity. You want it to roam, to freewheel. Before going to bed, I check my calendar for the next day’s engagements, and am never happier than when I see ‘No Events’. Don’t worry or feel guilty, the sun won’t fall out of the sky. Many people are unable to let their minds wander for fear that uncomfortable thoughts and feelings might arise into their consciousness. If they do, why not take the opportunity to meet them?

4. Don’t try to rush or force things. If you try to force inspiration, you will strangle it and achieve much less overall. There may be ‘on’ days and ‘off’ days, or even ‘on’ hours and ‘off’ hours. If you don’t feel inspired, that’s fine, go out and enjoy yourself. Your boss may disagree, but it’s probably the most productive thing you could do. If you can, try not to have a boss.

5. Be curious. The 17th century philosopher John Locke suggested that inspiration amounts to a somewhat random association of ideas and sudden unison of thought. If something, anything, catches your interest, try to follow it through. Nothing is too small or irrelevant. Read books, watch documentaries, visit museums and exhibitions, walk in gardens and nature, talk to inspired and inspiring people… Feed your unconscious.

6. Break the routine. Sometimes it can really help to give the mind a bit of a shake. Try new things that take you out of your comfort zone. Modify your routine or your surroundings. Better still, go travelling, especially to places that are unfamiliar and disorienting, such as a temple in India or a hippy farm in the Uruguayan pampas.

7. Make a start. When I write an article, I make a start and come back to it whenever I next feel inspired. The minute I start flagging, I stop and do something else, and, hopefully, while I do that, the next paragraph or section enters my mind. Some articles I write over three or four days, others over three or four weeks—but hardly ever in a single day or single sitting. When I write a book, the first half seems to take forever, while the second half gets completed in a fraction of the time. Small accomplishments are important because they boost confidence and free the mind to move on, establishing a kind of creative momentum.

If you have any other thoughts on inspiration, please put them in the comments section.
