How Does the Language You Speak Influence the Way You Think?


Silence is the language of God, all else is poor translation. —Rumi

The ostensible purpose of language is to transmit thoughts from one mind to another. Language represents thought, but does it also determine thought?

Wittgenstein famously said that “the limits of my language mean the limits of my world.” Taken at face value, that seems too strong a claim. There are over 7,000 languages in the world—with, by some estimates, one dying out every week. The number of basic colour terms varies considerably from one language to another. Dani, spoken in New Guinea, and Bassa, spoken in Liberia and Sierra Leone, each have no more than two colour terms, one for dark/cool colours and the other for light/warm colours. But, obviously, speakers of Dani and Bassa are able to perceive, and think about, more than two colours.

More subtly, there is no English equivalent for the German word Sehnsucht, which refers to the dissatisfaction with an imperfect reality paired with the yearning for an ideal that comes to seem more real than the reality itself. But despite lacking the word, Walt Whitman could clearly conjure the concept and emotion: Is it a dream? Nay, but the lack of it the dream, And, failing it, life’s lore and wealth a dream, And all the world a dream.

The English language has a word for a child who has lost his or her parents (orphan), and a word for someone who has lost a spouse (widow or widower), but no word for a parent who has lost a child. This may mean that parents who have lost a child are less likely to enter our minds, but not that they cannot enter our minds, or that we cannot conceive of them. We often think about or remember things that cannot be put into words, such as the smell and taste of a mango, the dawn chorus of the birds, or the contours of a lover’s face. Animals and pre-linguistic babies must surely have thoughts, even if they have no language.

Communication does not require language, and many animals communicate effectively by other means. However, language is closely associated with symbolism, and so with conceptual thought and creativity. These unique assets make us by far the most adaptable of all animals, and enable us to engage in highly abstract pursuits such as art, science, and philosophy that define us as human beings.

Imagine what it would be like to live without language—not without the ability to speak, but without an actual language. Given the choice, would you rather lose the faculty for sight or the faculty for language? This is probably the first time that you are faced with this question—the faculty for language is so fundamental to the human condition that, unlike the faculty for sight, we take it completely for granted. “Monkeys,” quipped Kenneth Grahame, “very sensibly refrain from speech, lest they should be set to earn their livings.”

If language does not determine thought, how, if at all, does it interact with thought? Or, to put it another way, how does the language you speak influence the way you think? Russian, Greek, and many other languages have two words for blue, one for lighter shades, the other for darker shades—goluboy and siniy in Russian, and ghalazio and ble in Greek. A study found that, compared to English speakers, Russian speakers were quicker to discriminate between shades of goluboy and siniy, but not shades of goluboy or shades of siniy. Conversely, another study found that Greek speakers who had lived for a long time in the U.K. see ghalazio and ble as more similar than Greek speakers living in Greece. By creating categories, language can enhance cognition.

In contrast to modern Greek, Ancient Greek, in common with many ancient languages, had no specific word for blue, leaving Homer to speak of “the wine-dark sea.” But the Ancient Greeks did have several words for ‘love’, including philia, eros, storge, and agape, each one referring to a different type or concept of love. This means that they could speak more precisely about love, but does it also mean that they could think more precisely about love, and, as a result, have more fulfilled love lives? Or perhaps the Greeks had more words for love because they had more fulfilled love lives in the first place, or, more prosaically, because their culture and society placed more emphasis on the different bonds that can exist between people, and on the various duties and expectations that attach, or attached, to those bonds.

Philosophers and academics sometimes make up words to help them talk and think about an issue. In the Phaedrus, Plato coined the word psychagogia, the art of leading souls, to characterize rhetoric—another word that he invented. Every field of human endeavour inevitably evolves its own specialized jargon. There seems to be an important relationship between language and thinking: I often speak—and write, as I am doing now—to define or refine my thinking on a particular topic, and language is the scaffolding by which I arrive at my more subtle or syncretic thoughts.

While we’re talking dead languages, it may come as a surprise that Latin has no direct translations for yes and no. Instead, one either echoes the verb in the question (in the affirmative or the negative) or expresses one’s feelings about the truth value of the proposition with adverbs such as certe, fortasse, nimirum, plane, vero, etiam, sane, minime… This may have led to more nuanced thinking, as well as greater interpersonal engagement, though it must have been a nightmare for teens.

Much of the particularity of a language is extra-lexical, built into the syntax and grammar of the language and virtually invisible to native speakers. English, for example, restricts the use of the present perfect tense (“has been,” “has read”) to subjects who are still alive, marking a sharp grammatical divide between the living and the dead, and, by extension, between life and death. But of course, as an English speaker, you already knew that, at least unconsciously.

Here’s another, more substantial, example: When describing accidental events, English speakers tend to emphasize the agent (“I fired the gun”) more than, say, speakers of Spanish or Japanese, who prefer to omit the agent (“the gun went off”). One study found that, as a result, English speakers are more likely to remember the agents of accidental events—and, presumably, to attach blame.

In English, verbs express tense, that is, time relative to the moment of speaking. In Turkish, they also express the source of the information (evidentiality)—whether the information is direct, acquired through sense perception; or indirect, acquired by testimony or inference. In Russian, they include information about completion, with (to simplify) the perfective aspect used for completed actions and the imperfective aspect for ongoing or habitual actions. Spanish, on the other hand, emphasizes modes of being, with two verbs for “to be”—ser, to indicate permanent or lasting attributes, and estar, to indicate temporary states and locations. Like many languages, Spanish has more than one mode of second-person address: tú for intimates and social inferiors, and usted for strangers and social superiors, equivalent to tu and vous in French, and tu and lei in Italian. There used to be a similar distinction in English, with thou used to express intimacy, familiarity, or downright rudeness—but because it is archaic, many people now think of it as more formal than “you.” It stands to reason that, compared to English speakers, Turkish speakers have to pay more attention to evidentiality, Russian speakers to completion, and Spanish speakers to modes of being and social relations.

In many languages, nouns are divided into masculine and feminine. In German, there is a third, neuter class of nouns. In Dyirbal, an Aboriginal language, there are four noun classes, including one for women, water, fire, violence, and exceptional animals—or, as George Lakoff put it, “women, fire, and dangerous things.” Researchers asked speakers of German and Spanish to describe objects with opposite gender assignments in these two languages, and found that their descriptions conformed to gender stereotypes—even when the testing took place in English. For example, German speakers described bridges (feminine in German, die Brücke) as beautiful, elegant, fragile, peaceful, pretty, and slender, whereas Spanish speakers described bridges (masculine in Spanish, el puente) as big, dangerous, long, strong, sturdy, and towering.

Another study looking at the artistic personification of abstract concepts such as love, justice, or time found that, in 78% of cases, the gender of the concept in the artist’s language predicted the gender of the personification, and that this pattern held even for uncommon allegories such as geometry, necessity, and silence. Compared to a French or Spanish artist, a German artist is far more likely to paint death (der Tod, la mort, la muerte) or victory (der Sieg, la victoire, la victoria) as a man—though all artists, or at least all European artists, tend to paint death in skeletal form. Grammar, it seems, can directly and radically influence thought, perception, and action.

It is often said that, by de-emphasizing them, language perpetuates biases against women. For example, many writers in English still use “mankind” to talk about humankind, and “he” for “he or she.” Similarly, many languages use masculine plural pronouns to refer to groups of people with at least one man. If 100 women turn up with a baby in a pram, and that baby happens to be male, French grammar dictates the use of the masculine plural ils: ils sont arrivés, “they have arrived.” Language changes as mores change, and sometimes politicians, pressure groups, and others attempt to change the language to change the mores—but, on the whole, language serves to preserve the status quo, to crystallize the order and culture that it reflects.

Language is also made up of all sorts of metaphors. In English and Swedish, people tend to speak of time in terms of distance: “I won’t be long”; “Let’s look at the weather for the week ahead”; “his drinking caught up with him.” But in Spanish or Greek, people tend to speak of time in terms of size or volume—for example, in Spanish, hacemos una pequeña pausa (“let’s have a small break”) rather than corta pausa (“short break”). More generally, mucho tiempo (“much time”) is preferred to largo tiempo (“long time”), and, in Greek, poli ora to makry kroniko diastima. And guess what? According to a recent study of fully bilingual Spanish-Swedish speakers, the language used to estimate the duration of events alters the speaker’s perception of the relative passage of time.

But all in all, and with perhaps a couple of exceptions, European languages do not differ dramatically from one another. To talk about space, speakers of Kuuk Thaayorre, an Aboriginal language, use 16 words for absolute cardinal directions instead of relative references such as “right in front of you,” “to the right,” and “over there.” As a result, even children are always aware of the exact direction in which they are facing. When asked to arrange a sequence of picture cards in temporal order, English speakers arrange the cards from left to right, whereas Hebrew speakers tend to arrange them from right to left. But speakers of Kuuk Thaayorre consistently arrange them from east to west, which is left to right if they are facing south, and right to left if they are facing north. Thinking differently about space, they think differently about time as well.

Language may not determine thought, but it focuses our perception and attention on particular aspects of reality, structures and enhances our cognitive processes, and even, to some extent, regulates our social relationships. Our language reflects and at the same time shapes our thoughts and, ultimately, our culture, which in turn shapes our thoughts and language. There is no equivalent in English of the Portuguese word saudade, which refers to the love and longing for someone or something that has been lost and may never be regained. The rise of saudade coincided with the decline of Portugal and the yen for its imperial heyday, a yen so strong as to have written itself into the national anthem: Levantai hoje de novo o esplendor de Portugal (“Let us once again lift up the splendour of Portugal”). The three strands of language, thought, and culture, though individual, are so tightly woven that they cannot be prised apart.

It has been said that when an old man dies, a library burns to the ground. But when a language dies, it is a whole world that comes to an end.

See my related post, Beyond Words: The Benefits of Being Bilingual


New Audiobook on the Emotions


Heaven and Hell: The Psychology of the Emotions has just come out on Audible!

So now you can take it in the car, gym, kitchen, or wherever…

The book is narrated by the very talented Alexander Doddy, who also did Growing from Depression.

For more info, follow this link (UK) or this link (US).

 

Beyond Words: The Benefits of Being Bilingual


It may come as a surprise to many people in the U.S. and U.K. that speaking more than one language is the norm rather than the exception. In prehistoric times, most people belonged to small linguistic communities, and spoke several languages to trade with, and marry into, neighbouring communities. Still today, remaining populations of hunter-gatherers are almost all multilingual. Papua New Guinea, a country smaller than Spain, still counts some 850 languages, or about one language per 10,000 inhabitants. In countries such as India, Malaysia, and South Africa, most people are bilingual or better. Even in the world at large, polyglots outnumber monoglots. And with the advent of the Internet, contact with foreign languages has become increasingly frequent, even for the most linguistically isolated of monoglots.

Queen Elizabeth I of England could speak at least ten languages: English, French, Spanish, Italian, Flemish, Latin, Welsh, Cornish, Scottish, and Irish. According to the Venetian ambassador, she possessed these languages “so thoroughly that each appeared to be her native tongue”. No wonder she didn’t want to get married.

To speak a language competently implies knowledge of the culture associated with the language. Multilingualism is closely linked to multiculturalism, and, historically, both came under attack with the rise of the nation state. In the aftermath of the Brexit referendum, the British Prime Minister Theresa May stated: “If you believe you are a citizen of the world, you are a citizen of nowhere”—as though that were somehow bad or abnormal. Human beings are far older than any country. Still today, some people believe that teaching a child more than one language can impair the child’s linguistic and cognitive development. But what’s the evidence?

According to several studies, people who study a foreign language do significantly better on standardized tests. Language management calls upon executive functions such as attention control, cognitive inhibition, and working memory; and there is mounting evidence that bi- and multilingual people are better at analysing their surroundings, multitasking, and problem solving. They also have a larger working memory, including for tasks that do not involve language. In terms of brain structure, they have more grey matter (and associated activity) in the dorsal anterior cingulate cortex, a locus for language control and broader executive function. Superior executive function is, in turn, a strong predictor of academic success.

According to one recent study, when faced with moral dilemmas, people who think through a problem in a foreign language make much more rational (or ‘utilitarian’) decisions, perhaps because certain words lose some of their emotional impact, or because the problem is seen from a different cultural perspective, or processed through different neural channels. So if you have a second language, you can use it, like a friend, to check yourself.

The cognitive benefits of bi- and multilingualism yield important health dividends. An examination of hospital records in Toronto uncovered that bilingual patients were diagnosed with dementia on average three to four years later than their monolingual counterparts, despite being of a similar educational and occupational status. A more recent study in Northern Italy looking at patients at the same stage of Alzheimer’s Disease revealed that the bilingual patients were on average five years older, and that they had stronger connections between the brain areas involved in executive function. Similarly, research into 600 stroke survivors in India found that the bilingual patients had a much better outcome: specifically, 40.5% of the bilingual patients had normal cognition compared to just 19.6% of the monolingual ones.

And then there are the economic benefits. A U.S. study found that high-level bilingualism is associated with extra earnings of about $3,000 a year, even after controlling for factors such as educational attainment and parental socio-economic status. According to The Economist, for an American graduate, a second language could be worth—on a conservative estimate—up to $128,000 over 40 years. Of course, the overall economic impact of multilingualism is much greater than the sum of the higher earnings of multilingual speakers. A report from the University of Geneva estimates that Switzerland’s multilingual heritage contributes about $50 billion a year to the Swiss economy, or as much as 10% of GDP. Conversely, research for the U.K. government cautions that a lack of language skills could be costing the British economy around £48 billion a year, or 3.5% of GDP, in lost output.
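To get a feel for the arithmetic behind such figures, here is a minimal sketch in Python. The premium, career length, and rates of return below are illustrative assumptions of mine, not the parameters used by the studies cited above.

```python
# Illustrative sketch: the future value of an annual wage premium that is
# saved and compounded over a career. The numbers are assumptions for the
# sake of example, not those used in the studies cited above.

def career_value(annual_premium: float, years: int, real_return: float) -> float:
    """Compound an annual premium at a fixed real rate of return."""
    total = 0.0
    for _ in range(years):
        total = total * (1 + real_return) + annual_premium
    return total

if __name__ == "__main__":
    for rate in (0.00, 0.01, 0.02):
        value = career_value(annual_premium=3_000, years=40, real_return=rate)
        print(f"real return {rate:.0%}: ${value:,.0f}")
```

At a zero real return, $3,000 a year for 40 years comes to $120,000; even modest compounding pushes the total well beyond that, which is why estimates of this kind depend heavily on the assumed rate of return and salary growth.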

Being bilingual may have important cognitive and economic benefits, but it is often the personal, social, and cultural benefits that multilingual people are keen to emphasize. Many bilingual people feel that the way they are, and the way they see the world—and even the way they laugh and love—changes according to the language they are speaking. In the 1960s, Susan Ervin-Tripp asked Japanese-English bilingual women to finish sentences in each language, and found that the women came up with very different endings depending on whether they were speaking English or Japanese. For example, they completed “Real friends should…” with “…help each other” in Japanese, but “…be frank” in English. “What do you want to eat?” “Who’s your favourite poet?” Ask a question in one language, and you get one answer; ask the same question in another language, and you get another one. “To have another language,” said Charlemagne, “is to have another soul.”

Translation dictionaries seem to assume that languages are made up of corresponding words, but even when that is more or less the case, the equivalents carry different connotations. Compared to “I like you” in English, “Je t’aime” in French is a far more serious proposition. Owing to a certain je ne sais quoi, some things are more readily expressed in one language than another. By code switching, multilingual speakers can increase their range of expression, and perhaps even their range of thought. “The limits of my language,” said Ludwig Wittgenstein, “mean the limits of my world.” Certain languages are better suited to certain purposes: English, for example, is great for science and technology, French is better for cooking and seducing, and Latin is best for praying and formal rites of passage. Multilingual people are free to pick and choose, maybe along the lines of Charles V, Holy Roman Emperor: “I speak in Latin to God, Italian to women, French to men, and German to my horse.” (He didn’t get on with the German lords, and preferred to live in Spain, where he happened to be the King.)

The more languages you learn, the easier it becomes to learn languages. Learning a second language also strengthens your first: one study, for instance, found that Spanish immersion significantly improved children’s native English vocabulary. More broadly, learning a language casts light upon your first language and upon language in general, increasing your appreciation of language and your ability to communicate. “You speak English beautifully,” wrote Robert Aickman in The Wine-Dark Sea, “which means you can’t be English.”

Just before writing this article, I asked my amazing Facebook and Twitter people the following question: “If you are bi- or multi-lingual, what do you most value about that fact?”

And here are some of their responses:

  • The freedom to access different cultures plus the possibility to read many authors in original version!
  • Being fluent in a couple of other languages has given me insight into other ways of seeing the world. That helps empathy, and openness.
  • I appreciate the cognitive advantages being multilingual has offered. Also the connections with culture, history, and the knowledge acquired.
  • Language is knowledge. Always useful to be a little less ignorant.
  • It feels as if I can switch into two different modes and think from different perspectives.
  • Being more tolerant—new language = new culture, new and different perspectives/access to more information.
  • That I can talk wine with twice as many ppl.
  • It gives me patience and understanding for those who want to articulate, but have difficulty conveying what they really mean.
  • The fact that I can fully understand and communicate in another language (Afrikaans) makes me feel good.
  • It gave me an understanding that ‘thoughts’ don’t come into my head in a language at all. Thoughts come as ‘ideas’. Only when I have to verbalize my ideas, I have to use a language.
  • I’m bilingual in India. There’s nothing special about it here. Know a ton of people who are trilingual. In India, multilingualism starts becoming impressive if the language count’s above like five or something.
  • Obviously, being able to speak about people in elevators without them knowing what’s said. 😉
  • Having a variety of options when cussing.

Every language has its own rules and conventions, its own sounds and rhythms, its own beauty and poetry, its own outlook and philosophy.

Every language is another way of being human, another way of being alive.

The Psychology of Snobbery


The protagonist of the British sitcom Keeping Up Appearances is the social-climbing snob Hyacinth Bucket—or ‘Bouquet’, as she insists it be pronounced. To give the impression that she employs domestic staff, she famously answers her beloved pearl-white slimline telephone with, ‘The Bouquet residence; the lady of the house speaking.’ The very middle-middle-class Hyacinth spends most of her efforts trying to impress others in the hope of passing herself off as posh, while looking down on anyone who does not meet her approval. And this is the simple recipe for five seasons of very British comedy.

It is sometimes said that the word ‘snob’ originates from the Latin sine nobilitate (‘without nobility’), used in abbreviated form—s.nob—on lists of names by Cambridge colleges, passenger ships etc. to distinguish between titled and non-titled individuals. In fact, ‘snob’ was first recorded in the late 18th century as a term for a shoemaker or his apprentice, though it is true that Cambridge students came to apply it to those outside the university. By the early 19th century, ‘snob’ had come to mean something like ‘a person who lacks breeding’, and then, as social structures became more fluid, ‘a social climber’.

Today, a snob is someone who:

  • Accords exaggerated importance to one or more superficial traits such as wealth, social status, beauty, or academic credentials,
  • Perceives people with those traits to be of higher human worth,
  • Lays claim to those traits for him- or herself, often unduly, and
  • Denigrates those who lack those traits.

So there are three main aspects to snobbery: exaggerating the importance of certain traits, laying claim to those traits, and, last but not least, denigrating those who lack them. “I’m not a snob,” said Simon Le Bon, in jest: “Ask anybody. Well, anybody who matters.”

Snobbery is not simply a matter of discernment, however expensive or refined our tastes may be: a so-called wine ‘snob’, who enjoys and even insists on good wine, may or may not be an actual snob, depending on the degree of his or her prejudice (from the Latin praeiudicium, ‘prior judgement’). Speaking of wine, some young sommeliers, immersed as they are in the world of wine, can come to place undue value on wine knowledge, to the point of deprecating their own patrons—a phenomenon that has been referred to as ‘sommelier syndrome’.

Aside from its obvious unpleasantness to others, snobbery tends to undermine the snob, his achievements, and the interests and institutions that he represents. The Conservative Member of Parliament Jacob Rees-Mogg did himself, his party, and the U.K. parliament no favours when he compared people who did not go to private school or Oxford or Cambridge to ‘potted plants’.

Snobbery betrays rigidity of thinking and therefore poor judgement, as with those British aristocrats who, despite their expensive educations, admired Hitler’s autocratic style of government. The snob pigeonholes people according to superficial criteria such as their birth, their profession, or, especially in England, the way they speak, and, on that basis, either regards or disregards them. Like the wine lover who will only drink certain labels, the snob often passes over real value, quality, or originality. As company, he is an endless bore, constantly detracting from the rich texture of life and quite unable to marvel at anything except through himself.

Closely related to snobbery, and presenting some of the same pitfalls, is ‘inverse snobbery’. Inverse snobbery is the disdain for those same traits that the snob might hold in high regard, combined with admiration, whether real or feigned, for the popular, the ordinary, and the commonplace—and not just with the aim of winning an election. Inverse snobbery can be understood, in large part, as an ego defense against the status claims of others; and it is possible, indeed common, to be both a snob and an inverse snob.

But what about snobbery itself? Like inverse snobbery, snobbery can be interpreted as a symptom of social insecurity. Social insecurity may be rooted in childhood experiences, especially feelings of shame at being different, or an early sense of privilege or entitlement that cannot later be realized. Or it may be the simple result of rapid social change. With Brexit and the election of Donald Trump, the ebbing of power from traditional, cultured elites has led, on all sides, to a surge in both snobbery and inverse snobbery.

In a similar vein, some snobbery may represent a reaction to an increasingly egalitarian society, reflecting a deeply ingrained human instinct that some people are better than others, that these people are more fit to rule, and that their rule tends to yield better outcomes—though, of course, one need not be a snob to share that instinct. To that extent, snobbery can serve as a mechanism of class surveillance and control, as can, paradoxically, inverse snobbery, which also serves to entrench social hierarchies.

Finally, at an extreme, snobbery may be a manifestation of narcissistic personality disorder or broader psychopathy … which points to its antidote, namely, empathy—including towards the snob.

Snobbery, said Joseph Epstein, ‘is the desire for what divides men and the inability to value what unites them.’

Summer Promotion!

Currently ‘author in focus’ at Blackwell’s Oxford

3 for 2 in the shop, and big discounts in their online store

#SummerReading


The Psychology and Philosophy of Memory


Memory refers to the system, or systems, by which the mind registers, stores, and retrieves information for the purpose of optimizing future action.

Memory can be divided into short-term and long-term memory, and long-term memory can be further divided into episodic and semantic memory. Episodic memory records sense experiences, while semantic memory records abstract facts and concepts, with episodic memories eventually seeping into semantic memory. Interestingly, the distinction between episodic memory and semantic memory is already implicit in a number of languages in which the verb ‘to know’ has two forms, for example, in French, connaître and savoir, where connaître implies a direct, privileged kind of knowledge acquired through sense experience.

There is, naturally, a close connection between memory and knowledge. The connaître and savoir dichotomy is also pertinent to the theory of knowledge, which distinguishes between first-hand knowledge and testimonial knowledge, that is, knowledge gained through the say-so of others, often teachers, journalists, and writers. In the absence of first-hand knowledge, the accuracy of a piece of testimony can only be verified against other sources of testimony. Similarly, the accuracy of most memories can only be verified against other memories. For most if not all memories, there is no independent standard.

Episodic and semantic memory are held to be explicit or ‘declarative’, but there is also a third kind of memory, procedural memory, which is implicit or unconscious, for knowing how to do things such as reading and cycling. Although held to be explicit, episodic and semantic memory can influence action without any need for conscious retrieval—which is, of course, the basis of practices such as advertising and brainwashing. In fact, it is probably fair to say that most of our memories lie beyond conscious retrieval, or are not consciously retrieved—and therefore that memory mostly operates at an unconscious level. ‘Education’, said BF Skinner, ‘is what survives when what has been learnt has been forgotten.’

A mysterious type of memory is prospective memory, or ‘remembering to remember’. To send my mother a birthday card, I must not only remember her birthday, but also remember to remember it. Whenever I omit to set my alarm clock, I find myself waking up just in time to make my first appointment, even when I have only slept three or four hours. This suggests that, even in sleep, the mind is able to remember to remember, while also keeping track of the time.

Memory is encoded across several brain areas, meaning that brain damage or disease can affect one type of memory more than others. For example, Korsakoff syndrome, which results from severe thiamine deficiency and consequent damage to the mammillary bodies and dorsomedial nucleus of the thalamus, affects episodic memory more than semantic memory, and anterograde memory (the ability to form new memories) more than retrograde memory (the store of old memories), while sparing short-term and procedural memory. Alzheimer’s disease, on the other hand, affects short-term memory more than long-term memory, at least in its early stages.

As a psychiatrist, I am often asked to assess people with advanced Alzheimer’s disease and other forms of dementia, and am all too aware of the importance of memory in our daily lives. Without any memory at all, it would be impossible to speak, read, learn, find one’s way, make decisions, identify or use objects, cook, wash, dress, or develop and maintain relationships. To live without memory is to live in a perpetual present, without past, and without future. One couldn’t build upon anything, or even engage in any kind of sustained, goal-directed activity. Although there is wisdom in being in the moment, one cannot always or entirely be in the moment. In Greek myth, the goddess of memory, Mnemosyne, slept with Zeus for nine consecutive nights, thereby begetting the nine Muses. Without memory, there would be no art or science, no craft or culture.

And no meaning either. Nostalgia, sentimentality for the past, is often prompted by feelings of loneliness, disconnectedness, or meaninglessness. Revisiting our past can lend us much-needed context, perspective, and direction, reminding and reassuring us that our life is not as banal as it might seem, that it is rooted in a narrative, and that there have been—and will once again be—meaningful moments and memories. If weddings and wedding photographs are anything to go by, it seems that we go to considerable lengths to manufacture these anchor memories. Tragically, people with severe memory loss can no longer revisit the past, and may resort to confabulation (the making up of memories) to create the meaning and identity that everyone yearns for. I once visited a nursing home in England to assess an 85-year-old lady with advanced Alzheimer’s disease. She insisted that we were in a hotel in Marbella: she was planning her wedding and didn’t have time to talk to me. When I asked her what she did the day before, she replied, with a twinkle in her eye, that she hit the town for her bachelorette (hen night), and that her glamorous friends spoilt her rotten with champagne and fancy cocktails. The search for meaning is deeply ingrained in human nature, so much so that, when pressed to define man, Plato replied simply, ‘a being in search of meaning’.

It could be argued that, like confabulation, nostalgia is a form of self-deception, in that it involves distortion and idealization of the past. The Romans had a tag for the phenomenon that psychologists have come to call ‘rosy retrospection’: memoria praeteritorum bonorum, ‘the past is always well remembered.’ And memory is unreliable in other ways as well. ‘Everyone’, said John Barth, ‘is necessarily the hero of his own life story’. We curate our memories by consolidating those that confirm or conform to our idea of self, while discarding or distorting those that conflict with it. We are very likely to remember events of existential significance such as our first kiss, or our first day at school—and, of course, it helps that we often rehearse those memories. Even then, we remember just one or two scenes, and just the main features, and fill in the gaps and background with reconstructed or ‘averaged’ memories. Déjà-vu, the feeling that a situation that is currently being experienced has already been experienced, may arise from a near match between the current situation and an averaged memory of that sort of situation. Our memories are filtered through our interests and emotions, and through our interpretation of events. Two people supporting opposing teams in a football match, or opposing political parties in an election, will register and recall very different things, and would likely disagree about ‘the facts’.

Broadly speaking, emotionally charged events are more likely to be remembered, and it has been found that injections of cortisol or epinephrine (adrenaline) can improve retention rates. But if a situation is highly stressful, memory may be impaired as cognitive resources are diverted to dealing with the situation, for example, escaping from the gunman rather than registering his clothing or facial features. In addition, any attention paid to the gunman is likely to focus on the gun itself, leading to a kind of peripheral blindness. This has important implications for the accuracy of eyewitness testimony, which might also be distorted by the use of leading or loaded questions. In a famous study, Reconstruction of Automobile Destruction, Loftus and Palmer asked people to estimate the speed of motor vehicles when they smashed / collided / bumped / hit / contacted each other, and found that the verb used in the question altered perceptions of speed. In addition, those asked the ‘smashed’ question were subsequently more likely to report having seen broken glass. After a traumatic event, to cope with unbearable stress, a person might go so far as to dissociate from the event, for example, by losing all memory for the event (dissociative amnesia) or even, as Agatha Christie once did, assuming another identity and departing on a sudden, unexpected journey (dissociative fugue). So emotion improves memory, but stress and trauma hinder it.

It is generally thought that, of all the senses, it is the sense of smell that triggers the most vivid memories. The olfactory bulb has direct connections to the amygdala and hippocampus, which are heavily involved in memory and emotion. These three structures—the olfactory bulb, the amygdala, and the hippocampus—form part of the limbic system, a ring of phylogenetically primitive, ‘paleomammalian’ cortex that is the seat of memory, emotion, and motivation. In a famous passage now referred to as ‘the madeleine moment’, Marcel Proust describes the uncanny ability of certain smells to recapture the ‘essence of the past’:

No sooner had the warm liquid mixed with the crumbs touched my palate than a shudder ran through me and I stopped, intent upon the extraordinary thing that was happening to me. An exquisite pleasure had invaded my senses, something isolated, detached, with no suggestion of its origin. And at once the vicissitudes of life had become indifferent to me, its disasters innocuous, its brevity illusory – this new sensation having had on me the effect which love has of filling me with a precious essence; or rather this essence was not in me it was me. … Whence did it come? What did it mean? How could I seize and apprehend it? … And suddenly the memory revealed itself. The taste was that of the little piece of madeleine which on Sunday mornings at Combray (because on those mornings I did not go out before mass), when I went to say good morning to her in her bedroom, my aunt Léonie used to give me, dipping it first in her own cup of tea or tisane. The sight of the little madeleine had recalled nothing to my mind before I tasted it. And all from my cup of tea.

Killing two birds with one stone, here are 10 ways to improve your memory that also shed light on its workings:

1. Get enough sleep. If you read a book or article when very tired, you will forget most of what you have read. Sleep improves attention and concentration, and therefore the registration of information. And sleep is also required for memory consolidation.

2. Pay attention. You cannot take in information unless you are paying attention, and you cannot memorize information unless you are taking it in. It helps if you are actually interested in the material, so try to develop an interest in everything! As Einstein said, ‘There are only two ways to live your life. One is as though nothing is a miracle. The other is as though everything is a miracle.’

3. Involve as many senses as you can. For instance, if you are sitting in a lecture, jot down a few notes. If you are reading a chapter or article, read it aloud to yourself and inject some drama into your performance.

4. Structure information. If you need to remember a list of ingredients, think of them under the subheadings of starter, main, and dessert, and visualize the number of ingredients under each subheading. If you need to remember a telephone number, think of it in terms of the first five digits, the middle three digits, and the last three digits—or whatever works best.

5. Process information. If possible, summarize the material in your own words. Or reorganize it so that it is easier to learn. With more complex material, try to understand its meaning and significance. Shakespearean actors find it much easier to remember their lines if they can understand and feel them. Focus on the bigger picture, not the details; you don’t need to remember everything. In the words of Oscar Wilde, ‘One should absorb the colour of life, but one should never remember its details. Details are always vulgar.’

6. Relate information to what you already know. New information is much easier to remember if it can be contextualized. In a recent study looking at the role of high-level processes, Lane and Chang found that chess knowledge predicts chess memory (memory of the layout of a particular game of chess) even after controlling for chess experience.

7. Use mnemonics. Tie information to visual images, sentences, and acronyms. For example, you might remember that your hairdresser is called Sharon by picturing a Rose of Sharon or a sharon fruit. Or you might remember the colours of the rainbow and their order by the sentence, ‘Richard Of York Gave Battle In Vain’. Many medical students remember the symptoms of varicose veins by the acronym ‘AEIOU’: Aching, Eczema, Itching, Oedema, and Ulceration.

8. Rehearse. Sleep on the information and review it the following day. Then review it at growing intervals until you feel comfortable with it. Memories fade if not rehearsed, or are overlain by other memories and can no longer be accessed.

9. Be aware of context. It is easier to retrieve a memory if you find yourself in a similar situation, or similar state of mind, to the one in which the memory was formed. People with low mood tend to remember their losses and failures while overlooking their strengths and achievements. If one day you pass the cheesemonger in the street, you may not, without her usual apron and array of cheeses, immediately recognize her as the cheesemonger, even though you know her fairly well. If you are preparing for an exam, try to recreate the conditions of the exam: for example, sit at a similar desk, at a similar time of day, and use ink on paper.

10. Be creative. Bizarre or unusual experiences, facts, and associations are much easier to remember. Because unfamiliar experiences stick in the mind, trips and holidays give the impression of ‘living’, and, therefore, of having lived a longer life. Our life is just as long or short as our remembering: as rich as our imagining, as vibrant as our feeling, and as profound as our thinking.

The Problem of Knowledge


Real knowledge is to know the extent of one’s own ignorance. —Confucius

What if we are being radically deceived? What if I am no more than a brain kept alive in a vat and fed with stimuli by a mad scientist? What if my life is but a dream or computer simulation? Like the prisoners in Plato’s cave, I would be experiencing not reality itself, but a mere facsimile. I couldn’t be said to know anything at all, not even that I was being deceived. Given the choice between a life of limitless pleasure as a brain in a vat and a genuine human life along with all its pain and suffering, most people opt for the latter, suggesting that we value truth and authenticity, and, by extension, that we value knowledge for its own sake.

But even if we’re not being deceived, it is not at all clear that we can have any knowledge of the world. Much of our everyday knowledge comes from the use of our senses, especially sight. ‘Seeing is believing’, as the saying goes. French is one of many languages that has two verbs for ‘to know’: savoir and connaître, where connaître implies a direct, privileged kind of knowledge acquired through sense experience. But appearances, as we all know, can be deceptive: a stick held under water appears to bend, the hot tarmac in the distance looks like a sparkling lake, and almost 40% of the normal population have experienced hallucinations of some kind, such as hearing voices. Our sense impressions are also subject to manipulation, as, for example, when a garden designer uses focal points to create an illusion of space. My mind interprets a certain wavelength as the colour red, but another animal or even another person may interpret it as something else entirely. How do I know that what I experience as pain is also what you experience as pain? You may react as I do, but that need not mean that you are minded as I am, or even that you are minded at all. All I might know is how the world appears to me, not how the world actually is.

Beyond my immediate environment, much of what I count as knowledge is so-called testimonial knowledge, that is, knowledge gained by the say-so of others, often teachers, journalists, and writers. If a piece of testimonial knowledge conflicts with our worldview, we tend, in the absence of non-testimonial evidence, to check it against other forms of testimony. If a friend tells me that Melbourne is the most populous city in Australia, I might carry out an Internet search and find that it is actually Sydney, even though I have never been to Australia and cannot be sure of what I read on the Internet.

Knowing that Sydney is the most populous city in Australia is a case of declarative (or propositional) knowledge, knowledge that can be expressed in declarative sentences or propositions. I know, or think that I know, that ‘Prince Harry is married to Meghan Markle’, ‘Paris is the capital of France’, and ‘democracy is the least worst form of government’. Apart from declarative knowledge, I also have know-how, for example, I know how to cook and how to drive a car. The relationship between knowing that and knowing how is not entirely clear, though it may be that knowing how collapses into multiple instances of knowing that.

For me to know something, say, that Mount Athos is in Greece, it must be the case that (1) I believe that Mount Athos is in Greece, and (2) Mount Athos is actually in Greece. In short, knowledge is true belief. True beliefs are better than false beliefs because they are, in general, more useful. Some beliefs, such as that my wine has been poisoned, are more useful than others, such as that my neighbour has 423 stamps in her collection. Some true beliefs, such as that I am a coward, can even be unhelpful, and we deploy a number of psychological mechanisms such as repression and rationalization to keep them out of mind. Conversely, some false beliefs, such as that my country or football team is the best, can be helpful, at least for my mental health. But on the whole we should seek to maximize our true beliefs, especially our useful or otherwise valuable true beliefs, while minimizing our false beliefs.

If knowledge is true belief, it is not any kind of true belief. People with paranoid psychosis often believe that they are being persecuted, for example, that the government is trying to have them killed. Clearly, this cannot count as knowledge, even if, by coincidence, it happens to be true. More generally, beliefs that are held on inadequate grounds, but by luck happen to be true, fall short of knowledge. In the Meno, Plato compares these true beliefs, or ‘correct opinions’, to the statues of Daedalus, which run away unless they can be tied down ‘with an account of the reason why’, whereupon they become knowledge. Knowledge, therefore, is not mere true belief, but justified true belief. Knowledge as justified true belief is called the tripartite, or three-part, theory of knowledge. Setting aside any intrinsic value that it may have, knowledge is more useful than mere true belief because it is more stable, more reliable.

Fine, but what does justification demand? I justify my belief in manmade global warming by the current scientific consensus as reported by the press. But what justifies my belief in the current scientific consensus, or in the press reports that I have read? Justification seems to involve an infinite regress, such that our ‘justified’ true beliefs have no solid foundation to rest upon. It may be that some of our beliefs rest upon certain self-justifying foundational beliefs such as the famous I think therefore I am of Descartes. But few beliefs are of this kind, and those that are seem unrelated to the bulk of my beliefs. In practice, most of our beliefs seem to rest upon a circular or circuitous chain of justification, which, if large enough, might be held to constitute adequate justification. The problem, though, is that people can choose to live in different circles.


People typically justify, or try to impose, their beliefs by means of arguments. Arguments provide reasons (or premises) in support of a particular claim or conclusion. There are two broad kinds of argument, deductive and inductive. In a deductive or ‘truth-preserving’ argument, the conclusion follows from the premises as their logical consequence. In an inductive argument, the conclusion is merely supported or suggested by the premises. More often than not, arguments are implicit, meaning that their rational structures are not immediately apparent and need to be made explicit by analysis.

A deductive argument is valid if the conclusion follows from the premises, regardless of the truth or falsity of the premises.

All organisms with wings can fly. (Premise 1, False)

Penguins have wings. (Premise 2, True)

Therefore, penguins can fly. (Conclusion, False)

This deductive argument is valid, even if it is unsound. For a deductive argument to be both valid and sound, all of its premises have to be true.

All mammals are warm-blooded. (Premise 1, True)

Bats are mammals. (Premise 2, True)

Therefore, bats are warm-blooded. (Conclusion, True)

Though a deductive argument appears to bring out a truth, that truth was already contained in the premises. For an inductive argument, the equivalent of soundness is cogency. An inductive argument is cogent if its premises are true and they render the truth of the conclusion probable. Every flamingo that I’ve ever seen has been pink. Therefore, it’s very probable that all flamingos are pink, or that flamingos are generally pink.
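To make the idea of validity-by-form concrete, here is a minimal sketch in Python, offered as my own illustration rather than as part of the argument above: it brute-forces every interpretation, over small domains, of the form “All A are B; x is A; therefore x is B,” and confirms that the conclusion is true whenever both premises are.

```python
# A minimal sketch of validity-by-form: the penguin argument has the form
#   All A are B;  x is A;  therefore, x is B.
# The form is valid if, in every interpretation in which both premises are
# true, the conclusion is also true, regardless of whether the premises are
# true of the actual world. Here we brute-force all interpretations over
# small domains and look for a counterexample.
from itertools import product

def form_is_valid(max_domain_size: int = 4) -> bool:
    for n in range(1, max_domain_size + 1):
        domain = range(n)
        memberships = list(product([False, True], repeat=n))
        for A, B, x in product(memberships, memberships, domain):
            all_A_are_B = all(B[i] for i in domain if A[i])  # Premise 1
            x_is_A = A[x]                                    # Premise 2
            x_is_B = B[x]                                    # Conclusion
            if all_A_are_B and x_is_A and not x_is_B:
                return False  # a counterexample would make the form invalid
    return True

print(form_is_valid())  # True: the form is valid, even though the premise
                        # "all organisms with wings can fly" is false
```

Soundness is then a separate question: the penguin argument passes this check but is unsound, because its first premise is false, whereas the bat argument is both valid and sound.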

A third form of reasoning, abductive reasoning, involves inference to the best explanation for an observation or set of observations, for example, diagnosing a disease from a constellation of symptoms. But once broken down, abductive reasoning can be understood as a shorthand form of inductive reasoning.

Obviously, arguments often fall short. A logical fallacy is some kind of defect in an argument, and may be unintentional or intentional (with the aim to deceive). A formal fallacy is a deductive argument with an invalid form: the argument is invalid regardless of the truth of its premises. An informal fallacy is an argument that can only be identified by an analysis of the content of the argument. Informal fallacies are frequently found in inductive arguments, and often turn on the misuse of language, for example, using an ambiguous word with one meaning in one part of the argument and another in another (fallacy of equivocation). Informal fallacies can also distract from the weakness of the argument, or appeal to the emotions rather than to reason: “Will someone please think of the children!”

Science principally proceeds by induction, through the study of large and representative samples. An important problem with inductive reasoning is that the observations involved do not in themselves establish its validity, except by induction! A turkey that is fed every morning without fail expects to be fed every morning, until the day the farmer wrings its neck. For this reason, induction has been called ‘the glory of science and the scandal of philosophy’. This is an even bigger problem than it seems, since inductive arguments usually supply the premises for deductive arguments, which, as we have seen, merely bring out what is already contained in their premises. The 20th century philosopher Karl Popper argued that science actually proceeds by deduction, by making bold generalizations and then seeking to falsify them (or prove them wrong). He famously argued that if a proposition cannot be falsified, then it is not in the realm of science. But if Popper is right, then science could never tell us what is, but only what is not.

As we have seen, justification is hard to come by. But there is another problem lurking in the tripartite theory of knowledge. In 1963, Edmund Gettier published a two-and-a-half page paper showing that it is possible to hold a justified true belief without this amounting to knowledge. Here is my own example of a Gettier-like case. Suppose I am sleeping in my bed one night. Suddenly, I hear someone trying to unlock the front door. I call the police to share my belief that I am about to be burgled. One minute later, the police arrive and apprehend a burglar at my door. But it was not the burglar who made the noise: it was a drunken student who, coming home from a party, mistook my house for his own. While my belief was both true and justified, I did not, properly speaking, have knowledge. Responses to the Gettier problem typically involve elaborating upon the tripartite theory, for example, stipulating that luck or false evidence should not be involved. But these elaborations seem to place the bar for knowledge far too high.

As Gettier made clear, it is not so easy to identify instances of knowledge. Instead of defining the criteria for knowledge and, from these criteria, identifying instances of knowledge, it might be easier to work the other way, that is, begin by identifying instances of knowledge and, from these instances, derive the criteria for knowledge. But how can we identify instances of knowledge without having first defined the criteria for knowledge? And how can we define the criteria for knowledge without having first identified instances of knowledge? This Catch-22, in one form or another, seems to lie at the bottom of the problem of knowledge.

