In your opinion, is the human being first and foremost rational? It’s a big question, one that can fuel endless philosophical debate. But from the strict standpoint of the sciences that study how our thinking machine works, we have a good idea of the answer… even if we still have much to learn about that fascinating organ, the brain.

What we’ve learned in recent years, thanks in part to advances in neuroscience, is shedding new light on the way we think. The value of these discoveries is immense since they offer us keys not only to thinking more intelligently, but also to better educating our young people and making our society more tolerant, creative and adaptable. Let’s dive in!

Thinking in shortcut mode

It was in the 1960s that scientists began to question the idea that human beings behave primarily rationally. The English psychologist Peter Cathcart Wason is credited with identifying confirmation bias: our tendency to give more weight to information that confirms our hypotheses than to information that contradicts them. This discovery grew out of Wason’s interest in how difficult researchers find it to integrate Popper’s criterion of refutability into the scientific process; recall that this criterion implies that a scientific investigation should rest on experiments capable of contradicting the hypothesis, rather than on attempts to validate it (Karl Popper, The Logic of Scientific Discovery, 1934). To demonstrate the existence of confirmation bias, Wason devised a test of logical reasoning, the “Wason selection task” or “Wason four-card task”, which very few of us pass.
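To see why the task trips us up, here is a minimal Python sketch of its classic letter/number version; the specific cards (A, K, 4, 7) and the rule below are the textbook example, not details given in this article. The rule to test is “if a card has a vowel on one side, it has an even number on the other”, and, Popper-style, the only cards worth turning over are those whose hidden face could refute the rule.

```python
# A minimal sketch of the classic Wason four-card task (illustrative
# values, not from the article). Cards show A, K, 4 and 7; the rule:
# "if a card has a vowel on one side, the other side is even".
# A card matters only if its hidden face could FALSIFY the rule.

VOWELS = set("AEIOU")

def can_falsify(visible: str) -> bool:
    """Return True if the card's hidden face could disprove the rule."""
    if visible.isalpha():
        # A visible vowel is refuted by an odd number on the back;
        # a visible consonant says nothing about the rule either way.
        return visible in VOWELS
    # A visible odd number is refuted by a vowel on the back;
    # a visible even number can only confirm, never refute.
    return int(visible) % 2 == 1

for card in ["A", "K", "4", "7"]:
    verdict = "turn it over" if can_falsify(card) else "leave it"
    print(f"{card}: {verdict}")
```

Running the sketch shows that only “A” and “7” need checking; the “4” that most of us instinctively pick can only confirm the rule, never refute it — confirmation bias in action.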

The concept of cognitive bias itself – which differs from the notion of cognitive distortion used in clinical psychology – was developed by psychologists Daniel Kahneman (2002 Nobel Prize in Economics) and Amos Tversky. They used it in the early 1970s to explain certain tendencies to make irrational and erroneous decisions in the economic sphere. A cognitive bias is a form of thinking that falsely appears to follow logical and rational reasoning, and that leads to skewed analyses and judgments. These mental shortcuts are mostly unconscious and systematic: although we can correct for them after the fact, we cannot stop the brain from producing them.

Cognitive bias is often confused with heuristic thinking (or judgment heuristics), another notion brought to light by the same duo of psychologists. A heuristic is a cognitive strategy that we all regularly use to make a decision, form a judgment, solve a problem, predict a value or estimate a probability. In other words, it is a mental shortcut that reduces the amount of relevant information to be considered and bypasses part of the rational processing that full analytical reasoning would require.

Put this way, heuristics admittedly look hard to distinguish from cognitive biases. The nuance is that, unlike a cognitive bias, a heuristic is a cognitive strategy that often proves very efficient… but not always; and when it fails, it generates a cognitive bias. As with cognitive biases, there are various types of heuristics. For example, the expertise heuristic, which consists of giving more weight to the arguments of an expert than to those of a beginner in the same field, can be useful and effective provided that the expert’s skills are genuinely recognized in the field and that the field itself rests on sound knowledge. If not, this heuristic will lead us either to overestimate the value of the expert’s arguments or to underestimate those of the beginner. It has then become a cognitive bias!

Following in the footsteps of the two-system model

While the American philosopher and psychologist William James (1842-1910) was the first to hypothesize that we have two types of thinking, it was Daniel Kahneman who popularized the idea that our cognition comprises two systems of thought: one fast, intuitive and emotional, known as “system 1” (S1), and the other slower, reflective and logical, known as “system 2” (S2). In his book Thinking, Fast and Slow (2011), Kahneman synthesizes the discoveries made during the 20th century in the psychology of reasoning and decision-making. Note that the idea of “systems” is a metaphor: these processes, which are relatively independent of each other, rest on various physicochemical mechanisms in the brain.

The development of S1 is thought to be a legacy of our evolutionary past, when survival depended largely on our ability to respond quickly to our environment. This cognitive system is the one we use, among other things, to identify emotions on a face, but also, thanks to its capacity for intuitive associations, to generate our creative impulses. Less effortful, it also consumes less glucose (the brain’s fuel) than system 2, and is therefore our default reasoning system. “The Law of Least Effort applies as much to thinking as it does to physical effort. If several approaches [to decision making] can achieve the same goal, the method that uses the least amount of energy will be the most popular,” Kahneman explains in his book. S1 relies on heuristics, which are sometimes right and sometimes wrong, and in the latter case lead to cognitive biases. “System 1 aims at a consistent interpretation of external stimuli, developing very convincing, but mostly wrong and over-simplistic scenarios,” says Kahneman. S1 treats all new information as accurate and seeks out information that supports its view.

The more analytical S2 is slower than S1 and requires a degree of attention and concentration. It is S2 that switches on to solve complex or novel problems that S1 cannot answer. It is also activated when an event contradicts S1’s vision of the world; the surprise one then feels comes with a surge of conscious attention. Kahneman also links this surprise effect to learning: “… you are more likely to learn something by being surprised by your own behaviour than by learning surprising facts about people in general.” This link has been corroborated by neuroscientist Stanislas Dehaene in his research on the conditions necessary for learning (see Learning in 4 Steps). S2 also underlies the state of flow, that state of deep psychological well-being, concentration and motivation one feels when fully engaged in an activity (see 8 key elements of learner engagement).

However, S2 alone should not be regarded as our faculty of thought, since both systems are involved in forming every decision and judgment. To use as little energy as possible, according to Kahneman’s model, the brain first resorts to the intuitions of S1 and switches to S2 only if necessary. The power of this second system over the first varies from one person to another and is, according to Kahneman, a determining factor in our ability to identify and correct cognitive biases.

Elucidating the paradox between Piaget and Kahneman

The strategies of S2 were studied in the 20th century by Jean Piaget (1896-1980), who hypothesized that, through cognitive development, this rational system – called “logical-mathematical” in his theory – comes to take precedence over the “illogical” system in adulthood. This conception of the development of intelligence is called the “linear” or “staircase” model, with well-defined stages progressing from automatisms (in childhood) to reflective thinking (in adulthood). As noted above, Kahneman maintains, on the contrary, that even in the adult brain the two forms of thought not only coexist but that the automatisms of S1 dominate the reflective thinking of S2. A new generation of researchers in developmental psychology, including Olivier Houdé, director of the LaPsyDÉ CNRS laboratory at the Sorbonne, have sought to resolve this paradox between Piaget’s and Kahneman’s models.

For their studies on reasoning and decision-making, Houdé and his team combined experimental cognitive psychology techniques with functional magnetic resonance imaging (fMRI). A tool of choice in neuroscience, fMRI makes it possible to reconstruct the brain’s activity in fine detail and in real time. The researchers found two systems similar to Kahneman’s: one they called the “heuristic system” – Kahneman’s S1 – which is fast but not always reliable and produces automatic, intuitive thinking; and a second they called the “algorithmic system”, which produces logical-mathematical thought – Kahneman’s S2 and Piaget’s logical-mathematical thinking. This second system is slower than the first but much more reliable. At any given moment, our brain works in one mode or the other.

The third system… or the key to intelligence?

How does Olivier Houdé’s model differ from Kahneman’s? By the discovery of a third system! Housed in the prefrontal cortex – the cortex of cognitive control, abstraction and logic – this third system, known as the “inhibition” system, is, according to Houdé, nothing less than the “key to intelligence”. Its arbitration function allows the heuristic system to be interrupted at the right moments and the algorithmic system to be activated instead. “It’s as if the DNA of knowledge had been discovered,” Houdé says. In his book Apprendre à résister (2014, new ed. 2017), which presents the fruit of 20 years of research in his laboratory, Houdé prefers the term “cognitive resistance system”, which he finds better accepted by the general public: “inhibition” carries a negative connotation, whereas in this model its role is eminently positive. The word “resistance” nevertheless calls for a clarification: the enemy to be resisted, in this case, is not external but internal – the heuristics produced by our own brain.
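As a rough way to picture this arbitration, here is a toy Python sketch. It is emphatically not Houdé’s actual model: the function names, the dictionary fields and the looks_like_a_trap cue are all invented for illustration. It only captures the control flow the article describes: a cheap default system, a costly reliable one, and an arbiter deciding when to interrupt the former in favour of the latter.

```python
# A toy sketch of the three-system idea -- NOT Houdé's actual model.
# All names and fields here are invented for illustration.

def heuristic_system(problem: dict) -> str:
    """Fast, automatic, low-cost... and sometimes wrong."""
    return problem["gut_answer"]

def algorithmic_system(problem: dict) -> str:
    """Slow, effortful, logical-mathematical -- but reliable."""
    return problem["worked_answer"]

def inhibition_system(problem: dict) -> bool:
    """Arbiter: should the heuristic be interrupted for this problem?
    A crude flag stands in for the subtle cues real cognition uses."""
    return problem["looks_like_a_trap"]

def think(problem: dict) -> str:
    # Default to the cheap heuristic; escalate only when inhibition fires.
    if inhibition_system(problem):
        return algorithmic_system(problem)
    return heuristic_system(problem)

# The classic bat-and-ball item (popularized by Kahneman): a bat and a
# ball cost $1.10 together, and the bat costs $1.00 more than the ball.
bat_and_ball = {
    "gut_answer": "the ball costs 10 cents",    # the intuitive trap
    "worked_answer": "the ball costs 5 cents",  # the algebraic answer
    "looks_like_a_trap": True,
}
print(think(bat_and_ball))  # -> the ball costs 5 cents
```

The point of the sketch is only the control flow: unless the arbiter fires, the cheap system’s answer goes out unchecked, which is exactly how the intuitive “10 cents” slips through.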

Unlike the first two systems, which are thought to develop in parallel from birth, the inhibition system appears later. “Anatomically, the inhibitory system is the area of the brain that develops the latest and slowest. The maturation of the prefrontal cortex does not begin until 12 months of age and lasts until adulthood. This is why children, like adults, have difficulty inhibiting. This is a fundamental fact for education: this is what we have to work on,” says Houdé.

Intelligence consists of arbitrating, i.e. determining the situations in which reflection must replace spontaneity. Learning to resist, to overcome our automatisms, is both the motor of human development and a watchword for our times.

– Olivier Houdé (Sciences Humaines)

More precisely, the maturation of the prefrontal cortex continues into the early thirties, and despite the development of our logical thinking, intuitions and heuristics persist into adulthood. Cognitive development is therefore not linear, as Piaget suggested, but rather sawtooth-shaped. “The performance gap (success/failure) is the rule of cognitive development, not the exception! To develop is not only to build and activate new cognitive strategies but also to learn how to inhibit existing strategies that compete in the brain,” explains the director of the LaPsyDÉ laboratory.

Competition and “interference” are so prevalent in the brain that the algorithmic system’s logical-mathematical reflective thinking is easily derailed, even in young adults considered very “rational”. As Houdé points out: “It doesn’t actually take much for these algorithms to be short-circuited by perceptual traps, an emotion or belief.” He and his team have observed this in the reasoning and decisions of young adults, including engineering students, who in some tasks proved prone to basic errors of deductive logic even though they knew the rules of logic (“algorithms”) involved. Repeating these already-mastered rules would not guard against such errors… the solution is rather to develop one’s “cognitive resistance”, thereby strengthening the third system.

Developing Cognitive Resistance

“To put it simply, cognitive resistance is learning to think against yourself!” summarizes Olivier Houdé in a video on the subject. According to his findings, the prefrontal cortex, which we all possess and which is the seat of our third system, is “poorly trained and surprisingly under-utilized”. And practising logic, as we have seen, is not enough to improve one’s cognitive resistance. To become better at inhibiting one’s automatisms and engaging one’s reflective thinking, the key is rather to train oneself, in very concrete situations, to doubt, analyze, sort and order the information received. In short, this training amounts to thinking about one’s own thought processes, which is called metacognition (see Metacognition 101). Three types of approach can be adopted to better inhibit one’s automatisms: “Adults, like children, can learn to inhibit inadequate strategies in three ways: either through their own experience based on their failures (denial of predictions, observation of errors) or by imitation or instructions from others,” says Houdé.

It is also essential to develop the reflex to take a step back from one’s emotions. “Our decisions are too often subjective, too fast and, even if the emotion is generally good, too emotional. We have to learn to regret somehow or anticipate the regret of our responses,” adds Houdé, pointing out that our use of digital devices means that decision-making is more than ever at the heart of our brain activity. Never before have human beings made so many decisions in a single day… Not to mention that, according to the researcher, our new technological habits are far from arming us against these automatisms, quite the contrary: “In the world of screens, everything is done to reinforce this rapid heuristic mode, Kahneman’s System 1. And we know that the coexistence of approximate heuristics and exact algorithms is never favourable to algorithms.” We’ve been warned!

It is very difficult to think freely. Our beliefs have endless roots in our distant past, our education, the social environment in which we live, the media discourse and the dominant ideology. Sometimes they prevent us from thinking in a real sense.

– Olivier Houdé (Cerveau et psycho)

Ideally, demystifying the workings of the brain’s three thought systems and developing cognitive resistance should begin in childhood. The discovery of the inhibitory system challenges the traditional view of learning and, by extension, of teaching. “Teaching is always based on the idea of accumulating and activating cognitive functions, and never on the idea of working on the capacities of inhibition,” says Houdé. While the repetition of information is necessary to retain and store knowledge, it also leads to the accumulation of cognitive automatisms. That is why we must, at the same time, equip young brains to counter these automatisms by introducing them to doubt and curiosity. “In my laboratory, we have developed scales of doubt from 0 to 6. And the child is asked if he or she is very sure of his or her answer and how he or she fits on the scale. We do the same thing for curiosity and regret. If we do it systematically, it means we’re teaching the child to awaken these emotions at the same time,” says Houdé, adding that the exercise is essential not only for learning at school but also for tolerance. And according to him, the benefits of cognitive resistance in adults would be no less than in the young: “We can imagine that everywhere, if people learned to think against themselves, we would increase the potential for creativity, social tolerance, tolerance of points of view and, necessarily, a system like that would be more adapted, more evolutionary.”

So, do we start cognitive resistance training right away?

Author:
Catherine Meilleur

Creative Content Writer @KnowledgeOne. Questioner of questions. Hyperflexible stubborn. Contemplative yogi.

Catherine Meilleur has over 15 years of experience in research and writing. Having worked as a journalist and educational designer, she is interested in everything related to learning: from educational psychology to neuroscience, and the latest innovations that can serve learners, such as virtual and augmented reality. She is also passionate about issues related to the future of education at a time when a real revolution is taking place, propelled by digital technology and artificial intelligence.