
How We Learn:

Why Brains Learn Better

Than Any Machine...for Now

by Stanislas Dehaene

After finishing this book in January of 2022, I wrote,

 

"Wow! This book reinforced some important facts I already knew, but it also gave me new insights into how we can all have more fun learning even better than we already do."

 

My clippings below collapse a 352-page book into 10 pages, measured using 12-point type in Microsoft Word.


See all my book recommendations.  


Here are the selections I made:

Two years later, DeepMind engineers used what they had learned from game playing to solve an economic problem of vital interest: How should Google optimize the management of its computer servers? The artificial neural network remained similar; the only things that changed were the inputs (date, time, weather, international events, search requests, number of people connected to each server, etc.), the outputs (turn on or off this or that server on various continents), and the reward function (consume less energy). The result was an instant drop in power consumption. Google reduced its energy bill by up to 40 percent and saved tens of millions of dollars—even after myriad specialized engineers had already tried to optimize those very servers. Artificial intelligence has truly reached levels of success that can turn whole industries upside down.

 

Characteristic of the human species is a relentless search for abstract rules, high-level conclusions that are extracted from a specific situation and subsequently tested on new observations.

 

Learning, in this sense, therefore means managing an internal hierarchy of rules and trying to infer, as soon as possible, the most general ones that summarize a whole series of observations.

 

One team even managed to present a pattern of light to fetuses through the wall of the uterus.21 Surprisingly, the researchers showed that three dots arranged in the shape of a face attracted the fetus more than three dots arranged in the shape of a pyramid. Face recognition seems to start in utero!

 

Babies quickly notice that certain sounds are not used in their language: English speakers never utter vowels like the French /u/ and /eu/, and Japanese speakers fail to differentiate between /R/ and /L/. In just a few months (six for vowels, twelve for consonants), the baby’s brain sorts through its initial hypotheses and keeps only the phonemes that are relevant to the languages that are present in its environment.

 

As Darwin said in The Descent of Man (1871), language “certainly is not a true instinct, for every language has to be learnt,” but it is “an instinctive tendency to acquire an art.”

 

In the human species, the peak of synaptic overproduction ends around two years of age in the visual cortex, three or four years of age in the auditory cortex, and between five and ten years of age in the prefrontal cortex.

 

One of them, Emmanuel Giroux, is a true giant of mathematics and currently heads a laboratory of sixty people at the École normale supérieure in Lyon. Blind since the age of eleven, he is most well-known for his beautiful proof of an important theorem of contact geometry.

 

As Emmanuel Giroux says, paraphrasing The Little Prince, “In geometry, what is essential is invisible to the eye. It is only with the mind that you can see well.” In mathematics, sensory experiences do not matter much; it is the ideas and concepts that do the heavy lifting.

 

When we subtract two numbers, say, 9 − 6, the time that we take is directly proportional to the size of the subtracted number34—so it takes longer to perform 9 − 6 than, say, 9 − 4 or 9 − 2. Everything happens as if we have to mentally move along the number line, starting from the first number and taking as many steps as the second number: the further we have to go, the longer we take. We do not crunch symbols like a digital computer; instead, we use a slow and serial spatial metaphor, motion along the number line. Likewise, when we think of a price, we cannot help but attribute to it a fuzzier value when the number gets larger—a remnant of our primate-based number sense, whose precision decreases with number size.35 This is why, against all rationality, when we negotiate, we are ready to give up a few thousand dollars on the price of an apartment and, the same day, bargain a few quarters on the price of bread: the level of imprecision that we tolerate is proportional to a number’s value, for us just as for macaques. 

The persistence of such analog effects in an educated brain betrays the ancient roots of our concept of numbers.

 

This is a revolution: for millions of years, evolution had been content with fuzzy quantities. Symbol learning is a powerful factor for change: with education, all our brain circuits are repurposed to allow for the manipulation of exact numbers.

 

This region concentrates our learned knowledge of letter strings, to such an extent that it can be considered as our brain’s “letter box.” It is this brain area, for instance, that allows us to recognize a word regardless of its size, position, font, or cAsE, whether UPPERCASE or lowercase.39 In any literate person, this region, which is located in the same spot in all of us (give or take a few millimeters), serves a dual role: it first identifies a string of learned characters, and then, through its direct connections to language areas,40 it allows those characters to be quickly translated into sound and meaning. What would happen if we scanned an illiterate child or adult as she progressively learned to read? If the theory is correct, then we should literally see her visual cortex reorganize. The neuronal recycling theory predicts that reading should invade an area of the cortex normally devoted to a similar function and repurpose it to this novel task. In the case of reading, we expect a competition with the preexisting function of the visual cortex, which is to recognize all sorts of objects, bodies, faces, plants, and places.

 

Could it be that we lose some of the visual functions that we inherited from our evolution as we learn to read? Or, at the very least, are these functions massively reorganized? This counterintuitive prediction is precisely what my colleagues and I tested in a series of experiments. To draw a complete map of the brain regions that are changed by literacy, we scanned illiterate adults in Portugal and Brazil, and we compared them to people from the same villages who had had the good fortune of learning to read in school, either as children or adults.41 Unsurprisingly perhaps, the results revealed that, with reading acquisition, an extensive map of areas had become responsive to written words (see figure 14 in the color insert). Flash a sentence, word by word, to an illiterate individual, and you will find that their brain does not respond much: activity spreads to early visual areas, but it stops there, because the letters cannot be recognized. Present the same sequence of written words to an adult who has learned to read, and a much more extended cortical circuit now lights up, in direct proportion to the person’s reading score. The areas activated include the letter box area, in the left occipitotemporal cortex, as well as all the classical language regions associated with language comprehension. Even the earliest visual areas increase their response: with reading acquisition, they seem to become attuned to the recognition of small print.42 The more fluent a person is, the more these regions are activated by written words, and the more they strengthen their links: as reading becomes increasingly automatic, the translation of letters into sounds speeds up.

 

But we can also ask the opposite question: Are there regions that are more active among bad readers and whose activity decreases as one learns to read? The answer is positive: in illiterates, the brain’s responses to faces are more intense. The better we read, the more this activity decreases in the left hemisphere, at the exact place in the cortex where written words find their niche—the brain’s letter box area. It’s as if the brain needs to make room for letters in the cortex, s...

This highlight has been truncated due to consecutive passage length restrictions.

 

We first made this observation in literate and illiterate adults, but we quickly replicated our results in children who were learning to read.44 As soon as a child begins to read, the visual word form area begins to respond in the left hemisphere. Meanwhile its symmetrical counterpart, in the right hemisphere, strengthens its response to faces

 

Very slowly, the representation of faces changed: as the children became more and more literate, face responses increased in the right hemisphere, in direct proportion to reading scores.

 

So, does literacy lead to a knockout or a blockade of the cortex? Our experiments suggest the latter: learning to read blocks the growth of face-recognition areas in the left hemisphere.

 

The expansion of one stops the other—and because letters settle down in the left hemisphere, which is dominant for language, faces have no choice but to move to the right side.

 

All infants are gifted linguists: as early as eighteen months of age, they easily acquire ten to twenty words a day—but only if they are spoken to.

 

For instance, in children who are read bedtime stories every evening, the brain circuits for spoken language are stronger than in other toddlers—and the strengthened cortical pathways are precisely those that will later allow them to understand texts and formulate complex thoughts.

 

During evolution, four major functions appeared that maximized the speed with which we extracted information from our environment. I call them the four pillars of learning, because each of them plays an essential role in the stability of our mental constructions:

 

These pillars are:

1. Attention, which amplifies the information we focus on.
2. Active engagement, an algorithm also called “curiosity,” which encourages our brain to ceaselessly test new hypotheses.
3. Error feedback, which compares our predictions with reality and corrects our models of the world.
4. Consolidation, which renders what we have learned fully automated and involves sleep as a key component.

 

This is why every student should learn to pay attention—and also why teachers should pay more attention to attention! If students don’t attend to the right information, it is quite unlikely that they will learn anything. A teacher’s greatest talent consists of constantly channeling and capturing children’s attention in order to properly guide them.

 

1. Alerting, which indicates when to attend, and adapts our level of vigilance.
2. Orienting, which signals what to attend to, and amplifies any object of interest.
3. Executive attention, which decides how to process the attended information, selects the processes that are relevant to a given task, and controls their execution.

 

Parents and teachers complain that today’s children, plugged into computers, tablets, consoles, and other devices, constantly zap from one activity to the next and have lost the capacity to concentrate—but this is untrue. Far from reducing our ability to concentrate, video games can actually increase it.

 

The founding father of American psychology, William James (1842–1910), in his The Principles of Psychology (1890), best defined this function of attention: “Millions of items of the outward order are present to my senses which never properly enter into my experience. Why? Because they have no interest for me. My experience is what I agree to attend to. Only those items which I notice shape my mind.”

 

“The art of paying attention, the great art,” says the philosopher Alain (1868–1951), “supposes the art of not paying attention, which is the royal art.”

 

I have already told you about experiments where babies are taught the meaning of a new word, such as “wog.” If the infants can follow the speaker’s gaze toward the so-called wog, they have no trouble learning this word in just a few trials—but if wog is repeatedly emitted by a loudspeaker, in direct relation to the same object, no learning occurs. The same goes for learning phonetic categories: a nine-month-old American child who interacts with a Chinese nanny for only a few weeks acquires Chinese phonemes—but if he receives exactly the same amount of linguistic stimulation from a very high-quality video, no learning occurs.

 

It is not only eye contact that matters: children also quickly understand the communicative intention that lies behind the act of pointing with a finger (whereas chimpanzees never really understand this gesture).

 

But Homo sapiens’ dependency on social communication and education is as much of a curse as it is a gift. On the flip side of the coin, it is education’s fault that religious myths and fake news propagate so easily in human societies. From the earliest age, our brains trustfully absorb the tales we are told, whether they are true or false. In a social context, our brains lower their guard; we stop acting like budding scientists and become mindless lemmings. This can be good—as when we trust the knowledge of our science teachers, and thus avoid having to replicate every experiment since Galileo’s time! But it can also be detrimental, as when we collectively propagate an unreliable piece of “wisdom” inherited from our forebears. It is on this basis that doctors foolishly practiced bloodletting and cupping therapies for centuries, without ever testing their actual impact. (In case you are wondering, both are actually harmful in the vast majority of diseases.) 

 

In adulthood, this social conformism persists and grows. Even the most trivial of our perceptual decisions, such as judging the length of a line, are influenced by social context: when our neighbors come to a different conclusion than us, we frequently revise our judgment to align it with theirs, even when their answer seems implausible.47 In such cases, the social animal in us overrides the rational beast.

 

“Do I dare set forth here,” writes Rousseau in Emile, or On Education, “the most important, the most useful rule of all education? It is not to save time, but to squander it.” For Rousseau and his successors, it is always better to let children discover for themselves and build their own knowledge, even if it implies that they might waste hours tinkering and exploring. . . . This time is never lost, Rousseau believed, because it eventually yields autonomous minds, capable not only of thinking for themselves but also of solving real problems, rather than passively receiving knowledge and spitting out rote and ready-made solutions. “Teach your student to observe the phenomena of nature,” says Rousseau, “and you will soon rouse his curiosity; but if you want his curiosity to grow, do not be in too great a hurry to satisfy it. Lay the problems before him and let him solve them himself.”

 

I have no special talent. I am only passionately curious. Albert Einstein (1952)

 

Curiosity is therefore a force that encourages us to explore. From this perspective, it resembles the drive for food or sexual partners, except that it is motivated by an intangible value: the acquisition of information.

 

Forward blocking provides one of the most spectacular refutations of the associationist view.5 In blocking experiments, an animal is given two sensory clues, say, a bell and a light, both of which predict the imminent arrival of food. The trick is to present them sequentially. We start with the light: the animal learns that whenever the light is on, it predicts the arrival of food. Only then do we introduce dual trials where both light and bell predict food. Finally, we test the effect of the bell alone. Surprise: it has no effect whatsoever! Upon hearing the bell, the animal does not salivate; it seems utterly oblivious to the repeated association between the bell and the food reward. What happened? The finding is incompatible with associationism, but it fits perfectly with the Rescorla-Wagner theory. The key idea is that the acquisition of the first association (light and food) blocked the second one (bell and food). Why? Because the prediction based on light alone suffices to explain everything.
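The blocking effect falls directly out of the Rescorla-Wagner rule, in which every cue present on a trial shares a single prediction error. A minimal simulation sketch (the learning rate of 0.3 and the trial counts are illustrative choices, not values from the book):

```python
# Rescorla-Wagner: each cue i has an associative strength V_i, updated after
# every trial by dV_i = alpha * (lam - sum of V over the cues present),
# where lam = 1.0 on trials where food arrives.

def train(trials, V, alpha=0.3, lam=1.0):
    for cues in trials:
        error = lam - sum(V[c] for c in cues)  # one shared prediction error
        for c in cues:
            V[c] += alpha * error              # every present cue is updated
    return V

V = {"light": 0.0, "bell": 0.0}
train([("light",)] * 50, V)          # phase 1: light alone predicts food
train([("light", "bell")] * 50, V)   # phase 2: light and bell together

print(round(V["light"], 2))  # ~1.0: the light fully predicts the food
print(round(V["bell"], 2))   # ~0.0: the bell is blocked
```

Because the light already drives the prediction error to zero by the end of phase 1, the bell gains essentially no associative strength in phase 2, which is the blocking result described above.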

 

No surprise, no learning: this basic rule now seems to have been validated in all kinds of organisms—including young children. Remember that surprise is one of the basic indicators of babies’ early skills: they stare longer at any display that magically presents them with surprising events that violate the laws of physics, arithmetic, probability, or psychology (see figure on this page and figure 5 in the color insert). But children do not just stare every time they are surprised: they demonstrably learn.

 

Let’s start with an elementary example: Imagine hearing a series of identical notes, A A A A A. Each note elicits a response in the auditory areas of your brain—but as the notes repeat, those responses progressively decrease. This is called “adaptation,” a deceptively simple phenomenon that shows that your brain is learning to predict the next event. Suddenly, the note changes: A A A A A#. Your primary auditory cortex immediately shows a strong surprise reaction: not only does the adaptation fade away, but additional neurons begin to vigorously fire in response to the unexpected sound. And it is not just repetition that leads to adaptation: what matters is whether the notes are predictable. For instance, if you hear an alternating set of notes, such as A B A B A, your brain gets used to this alternation, and the activity in your auditory areas again decreases. This time, however, it is an unexpected repetition, such as A B A B B, that triggers a surprise response.

 

The auditory cortex seems to perform a simple calculation: it uses the recent past to predict the future. As soon as a note or a group of notes repeats, this region concludes that it will continue to do so in the future. This is useful because it keeps us from paying too much attention to boring, predictable signals. Any sound that repeats is squashed at the input side, because its incoming activity is canceled by an accurate prediction. As long as the input sensory signal matches the prediction that the brain generates, the difference is zero, and no error signal gets propagated to higher-level brain regions. Subtracting the prediction shuts down the incoming inputs—but only as long as they are predictable. Any sound that violates our brain’s expectations, on the contrary, is amplified. Thus, the simple circuit of the auditory cortex acts as a filter: it transmits to the higher levels of the cortex only the surprising and unpredictable information which it cannot explain by itself.
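This filter is easy to caricature in code: predict each note from the two before it (a repeated pair predicts repetition, an alternating pair predicts continued alternation), subtract the prediction, and pass on only what remains unexplained. A toy sketch, not the actual cortical computation:

```python
# Pass on only the notes that a simple "keep doing what you were doing"
# prediction cannot explain.

def surprises(notes):
    errors = []
    for i in range(2, len(notes)):
        # if the previous two notes repeat, predict repetition;
        # if they differ, predict that the alternation continues
        predicted = notes[i - 1] if notes[i - 1] == notes[i - 2] else notes[i - 2]
        if notes[i] != predicted:
            errors.append((i, notes[i]))   # unexplained input is propagated
    return errors

print(surprises(["A", "A", "A", "A", "A", "A#"]))  # -> [(5, 'A#')]
print(surprises(["A", "B", "A", "B", "B"]))        # -> [(4, 'B')]
```

The deviant A# after a run of identical notes, and the unexpected repetition of B after an alternating run, are the only events forwarded, matching the two surprise responses described above.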

 

Whatever input a brain region cannot explain is therefore passed on to the next level, which then attempts to make sense of it. We may conceive of the cortex as a massive hierarchy of predictive systems, each of which tries to explain the inputs and exchanges the remaining error messages with the others, in the hope that they may do a better job.

 

Take, for example, the following sentence: “I prefer to eat with a fork and a camel.” Your brain has just generated an N400 wave, an error signal evoked by a word or an image which is incompatible with the preceding context.11 As its name suggests, this is a negative response that occurs at about four hundred milliseconds after the anomaly and arises from neuronal populations of the left temporal cortex that are sensitive to word meaning. On the other hand, Broca’s area in the inferior prefrontal cortex reacts to errors of syntax, when the brain predicts a certain category of word and receives another,12 as in the following sentence: “Don’t hesitate to take your whenever medication you feel sick.” This time, just after the unexpected word “whenever,” the areas of your brain that specialize in syntax emitted a negative wave immediately followed by a P600 wave—a positive peak that occurs around six hundred milliseconds. This response indicates that your brain detected a grammar error and is trying to repair it.

 

Thanks to this predictive learning mechanism, arbitrary signals can become the bearers of reward and trigger a dopamine response. This secondary reward effect has been demonstrated with money in humans and with the mere sight of a syringe in drug addicts. In both cases, the brain anticipates future rewards. As we saw in the first chapter, such a predictive signal is extremely useful for learning, because it allows the system to criticize itself and to foresee the success or failure of an action without having to wait for external confirmation.

 

So, let us spare them this distress and give them the most neutral and informative feedback possible. Error feedback should not be confused with punishment.

 

According to learning theory, a grade is just a reward (or punishment!) signal. However, one of its obvious shortcomings is that it is totally lacking in precision. The grade of an exam is usually just a simple sum—and as such, it summarizes different sources of errors without distinguishing them. It is therefore insufficiently informative: by itself, it says nothing about the reason why we made a mistake, or how to correct ourselves. In the most extreme case, an F that stays an F provides zero information, only the clear social stigma of incompetence.

 

Not only are they imprecise, but they are also often delayed by several weeks, at which point most students have long forgotten which aspects of their inner reasoning misled them.

 

Grades can also be profoundly unfair, especially for students who are unable to keep up, because the level of the exams usually increases from week to week. Let’s take the analogy of video games. When you discover a new game, you initially have no idea how to progress effectively. Above all, you don’t want to be constantly reminded of how bad you are! That’s why video game designers start with extremely easy levels, where you are almost sure to win. Very gradually, the difficulty increases and, with it, the risk of failure and frustration—but programmers know how to mitigate this by mixing the easy with the difficult, and by leaving you free to retry the same level as many times as you need. You see your score steadily increase . . . and finally, the joyous day comes when you successfully pass the final level, where you were stuck for so long. Now compare this with the report cards of “bad” students: they start the year off with a bad grade, and instead of motivating them by letting them take the same test again until they pass, the teacher gives them a new exercise every week, almost always beyond their abilities. Week after week, their “score” hovers around zero. In the video game market, such a design would be a complete disaster. All too often, schools use grades as punishments.

 

We cannot ignore the tremendous negative effects that bad grades have on the emotional systems of the brain: discouragement, stigmatization, feelings of helplessness. . . . Let us listen to the insightful voice of a professional dunce: Daniel Pennac, today a leading French writer who received the famous Renaudot Prize in 20...


 

One group was told to spend all their time studying, in eight short sessions. A second group received six sessions of studying, interrupted by two tests. Finally, the third group alternated four brief study sessions and four tests. Because all three groups had the same amount of time, testing actually reduced the time available for studying. Yet the results were clear: forty-eight hours later, the students’ memory of the word list was better the more opportunities they had to test themselves. Regularly alternating periods of studying and testing forced them to engage and receive explicit feedback (“I know this word now, but it’s this other one I can never remember . . .”). Such self-awareness, or “meta-memory,” is useful because it allows the learner to focus harder on the difficult items during the subsequent study sessions.21 The effect is clear: the more you test yourself, the better you remember what you have to learn.

 

After a few seconds or minutes, working memory already starts dissipating, and after a few days, the effect becomes enormous: unless you retest your knowledge, memory vanishes. To get information into long-term memory, it is essential to study the material, then test yourself, rather than spend all your time studying.

 

It’s easy to put these ideas into practice on your own. All you have to do is prepare flash cards: on one side, you write a question, and on the other, the answer. To test yourself, draw the cards one after the other, and for each card, try to remember the answer (prediction) before checking it by turning to the other side (error feedback). If you get the wrong answer, put the card back toward the top of the pile—this will force you to revisit the same information soon. If you get the right answer, put the card at the bottom of the pile: there is no immediate need to study it again, but it will reappear sooner or later, at a time when forgetting will have begun to take effect. There are now many phone and tablet apps that allow you to build your own collection of flash cards, and a similar algorithm underlies learning software, such as the famous Duolingo for foreign languages.
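The pile routine just described can be sketched in a few lines. The offset of three cards for a missed answer is an arbitrary illustrative choice, and `answer_correctly` is a stand-in for the reader's actual recall:

```python
from collections import deque

def drill(cards, answer_correctly, rounds=100):
    """cards: list of (question, answer) pairs; runs the pile for `rounds` draws."""
    pile = deque(cards)
    for _ in range(rounds):
        card = pile.popleft()                     # draw the top card (prediction)
        if answer_correctly(card[0]):             # flip it over (error feedback)
            pile.append(card)                     # known: send to the bottom
        else:
            pile.insert(min(3, len(pile)), card)  # missed: resurface it soon
    return pile
```

Spaced-repetition software, such as the Duolingo-style apps the text mentions, refines this by growing each card's interval after every success, but the core loop of predict, check, and reshuffle is the same.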

 

THE GOLDEN RULE: SPACING OUT THE LEARNING Why does the alternation of studying and testing have such positive effects? Because it exploits one of the most effective strategies that educational science has discovered: the spacing out of training sessions. This is the golden rule: it is always better to spread out the training periods rather than cram them into a single run. The best way to ensure retention in the long term is with a series of study periods, interspersed with tests and spaced at increasingly large intervals.

 

The rule is simple, and all musicians know it: fifteen minutes of work every day of the week is better than two hours on a single day per week.

 

What is the most effective time interval between two repetitions of the same lesson? A strong improvement is observed when the interval reaches twenty-four hours—probably because sleep, as we will see in a moment, plays a central role in consolidating what we learn.

 

The rule of thumb is to review the information at intervals of approximately 20 percent of the desired memory duration—for instance, rehearse after two months if you want a memory to last about ten months.
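Put as arithmetic, the rule is simply one-fifth of the target duration; the function name and the ten-month example below are mine, not the book's:

```python
def review_interval_days(target_days):
    """Rule of thumb: review at ~20 percent of the desired memory duration."""
    return target_days / 5  # 20 percent = one fifth

# To remember something for about ten months (~300 days),
# rehearse roughly every two months:
print(review_interval_days(300))  # -> 60.0
```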

 

Experiments show that it is better to mix all sorts of different problems, instead of limiting oneself to the most recent lesson, in order to regularly put all of one’s knowledge to the test.

 

Short-term exams, which focus only on what was learned in the preceding weeks, do not guarantee long-term memory. A cumulative review, covering the entire program from the beginning of the year, works much better.

 

While our brain’s central executive is focused on one task, all other conscious decisions are delayed or canceled. Thus, as long as a mental operation remains effortful, because it has not yet been automated by overlearning, it absorbs valuable executive attention resources and prevents us from focusing on anything else. Consolidation is essential because it makes our precious brain resources available for other purposes.

 

The result was clear: what we learn in the morning fades away with time, according to Ebbinghaus’s exponential law; what is learned at midnight, on the other hand, remains stable over time (provided the students had at least two hours of sleep). In other words, sleeping prevents forgetting.

 

Do not underestimate children. At birth, infants possess a rich set of core skills and knowledge. Object concepts, number sense, a knack for languages, knowledge of people and their intentions . . . so many brain modules are already present in young children, and these foundational skills will later be recycled in physics, mathematics, language, and philosophy classes. Let us take advantage of children’s early intuitions: each word and symbol that they learn, however abstract, must connect to prior knowledge. This connection is what will give them meaning.

 

Take advantage of the brain’s sensitive periods. In the first years of life, billions of synapses are created and destroyed every day. This effervescent activity makes the child’s brain particularly receptive, especially for language learning. We should expose children to a second language as early as possible. We should also bear in mind that plasticity extends at least until adolescence. During this entire period, foreign language immersion can transform the brain.

 

Enrich the environment. Learning-wise, the child’s brain is the most powerful of supercomputers. We should respect it by providing it with the right data at an early age: word or construction games, stories, puzzles. . . . Let’s not hesitate to hold serious talks with our children, to answer their questions, even the most difficult, using an elaborate vocabulary, and to explain to them what we understand of the world. By giving our little ones an enrich...


 

Rescind the idea that all children are different. The idea that each of us has a distinct learning style is a myth. Brain imaging shows that we all rely on very similar brain circuits and learning rules. The brain circuits for reading and mathematics are the same in each of us, give or take a few millimeters—even in blind children. We all face similar hurdles in learning, and the same teaching methods can surmount them. Individual differences, when they exist, lie more in children’s extant knowledge, motivation, and the rate at which they learn. Let’s carefully determine each child’s current level in order to sele...

 

Pay attention to attention. Attention is the gateway to learning: virtually no information will be memorized if it has not previously been amplified by attention and awareness. Teachers should become masters at capturing their students’ attention and directing it to what matters. This implies carefully getting rid of any source of distraction: overly illustrated textbooks and excessively de...


 

Keep children active, curious, engaged, and autonomous. Passive students do not learn much. Make them more active. Engage their intelligence so that their minds sparkle with curiosity and constantly generate new hypotheses. But do not expect them to discover...


 

Make every school day enjoyable. Reward circuits are essential modulators of brain plasticity. Activate them by rewarding every effort and making every hour of class fun. No child is insensitive to material rewards—but their social brains respond equally to smiles and encouragement. The feeling of being appreciated and the awareness of one’s own progress are rewards in and of themsel...


 

Encourage efforts. A pleasurable school experience is not synonymous with “effortless.” On the contrary, the most interesting things to learn—reading, math, or playing an instrument—require years of practice. The belief that everything comes easy can lead children to think that they are dunces if they do not succeed. Explain to them that all students must try har...


 

Help students deepen their thinking. The deeper our brain processes information, the better we can remember. Never be content with superficial learning; always aim for deeper understanding. And remember Henry Roediger’s words: “Making learning conditions more difficult, thus requiring s...


 

Set clear learning objectives. Students learn best when the purpose of learning is clearly stated to them and when they can see that everything at their disposal converges toward that purpose. Clearly explain ...


 

Accept and correct mistakes. To update their mental models, our brain areas must exchange error messages. Error is therefore the very condition of learning. Let us not punish errors, but correct them quickly, by giving children detailed but stress-free feedback. According to the Education Endowment Foundation’s synthesis, the quality of the feedbac...


 

Practice regularly. One-shot learning is not enough—children need to consolidate what they have learned to render it automatic, unconscious, and reflexive. Such routinization frees up our prefrontal and parietal circuits, allowing them to attend to other activities. The most effective strategy is to space out learning: a little bit every day. Spac...


 

Let students sleep. Sleep is an essential ingredient of our learning algorithm. Our brain benefits each time we sleep, even when we nap. So, let us make sure that our children sleep long and deep. To get the most out of our brain’s unconscious night work, studying a lesson or rereading a problem just before falling asleep can be a nifty ...


 

All children would probably benefit from knowing the four pillars of learning: attention, active engagement, error feedback, and consolidation. Four slogans effectively summarize them: “Fully concentrate,” “participate in class,” “learn from your mistakes,” and “practice every day, take advantage of every night.” These are very simple messages that we should all heed.
