AskDwightHow.org 365/24/7
THE 14:24 GUEST HOUSE





Where do we get our maps of what's out there and how things work?
​
Where do we get our life maps? We get them from our parents and other family members, from our peers, and from our teachers. We get them from our philosophers, our ministers, our chiropractors, and our doctors. We get them from our media, our icons, and our bad guys, from our neighbors, advertisements, YouTube, and gossip. We get them from our language. In short, we get them from our culture and sub-culture and from everything in our environment, both internal and external, that is interpreted as "factual" in some way.
​
We also get them from the thoughts that just happen to occur to us, like, "I have to do this," with no further questioning about the validity of that thought.
​
All these maps come mostly ready-made. Many maps contradict other maps. And it's easy to find maps and map systems with internal contradictions, where they say or suggest two different things that cannot both be correct.
​​
​
Maps (beliefs) come into style and go out of style
Maps go in and out of style, often having little relationship to what anyone actually has, or ever had, any evidence for. An example of a belief that has "gone out of style" but once dominated people's lives, especially in Europe between 1400 and 1517, is the idea of indulgences. The set of interlocking beliefs that supported the practice of paying indulgences to the Catholic Church was this.
​
1. What believers (which was almost everyone at that time) thought about their dead ancestors or other people important to them. I'll use the example of the dead person being your father.
- After death, unless he was already a saint, the father’s soul was thought to be in purgatory.
- Purgatory was imagined as a place of fire, suffering, and purification — not eternal damnation, but temporary punishment to cleanse the soul before heaven.
- The length and intensity of this suffering were proportional to the sins not fully atoned for in life.
2. What an indulgence was believed to do for the father.
- An indulgence was believed to remove some or all of the temporal punishment due for sin.
- If the indulgence was applied to the father’s soul, then:
- Without the indulgence: the father might have to spend, say, decades or centuries in purgatory’s fire, suffering intensely.
- With the indulgence: the father’s “sentence” would be reduced, perhaps shortened by years, or even canceled altogether if the indulgence was “plenary” (full).
- In practice, many preachers and sellers simplified the message: “Pay this sum, and your father’s soul will be freed today.”
3. How people imagined the experience.
- They thought indulgences had a real, immediate effect on the father’s condition:
- He would suffer less, or for less time.
- In the most optimistic view (especially with a plenary indulgence), the father’s soul might be released instantly from purgatory and enter heaven.
- Indulgence preachers often dramatized this. Some (like Johann Tetzel) told people that the very moment the coin was paid, the soul of their loved one would “fly out of purgatory.”
4. The emotional payoff.
- For the living son or daughter: relief from guilt (“I can do something for my father after death”) and hope that their father was spared torment.
- For the father (in their belief): immediate easing of pain and sooner entry into the joy of heaven.
​
​
The arrogance of thinking that our map of reality does not need serious and constant questioning
​
We look back on the past (or even on those in our own time whose beliefs about what is true are substantially different from our own) and believe that somehow we (or the people with beliefs similar to ours) have really got it right this time.
​
​
"Don't dare change this map!"
​
Many maps, as part of the map itself, create taboos about questioning the map. Maps often try to protect themselves. They attempt to prevent competition by villainizing other maps or map systems. Almost all maps rely to some extent on confirmation bias to keep themselves from "changing too much."
​
It is a rare map that encourages the questioning of itself. Case in point: I had a rare mother in that she always encouraged me to think for myself (in other words, part of her world map was, at least to some extent, to encourage the questioning of maps), even when it led to my questioning some of her own maps, for example, her beliefs that "children should have time to play" and "persistence is a good thing." Although it may have made her uncomfortable, she respected it when I took issue with some aspect of her maps.
​
​
Our most important map is one that encourages us to question our maps
​
​​
Good science, as well as good engineering, is like that because it is grounded in requiring evidence, in constant questioning and challenging, and in non-contradictory reasoning. The goal is to have maps congruent enough with how things are and how they work that we are most likely to be able to guide our thoughts and behaviors toward living well here on earth.
​
​
How did we develop maps before the idea of science arose to help with good map development?
​
1. Authority and Tradition
- People accepted beliefs handed down by elders, rulers, priests, or teachers.
- Ancient societies leaned on the wisdom of ancestors or revered texts (e.g., the Vedas, the Torah, Homer’s epics).
- Tradition often carried more weight than new observation, since continuity was seen as proof of truth.
2. Revelation and Religion
- Many truths were considered revealed by gods, spirits, or divine messengers.
- Prophets, shamans, oracles, and priests acted as mediators of knowledge.
- Religious texts or rituals provided explanations for natural events, morality, and human destiny.
3. Philosophical Reasoning (without empirical testing)
- Especially in Greece, India, and China, intellectuals sought truth through logic, dialectic, and speculation.
- Belief could rest on coherence with philosophical systems (e.g., Confucian harmony, Platonic ideals).
- Argument and rhetoric often carried more weight than experiment.
4. Consensus and Custom
- Communities often assumed “if everyone believes it, it must be true.”
- Shared myths, taboos, and collective practices reinforced common truths.
- Culture acted as the validating framework: beliefs were true if they fit the accepted worldview.
5. Personal Experience and Intuition
- Dreams, visions, and subjective experiences were treated as reliable knowledge.
- Intuition and “common sense” (as framed by culture) were accepted guides.
- Folk wisdom and proverbs often codified these insights.
6. Magic, Superstition, and Symbolism
- Belief in omens, astrology, charms, and ritual actions as ways of knowing or influencing reality.
- Symbols (e.g., numbers, animals, stars) were thought to carry intrinsic truth.
- Causality was often inferred from coincidence or tradition (e.g., if a rooster crows and the sun rises, the rooster must cause it).
​
​
Given the grip of cognitive biases, it's amazing that the idea of science and engineering ever developed
​
The stickiness of pre-scientific belief systems isn’t just about a lack of tools; it’s about how the human brain naturally works. Many of our built-in cognitive biases made those ways of deciding “feel” right, even when they weren’t reliable. Here are the most obvious ones:
​
1. Authority Bias
- People tend to believe what leaders, priests, kings, or elders say simply because of their status.
- Example: A decree from a king or pronouncement by a priest was accepted as truth without question.
2. Tradition/Status Quo Bias
- A strong preference for the familiar and inherited ways of doing things.
- Beliefs handed down for generations felt safer and truer than new ideas.
3. Confirmation Bias
- Tendency to notice and remember information that supports what one already believes, and to ignore contradictions.
- Example: If a ritual was followed and rain came, people remembered the “success” but forgot all the times it failed.
4. Post Hoc Fallacy (Illusory Causation)
- Assuming that because event B followed event A, A must have caused B.
- Example: Sacrificing to a god before battle and then winning = belief that the sacrifice caused the victory.
5. Appeal to Consensus (Bandwagon Effect)
- If most people believe something, individuals feel it must be true.
- This reinforced shared myths and collective religious stories.
6. Patternicity (Apophenia)
- Humans are wired to see patterns, even in randomness.
- Example: Seeing constellations as meaningful figures, interpreting comets as divine messages.
7. Anthropomorphism
- Attributing human intentions and emotions to natural forces.
- Storms, rivers, or stars were believed to act with purpose, like people or gods.
8. Narrative Bias
- Preference for stories that give meaning and coherence over raw, messy facts.
- Myths, parables, and sacred histories explained why the world was the way it was — more satisfying than “we don’t know.”
9. Ingroup Bias/Tribalism
- Trusting the knowledge of one’s group or culture more than outsiders.
- New or foreign ideas were often rejected not on evidence, but on the source.
10. Availability Heuristic
- Judging truth by what is most vivid, memorable, or recently experienced.
- If someone dreamed of their dead father and then something happened, the dream seemed prophetic.
​
​
The basic ideas of science and good engineering are missing in our everyday life
​
In reviewing each of the above ten cognitive biases and reflecting on the everyday lives of my clients and friends, especially in the areas of their lives where they have stress, anxiety, upset, and other forms of suffering, it is crystal clear how one or more of these biases was an essential component in causing that suffering.
​
And, even in myself I can see some of these biases sneaking in and making things more difficult for me.
​
​
If we were using good science and engineering principles in our everyday life, then...
​
Whenever we felt any suffering or upset, we would have developed the habit of asking ourselves questions like these:
​
Clarify the Problem
- What exactly is the problem I’m facing?
- What assumptions am I making about it?
- What would “success” or “resolution” look like for me?
Look at the Facts
- What do I actually know for sure, and what am I uncertain about?
- What are the real facts versus my interpretations, fears, or guesses?
- What information could I gather (from experience, others, or data) to see this more clearly?
Explore Possibilities
- What are the different ways of looking at this situation?
- What signs might show me that my current perspective could be wrong?
- What are some other options or approaches I could try?
Check Myself
- Where might my own habits, emotions, or biases be shaping how I see this?
- What blind spots or sources of error might be making the problem seem bigger or different than it is?
Weigh the Wider Impact
- Who or what else is affected by how I handle this?
- What risks, costs, or unintended results might come with my choices?
- How does this connect to my deeper values or long-term goals?
Final Reality Check
- If I turn out to be wrong, how would I discover that — and am I ready to adjust?
​
​
Developing better personal maps of our reality
​
In the areas of science and engineering, the maps are generally always improving, because scientists and engineers keep questioning them and noticing evidence that doesn't seem to fit the current maps. I, like most of us, feel much more secure trusting the maps of science and engineering than the maps that others think are true of their own personal lives and the world around them.
I am talking about beliefs (maps) that people think are facts, beliefs that, depending upon the person, include things like:
"I'm unlucky,"
"I just know this project is going to be a success,"
"There is no way my wife and I would ever get divorced,"
"There are no good jobs out there,"
"It's their fault,"
"You just can't trust people,"
"She betrayed me,"
"People like that are impossible to deal with,"
"I should try harder,"
"Stranger, Danger,"
"It's the ignorant, selfish, greedy people who are ruining this world,"
"It is so obvious they are wrong,"
"There is no way she'd say yes if I asked her out,"
"Life is hard,"
"The end of the world is coming soon,"
"There are just things I have to do,"
"Helping others and caring for others is what makes you a good person,"
"No pain, no gain,"
"Persistence is everything,"
and thousands of others.
​
Scientists and engineers insist on precisely defined terms, terms clear enough that whatever is being referenced has clear criteria for knowing whether something is true or not, like "These gold nuggets weigh 25.7 grams." A scientist or engineer will know ways to determine to what extent the item being weighed is or is not "gold," because "gold" has been clearly distinguished, as has what a "gram" of something is.
​
Most of the above assertions (maps/beliefs) don't even get off the ground. In the science of logic, such seeming assertions would be classified as semantically ill-formed or meaningless.
​
A valid response (although maybe not a welcome one) would be “That’s not even a meaningful proposition, so it can’t be true or false.”
You might go on to explain, "Only truth-apt sentences — those that can, in principle, correspond to a state of affairs — can be true or false. A semantically ill-formed sentence lacks the conditions for truth-value assignment."
​
Or, with one of the less complicated, seemingly valid assertions, like "I'm unlucky," you might ask:
- "In what specific circumstances did you not get what you wanted or intended?"
- "By 'unlucky,' are you implying that something outside of your influence was instrumental in your not getting what you wanted or intended?"
- "The word 'unlucky' has a pejorative connotation. Are you also asserting that something is wrong about your not getting what you wanted or intended? If so, what evidence do you have for that?"
​
So many possible but unclear distinctions could be wrapped up inside the word "unlucky" in the seeming assertion "I'm unlucky."
Without getting clear about what those are in some understandable, even semi-distinct way, we are not dealing with assertions that qualify as truth-apt sentences. Only such sentences have any chance of being evaluated as either true or false.
Let's consider another of the above assertions, one that appears to be a truth-apt assertion:
"I just know this project is going to be a success."
​
Let's assume it's clear what "this project" means, and also clear what evidence we would have, by an understood completion date, that it was either a "success" or "not a success."
​
​
An upset waiting to happen
​
Someone who says something like this is likely indulging in an expectation: not just an intention or a prediction based upon certain actions taking place, but an expectation in the sense that there will be something wrong with himself or herself, or with another or others, or even with reality, if the project does not succeed as specified.
​
​
A very poor map, indeed
​
Not only that, by saying they "know," they are implying that they are something like 99% or 100% sure. Given the limitations of their knowledge, the unpredictable nature of many things in the world, the questionable reliability of the information the person is relying on, and the cognitive biases likely at play in making this assertion (overconfidence bias, optimism bias, confirmation bias, illusion of control, anchoring bias, hindsight bias in its preemptive form, faith in intuition/the affect heuristic, and groupthink), it is likely that the assertion is very close to false, making it a very poor mind map with which to negotiate his or her world.
​
​