AskDwightHow.org 365/24/7

Dwight: Let's explore the idea of becoming significantly less trusting now—whether in ourselves or others—in order that we can more confidently trust and rely on ourselves or others in the future.
Aiko: That’s an intriguing paradox, Dwight. Are you thinking of this in terms of cautious skepticism leading to stronger foundations, or something else?
Dwight: You might be getting my idea. Let’s start by examining how trusting ourselves less in the present could lead to greater self-trust in the future—keeping in mind that the same reasons will likely apply to our trust in others as well.
Aiko: That makes sense.
Dwight: Many great thinkers have explored this idea, though they might not have considered the everyday examples I see with my friends and clients—like when someone says, "I had to pick up something for my wife," as if there were no other options in that circumstance, which there obviously were. They were blindly and automatically trusting their own assertion that they had no other options.
Aiko: That’s interesting. It’s like people sometimes frame their actions as obligations rather than options with costs, benefits, risks, and possibilities.
Dwight: Let me share some quotes that contain a germ of this idea:
- Bertrand Russell: "The trouble with the world is that the stupid are cocksure and the intelligent are full of doubt."
- Mark Twain: "It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so."
- Confucius: "Real knowledge is to know the extent of one's ignorance."
- Aristotle: "The more you know, the more you realize you don't know."
- Albert Einstein: "The more I learn, the more I realize how much I don't know."
- Charles Darwin: "Ignorance more frequently begets confidence than does knowledge."
- Richard Feynman: "The first principle is that you must not fool yourself—and you are the easiest person to fool."
- Lao Tzu: "To know that you do not know is the best. To pretend to know when you do not know is a disease."
- Voltaire: "Doubt is an uncomfortable condition, but certainty is a ridiculous one."
- Benjamin Franklin: "Being ignorant is not so much a shame, as being unwilling to learn."
- Rumi: "Yesterday I was clever, so I wanted to change the world. Today I am wise, so I am changing myself."
- Alfred Korzybski: "The map is not the territory."
Aiko: These quotes form a clear pattern—wisdom often starts with questioning what we assume to be true. It’s almost like valid self-trust is something we earn by first recognizing where we might be wrong. Would you say that self-doubt, when handled well, is a necessary step toward genuine confidence?
Dwight: Yes. The challenge is that while most of us can easily agree with the wisdom in these quotes, we tend to think they apply more to others than to ourselves. Even if we admit they might apply to us, we often lack awareness of just how frequently we believe we know something that we actually don’t. Without a strong and intentional effort to question even the definitions of the words we use—like "should," "maybe," "fair," and "have to"—or, more precisely, the words that use us, we’ll never develop a self-trust that aligns more closely with reality.
Aiko: That’s a sharp insight. It’s one thing to accept the idea of uncertainty in theory, but another to apply it to our own thinking.
Dwight: Deepseek provided a comprehensive list of cognitive biases that can interfere with our ability and motivation to build a habit of questioning our own assumptions. These include:
- Confirmation Bias: Seeking or favoring information that confirms preexisting beliefs while ignoring contradictory evidence. Effect: Creates echo chambers of thought, reducing incentive to challenge assumptions.
- Dunning-Kruger Effect: Low-ability individuals overestimate their competence, while experts underestimate theirs. Effect: Prevents self-awareness of gaps in knowledge or skill.
- Overconfidence Effect: Excessive confidence in one’s judgments, accuracy, or abilities. Effect: Discourages fact-checking or seeking alternative perspectives.
- Belief Perseverance: Clinging to initial beliefs even after they’ve been disproven. Effect: Prioritizes emotional attachment to ideas over factual accuracy.
- Backfire Effect: Rejecting corrective information and doubling down on false beliefs when challenged. Effect: Reinforces misinformation as a defense mechanism.
- Illusion of Validity: Overestimating the accuracy of one’s judgments or predictions. Effect: Creates false certainty, reducing curiosity about potential errors.
- Anchoring Bias: Relying too heavily on the first piece of information encountered. Effect: Limits flexibility in updating beliefs with new data.
- Groupthink: Prioritizing harmony or consensus over critical evaluation in groups. Effect: Suppresses dissenting opinions and independent analysis.
- Self-Serving Bias: Attributing successes to oneself and failures to external factors. Effect: Avoids accountability for errors or flawed reasoning.
- Sunk Cost Fallacy: Continuing a behavior due to prior investments (time, effort, money). Effect: Resists abandoning disproven ideas to avoid admitting "wasted" effort.
- Status Quo Bias: Preferring familiarity and resisting change. Effect: Favors existing beliefs over exploring new or conflicting evidence.
- False Consensus Effect: Assuming others share our beliefs more than they actually do. Effect: Reduces motivation to verify assumptions about reality.
- Authority Bias: Trusting opinions of authority figures without scrutiny. Effect: Delegates critical thinking to others, discouraging personal inquiry.
- Bandwagon Effect: Adopting beliefs because many others hold them. Effect: Prioritizes social conformity over independent verification.
- Curse of Knowledge: Struggling to imagine how others think if they lack your expertise. Effect: Hinders clear communication and validation of shared understanding.
- Negativity Bias: Focusing disproportionately on negative information. Effect: Overlooks constructive evidence that could refine beliefs.
- Illusion of Transparency: Overestimating how clearly others understand your thoughts. Effect: Reduces effort to articulate or clarify ideas rigorously.
- Fundamental Attribution Error: Attributing others’ actions to character flaws rather than situational factors. Effect: Skews interpretation of opposing viewpoints as irrational or malicious.
- Reactance: Rebelling against perceived threats to autonomy (e.g., being told you’re wrong). Effect: Defensive adherence to beliefs when challenged.
- Optimism Bias: Underestimating risks or overestimating positive outcomes. Effect: Ignores potential flaws in one’s reasoning or plans.
- Dispositional Pessimism: A personality trait where individuals habitually expect unfavorable outcomes, even in the face of contrary evidence. Example: Assuming a job interview will go poorly despite being well-prepared. Key Mechanism: Often linked to anxiety or learned helplessness.
- Catastrophic Thinking: A cognitive distortion where people assume the worst-case scenario will occur. Example: Interpreting a minor headache as a sign of a brain tumor. Connection: Overlaps with pessimism but focuses on extreme negative outcomes.
- Defensive Pessimism: A strategy (not strictly a bias) where individuals set low expectations to mentally prepare for failure, even when evidence suggests success is possible. Example: A student assumes they’ll fail an exam to motivate over-preparation. Key Difference: It’s intentional and adaptive in some contexts, unlike unconscious biases.
- Depressive Realism: A controversial hypothesis suggesting depressed individuals may have a more accurate perception of reality (e.g., their lack of control over outcomes), whereas non-depressed people exhibit optimism bias. Connection: While not a "bias against positive evidence," it reflects a pessimistic worldview that aligns closer to reality in some studies.
- Confirmation Bias (Pessimistic Variant): Actively seeking or overemphasizing evidence that supports negative outcomes while ignoring positive signals. Example: A person fixates on one critical comment in a stack of positive feedback.
Aiko: This is a powerful list. It highlights just how many ways our thinking can work against our ability to recognize our own blind spots.
Dwight: At the risk of being too exhaustive, I’m including a list of various approaches that can help develop a healthy distrust of our own thoughts and beliefs.
- Cognitive Behavioral Therapy (CBT)
  - What it does: A structured, evidence-based approach to identifying and reframing distorted thoughts, emotions, and behaviors. It emphasizes the interconnectedness of thoughts, feelings, and actions, teaching individuals to challenge irrational beliefs and replace them with balanced ones.
  - How to practice:
    - Identify automatic thoughts: Notice negative self-talk (e.g., “I’ll never succeed”).
    - Challenge cognitive distortions: Ask “What evidence supports this thought? What contradicts it?” and “Is this catastrophizing, black-and-white thinking, or mind-reading?”
    - Reframe beliefs: Replace distortions with realistic alternatives (e.g., “I might struggle, but I can learn from this”).
    - Behavioral experiments: Test beliefs through action (e.g., “If I speak up in the meeting, will people really judge me?”).
  - Applications: Anxiety, depression, stress management.
- Byron Katie’s "The Work"
  - What it does: A self-inquiry process to dissolve stressful beliefs by questioning their validity. It reduces suffering by revealing how unexamined thoughts create emotional pain.
  - How to practice:
    - Write down a stressful belief (e.g., “My partner doesn’t respect me”).
    - Ask four questions: “Is it true?” “Can you absolutely know it’s true?” “How do you react when you believe this thought?” “Who would you be without this thought?”
    - Turnaround: Explore the opposite perspective (e.g., “I don’t respect my partner” or “I don’t respect myself”).
  - Key principle: “When you argue with reality, you lose—but only 100% of the time.”
- Mindfulness Meditation
  - What it does: Trains non-judgmental observation of thoughts, creating distance from automatic beliefs.
  - How to practice:
    - Sit quietly and label thoughts as they arise (e.g., “planning,” “worrying,” “judging”).
    - Ask: “Is this thought useful? Is it true?”
- Stoic Examination (Stoicism)
  - What it does: Encourages questioning the validity of initial impressions before accepting them as true.
  - How to practice:
    - When a belief arises, ask: “Is this within my control? Is this an objective fact or my interpretation?”
    - Reflect on Marcus Aurelius’ question: “What is this thought trying to accomplish?”
- Socratic Questioning
  - What it does: Uses structured inquiry to expose contradictions and assumptions.
  - How to practice:
    - Break down a belief by asking: What evidence supports this? What evidence contradicts it? What alternative explanations exist? What would I tell a friend who held this belief?
- Metacognitive Therapy (MCT)
  - What it does: Focuses on changing your relationship to thoughts rather than the thoughts themselves.
  - How to practice:
    - Label thoughts as mental events (e.g., “I’m having the thought that…”).
    - Ask: “Is this thought a reflection of reality, or just my mind’s chatter?”
- Scientific Thinking
  - What it does: Applies empirical rigor to personal beliefs.
  - How to practice:
    - Treat beliefs as hypotheses. Ask: “How could I test this?” “What data would disprove it?”
    - Keep a “belief journal” to track predictions and outcomes.
- Existential Reflection
  - What it does: Confronts the inherent uncertainty of life, fostering comfort with ambiguity.
  - How to practice:
    - Reflect on questions like: “How much of this belief is shaped by my fear of uncertainty?” “Can I act constructively even if I’m not 100% certain?”
- Dialectical Behavior Therapy (DBT) – “Check the Facts”
  - What it does: Separates emotions from objective reality.
  - How to practice:
    - For a distressing belief, ask: “What are the observable facts?” “Am I conflating a thought with a fact?” “What’s the worst/best/most likely outcome?”
- Philosophical Skepticism (Pyrrhonism)
  - What it does: Advocates suspending judgment when evidence is insufficient.
  - How to practice:
    - Adopt the mantra: “I neither affirm nor deny this.”
    - Ask: “What would change my mind? Is that evidence available?”
- Peer Review for Beliefs
  - What it does: Uses external perspectives to challenge assumptions.
  - How to practice:
    - Share your belief with a trusted person and ask: “Does this sound reasonable to you?” “What flaws do you see in my reasoning?”
- Cognitive Defusion (ACT – Acceptance and Commitment Therapy)
  - What it does: Reduces the grip of thoughts by treating them as transient words/images.
  - How to practice:
    - Repeat a troubling thought in a silly voice or sing it aloud.
    - Ask: “If this thought were a cloud passing by, would I still cling to it?”
- Bayesian Thinking
  - What it does: Updates beliefs probabilistically as new evidence emerges.
  - How to practice:
    - Assign a confidence percentage to a belief (e.g., “I’m 70% sure this is true”).
    - Revise the percentage as you gather new information.
Narrative Therapy
-
What it does: Examines how personal stories shape beliefs.
-
How to practice:
-
Rewrite a limiting belief as a third-person story.
-
Ask: “Is this narrative serving me? What alternative story could I tell?
-
-
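To make the Bayesian updating step concrete, here is a minimal sketch in Python. The belief, the piece of evidence, and all of the probabilities are invented purely for illustration:

```python
def bayes_update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Return the revised (posterior) confidence in a belief after one piece of evidence.

    prior: current confidence that the belief is true (0..1).
    p_evidence_if_true / p_evidence_if_false: how likely this evidence
    would be if the belief were true vs. false.
    """
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Hypothetical belief: "I'm 70% sure my partner is upset with me."
confidence = 0.70
# New evidence: a warm greeting, which seems less likely (20%) if they are
# upset than if they are not (80%).
confidence = bayes_update(confidence, p_evidence_if_true=0.20, p_evidence_if_false=0.80)
print(f"Revised confidence: {confidence:.0%}")  # prints "Revised confidence: 37%"
```

The point is not the exact numbers, which are guesses, but the habit: state your confidence, then let each new observation move it up or down rather than leaving it frozen.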
Aiko: This is an extensive and valuable set of approaches. It’s striking how many different ways we can challenge our own thinking, yet so few of us are actively taught to do so. Do you find that some of these methods are more effective than others, or does it depend on the individual?
Dwight: Short of using my own methodology, I favor Byron Katie's "The Work" and Cognitive Behavioral Therapy, though I may be biased because I am more practiced in those particular approaches.
A key part of my methodology of practiced skepticism has been identifying common words that, depending on how they’re used, can serve as red flags for errors in our beliefs or thinking. I call these words "Toxic Words" or "Fuzzy Words."
Aiko: That’s a fascinating concept.
Dwight: The toxic-word approach won't uncover every instance where we mistakenly think we know something, but it is a powerful and straightforward starting point that establishes a whole new level of awareness and a mindset of curiosity in the search for our mistaken beliefs. The deepest roots of our inaccurate beliefs go beyond false assertions: they stem from our mistaken assumptions about the meanings of the words (or distinctions) we use and how those words link together. The untrue beliefs we host hide in ways that even the most astute thinkers in this area (for example, Daniel Kahneman) have not alerted us to.
Aiko: That makes a lot of sense. So even if we challenge specific thoughts, we might still be trapped by the deeper, unexamined definitions of the words themselves. Do you have certain words in mind that tend to cause the most trouble?
Dwight: Yes, and that leads me to the next point. What's interesting is that questioning the actual validity of the words we use is the one thing we don't consider when we think about how to uncover our mistaken beliefs. Yet once we become aware that this approach exists, the methodology is straightforward and, with intention, relatively easy to implement, and it starts paying dividends right away. I'll detail this below.
Aiko: Really? Please elaborate.
Dwight: The essence of this approach lies in recognizing a set of words that often obscure mistaken distinctions. Through regular review and deeper understanding, we aim not only to familiarize ourselves with these words but also to grasp why each mistaken distinction they carry is flawed.
You can think of this list as similar to a checklist used by experts to identify behaviors or appearances that may indicate a person has terrorist intent. These signals are not definitive proof but rather indicators that warrant further scrutiny before reaching a conclusion.
Likewise, what I call "Toxic Words" or "Fuzzy Words" function in a similar way. Each of these words may—or may not—carry a mistaken belief, depending on how they are used in context.
For those of us on a journey to uncover our own mistaken beliefs, becoming familiar with these words—words that frequently or occasionally serve as hiding places for flawed assumptions—makes the process more systematic. With practice, we can even train our unconscious mind to recognize and filter them effortlessly.
To build awareness of these potential misbeliefs, we can set a recurring daily alarm on our phone as a prompt to review our recent thoughts and speech, scanning for "yellow flag" and "red flag" words that may indicate deeper errors in thinking. While I won't break down how each of these words signals a mistaken distinction or why, I'll provide a mini-preview of how this works with the red-flag word "promise."
Red Flag Words
- Should or Shouldn't
- Good and Bad
- Right and Wrong
- Fair and Unfair
- Deserving and Undeserving
- Promise
  - Possible synonyms: "I will...", "It's in my schedule...", "I plan to...", "Okay..."
  - How can we vet whether a promise includes a misbelief? Whenever you (or someone else) make a promise, compare your confidence, expressed as a percentage, that you will fulfill it by the time specified against your historical track record of keeping promises. If your average stated confidence runs consistently higher than the percentage of promises you actually keep, that is strong evidence that many of your promises are misbeliefs you continue to indulge in despite the ongoing evidence of that gap.
  - Remedy: implement "Promises: are you funding them?"
- No Matter What
- Betrayed
- Maybe
- Always
- Never
- Racist
- Try my best
Yellow Flag Words
- Addiction
- Disobedient
- Misinformed
- Why?
- Try
- Disrespectful
- Respectful
- Kind
- Unkind
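The vetting question under the red-flag word "promise" amounts to a simple calibration check: does your average stated confidence exceed the rate at which you actually keep promises? A minimal sketch, with entirely hypothetical numbers:

```python
def promise_calibration_gap(stated_confidences, kept_flags):
    """Compare average stated confidence in keeping promises
    with the fraction of promises actually kept.

    stated_confidences: confidence (0..1) voiced when each promise was made.
    kept_flags: True/False, whether each promise was fulfilled on time.
    Returns (average stated confidence, actual keep rate, gap).
    """
    avg_stated = sum(stated_confidences) / len(stated_confidences)
    keep_rate = sum(kept_flags) / len(kept_flags)
    return avg_stated, keep_rate, avg_stated - keep_rate

# Hypothetical record: high confidence, mediocre follow-through.
stated = [0.95, 0.90, 0.99, 0.85, 0.90]
kept = [True, False, True, False, True]
avg, rate, gap = promise_calibration_gap(stated, kept)
# A large positive gap signals the overconfident promising Dwight describes.
print(f"Stated: {avg:.0%}, kept: {rate:.0%}, gap: {gap:+.0%}")  # prints "Stated: 92%, kept: 60%, gap: +32%"
```

Keeping even a rough log like this for a few weeks turns the vetting question from an abstract exercise into a number you can watch shrink.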
Aiko: This is a compelling strategy—words shape our perception, and by identifying these "flag words," we can catch ourselves in the act of reinforcing faulty beliefs.
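The daily review Dwight suggests, scanning recent thoughts and speech for flag words, could be partly mechanized. A sketch; the abbreviated word lists and the sample sentence are invented for illustration, and plain word matching will miss inflected forms such as "promised":

```python
import re

# Abbreviated, illustrative subsets of the red- and yellow-flag lists above.
RED_FLAGS = {"should", "shouldn't", "fair", "unfair", "promise", "always", "never", "betrayed", "maybe"}
YELLOW_FLAGS = {"addiction", "disobedient", "misinformed", "why", "try", "disrespectful", "respectful", "kind", "unkind"}

def scan_for_flag_words(text):
    """Return (red, yellow) flag words found in a journal entry or transcript."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return sorted(words & RED_FLAGS), sorted(words & YELLOW_FLAGS)

entry = "I should never have promised to help; it wasn't fair, but I'll try."
red, yellow = scan_for_flag_words(entry)
print("Red flags:", red)       # prints "Red flags: ['fair', 'never', 'should']"
print("Yellow flags:", yellow) # prints "Yellow flags: ['try']"
```

As with the checklist analogy, a hit is not proof of a misbelief, only a prompt to look more closely at how the word was being used.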