Cognitive Bias Buster
Overview
Cognitive Bias Buster helps users recognize when their thinking is being distorted by systematic cognitive errors — and provides practical countermeasures for each. Drawing from decades of research in behavioral economics and cognitive psychology (Kahneman, Tversky, Ariely, and others), this skill catalogs 40+ biases organized into 6 families, with real-time detection prompts and de-biasing techniques for each.
This skill does not eliminate biases — it makes them visible so users can compensate. Biases are features of human cognition, not bugs to be "fixed." The goal is better awareness, not perfect rationality.
When to Use
Use this skill when the user asks to:
- Identify cognitive biases
- Check for bias in a decision
- Understand why their judgment might be off
- List common thinking errors
- De-bias a specific situation
- Learn about cognitive psychology
- Review a decision for bias
Trigger phrases: "Cognitive bias", "Thinking error", "Why am I wrong?", "Bias check", "Am I biased?", "Mental blind spot", "De-bias", "Thinking trap", "Judgment error"
Workflow
Step 1 — Context Gathering
Understand what the user is analyzing:
- What decision, belief, or judgment are you examining?
- What is at stake? (Low, medium, high stakes)
- How much time do you have? (Quick scan vs. deep review)
- Are you reviewing something already decided, or preventing bias in an upcoming decision?
- Who else is involved? (Solo decision, group decision, advice for someone else)
Step 2 — Bias Family Scan
Run through the 6 bias families and flag which might be active in this context. For each family, ask the diagnostic questions.
Family 1: Too Much Information (Filtering Biases)
We are bombarded with information; we filter aggressively — and often wrongly.
| Bias | Signal | Countermeasure |
|---|---|---|
| Availability Heuristic | You judge likelihood by how easily examples come to mind | Ask: "What data am I NOT seeing?" |
| Attentional Bias | You only notice what you're already looking for | Deliberately seek disconfirming evidence |
| Anchoring | First number/info disproportionately influences you | Resist giving the first number; seek multiple anchors |
| Confirmation Bias | You seek, interpret, and remember info that supports what you already believe | Actively look for evidence against your view |
| Observer-Expectancy Effect | You unconsciously influence outcomes to match expectations | Blind yourself to conditions when possible |
| Selection Bias | Your sample is not representative of the whole | Check: "Who is missing from this data?" |
Diagnostic questions: What information came first? What am I ignoring? What would prove me wrong?
Family 2: Not Enough Meaning (Pattern-Making Biases)
We compulsively find patterns and meaning — even in random noise.
| Bias | Signal | Countermeasure |
|---|---|---|
| Apophenia / Patternicity | You see meaningful patterns in random data | Ask: "Is this pattern real or am I imposing it?" |
| Clustering Illusion | You see streaks or clusters that are statistically expected | Check base rates and sample size |
| Gambler's Fallacy | You believe past independent events affect future probabilities | Remind yourself: coins have no memory |
| Hindsight Bias | You believe past events were more predictable than they were | Document predictions before outcomes are known |
| Illusion of Control | You overestimate your influence over outcomes | Distinguish skill from luck in this context |
| Framing Effect | The same info presented differently changes your choice | Reframe the decision in opposite terms |
| Sunk Cost Fallacy | You continue because of past investment, not future value | Ask: "If I hadn't started this, would I start now?" |
Diagnostic questions: Am I seeing a real pattern or imposing one? Would I interpret this differently if the outcome were unknown?
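The gambler's fallacy and the clustering illusion can both be demonstrated with a quick simulation. The sketch below (a minimal illustration, not part of the catalog) estimates the probability of heads immediately after a run of five heads — it stays near 0.5, because independent flips really do have no memory:

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

def next_flip_after_streak(n_flips=100_000, streak_len=5):
    """Estimate P(heads) on the flip immediately following a run of
    `streak_len` heads. The gambler's fallacy predicts it should drop
    below 0.5 ("tails is due"); in fact it doesn't move."""
    flips = [random.random() < 0.5 for _ in range(n_flips)]
    after_streak = [
        flips[i]
        for i in range(streak_len, n_flips)
        if all(flips[i - streak_len:i])  # preceding streak_len flips were all heads
    ]
    return sum(after_streak) / len(after_streak)

print(round(next_flip_after_streak(), 2))  # stays near 0.5: the coin has no memory
```

Note that the simulation also produces many five-heads streaks in purely random data — exactly the clusters the clustering illusion tempts us to read as meaningful.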
Family 3: Need to Act Fast (Action Biases)
We must act quickly to survive — so we jump to conclusions and favor the familiar.
| Bias | Signal | Countermeasure |
|---|---|---|
| Action Bias | You prefer doing something over doing nothing, even when waiting is better | Ask: "What is the cost of waiting 24 hours?" |
| Hyperbolic Discounting | You strongly prefer immediate rewards over larger future ones | Imagine your future self; compute the annualized value of each option |
| Status Quo Bias | You prefer things stay the same | Ask: "If the status quo weren't an option, what would I choose?" |
| Default Effect | You stick with pre-set options | Consciously reject defaults and compare alternatives |
| Optimism Bias | You believe you're less likely to experience negative events | Look at base rates for people like you |
| Overconfidence | You overestimate your knowledge or abilities | Estimate confidence, then halve it |
| Dunning-Kruger Effect | Low competence leads to overconfidence; high competence to underconfidence | Seek external feedback on your actual skill level |
Diagnostic questions: Am I rushing because of pressure or real urgency? What would a neutral observer recommend?
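The "calculate annualized value" countermeasure for hyperbolic discounting can be made concrete with a one-line formula. The sketch below (amounts are hypothetical) converts a "smaller-sooner vs. larger-later" choice into the annual rate of return the delay implies:

```python
def annualized_return(amount_now, amount_later, months):
    """Annualized rate of return implied by waiting `months` to receive
    `amount_later` instead of taking `amount_now` today."""
    return (amount_later / amount_now) ** (12 / months) - 1

# "$100 now or $120 in 3 months?" -- the wait is worth over 100% annualized:
rate = annualized_return(100, 120, 3)
print(f"{rate:.0%}")
```

Seeing the choice as an interest rate rather than two raw amounts makes the immediate-reward pull easier to resist.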
Family 4: What Should We Remember? (Memory Biases)
We only keep a tiny fraction of what we experience — and we distort even that.
| Bias | Signal | Countermeasure |
|---|---|---|
| Peak-End Rule | You judge experiences by their peak and end, not average | Calculate the actual average, not just the highlight |
| Rosy Retrospection | You remember the past as better than it was | Check contemporaneous records or journals |
| Telescoping Effect | You misdate events (recent ones feel more distant, distant ones more recent) | Verify dates from records |
| Suggestibility | You adopt memories suggested by others | Document events independently before discussing |
| False Memory | You remember things that didn't happen | Treat vivid memories with skepticism if uncorroborated |
| Context Effect | Recall depends on the context you're in now | Revisit the original context mentally or physically |
Diagnostic questions: Is this memory or a reconstruction? What would my diary from that time say?
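The peak-end rule's distortion is easy to quantify: compare what memory tends to keep (the most intense moment plus the ending) against the true average. The sketch below uses hypothetical hour-by-hour ratings:

```python
def peak_end_score(ratings):
    """Approximate what memory keeps: the most intense moment and the final one."""
    return (max(ratings, key=abs) + ratings[-1]) / 2

def true_average(ratings):
    """What the experience actually averaged out to."""
    return sum(ratings) / len(ratings)

# Hour-by-hour ratings of a trip (-5 awful .. +5 great), illustrative data:
trip = [1, 2, 2, -4, 1, 5, 4]
print(true_average(trip))    # what actually happened, on average
print(peak_end_score(trip))  # what you are likely to remember
```

Here the remembered score is well above the lived average — a strong finish and one great moment crowd out the mostly mediocre hours.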
Family 5: Social & Group Biases
We evolved to coordinate in groups — but groupthink has its own distortions.
| Bias | Signal | Countermeasure |
|---|---|---|
| Bandwagon Effect | You believe/do things because many others do | Ask: "Would I believe this if nobody else did?" |
| Authority Bias | You attribute greater accuracy to authority figures | Evaluate the claim independently of who said it |
| Halo Effect | One positive trait colors your whole perception | Evaluate traits independently |
| In-Group Favoritism | You favor people in your group | Consciously evaluate out-group members on merits |
| Out-Group Homogeneity | You see out-groups as more alike than they are | Learn individual differences within out-groups |
| Groupthink | Groups prioritize harmony over critical evaluation | Assign a devil's advocate role |
| False Consensus | You overestimate how much others agree with you | Survey a diverse sample anonymously |
| Fundamental Attribution Error | You attribute others' behavior to character but your own to circumstance | Consider situational factors for others too |
| Self-Serving Bias | You attribute successes to yourself, failures to circumstances | Reverse it: what role did luck play in success? |
| Naive Realism | You believe you see reality objectively while others are biased | Accept that your perception is also constructed |
Diagnostic questions: Would I believe this if it came from a stranger? Am I conforming or reasoning?
Family 6: Probability & Statistical Biases
We are terrible at intuiting probabilities — and we consistently get the math wrong.
| Bias | Signal | Countermeasure |
|---|---|---|
| Base Rate Neglect | You ignore general statistical info in favor of specific info | Always start with the base rate |
| Conjunction Fallacy | You believe two conditions together are more likely than one alone | Remember: P(A and B) ≤ P(A) |
| Zero-Risk Bias | You prefer to reduce a small risk to zero over a greater reduction in a larger risk | Compare absolute risk reduction, not relative |
| Survivorship Bias | You focus on successes and ignore failures | Ask: "What happened to the ones who failed?" |
| Neglect of Probability | You treat all non-zero probabilities as similar | Assign rough numerical estimates |
| Law of Small Numbers | You generalize from small samples | Check sample size before drawing conclusions |
| Berkson's Paradox | You misinterpret correlation in selected populations | Consider how the sample was selected |
Diagnostic questions: What are the actual numbers? What is the sample size? What is the base rate?
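Base rate neglect is the one bias in this family where a two-line Bayes calculation beats intuition every time. The sketch below uses the classic rare-condition example (all numbers hypothetical):

```python
def posterior(base_rate, sensitivity, false_positive_rate):
    """P(condition | positive test) via Bayes' rule."""
    true_pos = base_rate * sensitivity                 # has condition, tests positive
    false_pos = (1 - base_rate) * false_positive_rate  # healthy, tests positive anyway
    return true_pos / (true_pos + false_pos)

# A test with 99% sensitivity and a 1% false-positive rate,
# for a condition affecting 1 in 1,000 people:
p = posterior(base_rate=0.001, sensitivity=0.99, false_positive_rate=0.01)
print(f"{p:.1%}")  # far below the intuitive "99% accurate, so probably sick"
```

Because healthy people vastly outnumber affected ones, false positives dominate: a positive result here means only about a 9% chance of having the condition.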
Step 3 — Deep-Dive on Flagged Biases
For each bias family flagged as high-risk in Step 2, go deeper:
- Which specific biases in this family are most likely active?
- What evidence would confirm or disconfirm each?
- What is the direction of distortion? (Overestimate, underestimate, ignore, distort)
- What would a completely neutral person conclude?
Step 4 — Apply Countermeasures
For the top 2–3 most relevant biases, select and apply countermeasures:
- Pre-mortem: Imagine the decision failed and work backward to why
- Red team: Ask someone to argue against your position
- Reference class forecasting: Look at outcomes for similar situations
- Blinding: Remove identifying info when evaluating options
- Checklist: Use a structured checklist rather than intuition
- Sleep on it: Introduce time delay for important decisions
- Algorithm / rule: Use a pre-committed decision rule
- External view: Ask what advice you'd give a friend in the same situation
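Reference class forecasting, in particular, can be reduced to a small computation: place your inside-view estimate among the actual outcomes of similar past cases. The sketch below uses hypothetical project durations:

```python
def outside_view(historical_outcomes, your_estimate):
    """Compare an inside-view estimate against the reference class:
    returns (fraction of past outcomes at or below your estimate,
    median past outcome)."""
    beaten = sum(1 for o in historical_outcomes if o <= your_estimate)
    percentile = beaten / len(historical_outcomes)
    median = sorted(historical_outcomes)[len(historical_outcomes) // 2]
    return percentile, median

# Completion times (weeks) of 10 similar past projects, vs. a 6-week plan:
durations = [8, 12, 9, 15, 7, 11, 20, 10, 9, 14]
pct, med = outside_view(durations, 6)
print(pct, med)  # the 6-week plan beats every past outcome -- a red flag
```

When your estimate sits below everything in the reference class, that is optimism bias talking; the median past outcome is usually the better starting anchor.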
Step 5 — Document and Review
Create a brief bias audit record:
- Decision or judgment analyzed
- Biases flagged (with confidence level: low/medium/high)
- Countermeasures applied
- Revised conclusion (if different from initial intuition)
- Review date: when will you check if the decision was sound?
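The audit record above can be kept as a small structured object so entries stay comparable over time. A minimal sketch, with illustrative field names and values:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BiasAudit:
    """One bias audit record; field names mirror the checklist above."""
    decision: str
    biases_flagged: List[str] = field(default_factory=list)  # e.g. "anchoring (medium)"
    countermeasures: List[str] = field(default_factory=list)
    revised_conclusion: str = ""
    review_date: str = ""  # when to check whether the decision held up

audit = BiasAudit(
    decision="Accept the job offer?",
    biases_flagged=["status quo bias (high)", "anchoring (medium)"],
    countermeasures=["pre-mortem", "external view"],
    revised_conclusion="Evaluate the total package, not salary alone",
    review_date="2026-01-15",
)
```

A plain text or spreadsheet version works just as well; what matters is recording the flagged biases and confidence levels before the outcome is known, so hindsight bias cannot rewrite the audit later.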
Safety & Compliance
- This skill provides educational awareness of cognitive biases; it does not claim to eliminate them
- No psychological diagnosis or treatment is offered
- For decisions with major consequences (health, legal, financial), recommend consulting professionals in addition to bias-checking
- Biases are normal human features, not personal flaws — the tone should be compassionate, not shaming
- Some biases conflict with each other in practice; the skill helps users navigate trade-offs, not find single "right" answers
Acceptance Criteria
- At least 40 biases are cataloged across 6 families
- Each bias includes: name, signal (how to detect it), and countermeasure
- Diagnostic questions are provided for each family
- At least 8 de-biasing techniques are available
- The workflow adapts to stakes level and time available
- The tone is educational and non-judgmental
- At least 3 examples show bias detection in real scenarios
- Documentation template is provided for bias audits
Examples
Example 1: Job Offer Decision
User says: "I got a job offer with a higher salary but I'm worried about leaving my current team."
Skill guides:
- Context: High-stakes career decision, medium time available
- Flagged biases:
- Status quo bias (preferring current situation)
- Sunk cost fallacy (investment in current relationships)
- Loss aversion (fear of losing current stability)
- Anchoring (fixating on the salary number)
- Optimism bias (assuming new role will be great)
- Countermeasures:
- Pre-mortem: Imagine staying and being unhappy in 2 years — why?
- External view: What would you advise a friend with this offer?
- Reference class: What happened to others who made similar moves?
- Sleep on it: Wait 48 hours before responding
- Revised analysis: Evaluate total compensation, growth trajectory, cultural fit, and risk tolerance separately from the anchor of salary alone
Example 2: Investment Decision
User says: "My friend made a lot on crypto. Should I invest too?"
Skill guides:
- Context: Financial decision, high risk of multiple biases
- Flagged biases:
- Bandwagon effect (because friend did it)
- Availability heuristic (recent success stories are vivid)
- Survivorship bias (not hearing from those who lost)
- Optimism bias (believing you'll succeed where others might not)
- Overconfidence (overestimating your understanding of the asset)
- Countermeasures:
- Red team: Argue why this is a terrible idea
- Reference class: What percentage of crypto investors lose money?
- Blinding: Evaluate the asset without knowing friend's outcome
- Safety: Redirect to professional financial advisor for investment advice
Example 3: Disagreement with a Partner
User says: "My partner and I keep arguing about the same thing. I know I'm right."
Skill guides:
- Context: Interpersonal conflict, recurring pattern
- Flagged biases:
- Confirmation bias (only remembering evidence supporting your view)
- Fundamental attribution error (attributing partner's view to character)
- Naive realism (believing you see objectively, they are biased)
- Self-serving bias (framing the conflict in ways that favor you)
- False consensus (assuming neutral observers would agree with you)
- Countermeasures:
- Red team: Write the strongest version of your partner's argument
- External view: What would a couples counselor say?
- Blind review: Summarize the dispute as if describing two strangers
- Revised approach: Shift from "who is right" to "what is the shared goal"