Sage Solutions
Advice and insight about personal growth, personal development, and becoming your best self.
Cognitive Biases: Part 1
Ever feel certain you’re right… until the facts don’t move the needle? We unpack the hidden architecture of your mind—the cognitive biases that color every argument, purchase, scroll, and self-judgment—and show how small shifts in awareness can change everything. Rather than treating your brain like a camera or computer, we reframe it as a painter, filling in gaps with familiar colors: confirmation bias and my-side bias select the evidence that flatters us, while desirability bias makes wishful thinking feel like truth.
We dive into confident humility by naming the traps that inflate or deflate our self-perception. The spotlight effect fuels insecurity by convincing us everyone notices our flaws. The objectivity illusion whispers that we’re less biased than “those people.” And the Dunning–Kruger effect explains why knowing a little can make us dangerously sure, while deeper expertise restores healthy doubt. You’ll learn how to stress-test your certainty, invite disconfirming evidence, and keep curiosity alive without surrendering conviction.
Fear-based biases also skew our map of reality. Negativity bias makes bad news sticky; declinism and conservatism bias keep us nostalgic for a past that never fully existed. Then there’s the anchoring bias, where first numbers and first impressions frame value and character, and pessimism aversion—the ostrich move—where we avoid hard truths about money, health, or the planet. We translate these concepts into action with a one-week game: label biases as they arise, question anchors before deciding, and turn overwhelming headlines into concrete, right-sized steps.
By the end, you’ll have a practical toolkit to spot distortions, lower the temperature of debates, and make cleaner choices under uncertainty. Share this with a friend who loves mindset upgrades, and if it helped you think more clearly, subscribe, leave a review, and tell us: which bias did you catch first?
We would love to hear your feedback! Click here to tell us what you think.
https://sagesolutions.buzzsprout.com
If you are interested in one-on-one coaching, email us at:
sagecoachingsolutions@gmail.com
**Legal Disclaimer**
The Sage Solutions Podcast and content posted by David Sage is presented solely for general informational, educational, and entertainment purposes. No coaching client relationship is formed by listening to this podcast. No legal, medical, or financial advice is being given. The use of information on this podcast or materials linked from this podcast or website is at the user's own risk. It is not intended as a substitute for the advice, diagnosis, or treatment of a psychotherapist, physician, professional coach, lawyer, or other qualified professional. Users should not disregard or delay in obtaining medical advice for any medical or mental health condition they may have and should seek the assistance of their healthcare professionals for any such conditions. The opinions of guests are their own and may not necessarily reflect the opinions of the podcast.
Welcome to the Sage Solutions podcast, where we talk about all things personal growth, personal development, and becoming your best self. My name is David Sage, and I am a self-worth and confidence coach with Sage Coaching Solutions. I want to start with a question, but I'm so glad that you're here today. Have you ever been in an argument with someone, maybe a partner, a coworker, or just someone in the comment section on Facebook or YouTube? And you find yourself thinking, how can they possibly not see the facts? Like the evidence is right there. You look at them and they seem like they're just delusional. Now this might be a little extreme, but we all get caught up in our own perspective. And sometimes it truly feels that way. But the uncomfortable truth that I want us to wrestle with today is that a lot of times, they are likely looking at you and thinking the exact same thing. We like to think that our brains are like cameras. We think that we just point our eyes at the world, record the footage, and store the data. And that the way we think is always logical and computational, like a computer or a calculator. But that's just not how human psychology works. Your brain isn't a camera. And it's not a computer. In a lot of ways, it's more of a painter. Remember your perspective of reality is the largest driver of your experience of reality, dramatically changing how you see the world and interpret events. So your brain is often this painter, and it's almost constantly painting over the world, over reality, in little cognitive shortcuts or biases that helped us survive as a species, and not just helped you survive, but made you feel safer, smarter, and more consistent than you actually are. This episode has been a long time coming. Today we are doing a deep dive into the operating system of your mind. We are talking about cognitive biases. 
But before we get into it, our goal with this podcast is to share free, helpful tools with you and anyone you know who is looking to improve their life. So take action, subscribe, and share this podcast with them. Okay, now don't tune out. I know this might sound like a boring psychology 101 lecture, but I've come to believe that understanding and gaining awareness of our cognitive biases is one of the best possible ways to try and view things as objectively as you can. Awareness helps you confront them. Some people may say ignorance is bliss, but in this situation I disagree. If you don't know the glitches in your own software, you can't run your own programs effectively. Now there are too many cognitive biases for me to put all of them in one episode, so I'm going to be doing a part two to this episode. But today we're going to cover ten major cognitive biases. Biases that we all have built into us as human beings. We're going to talk about why you probably think you're a better driver than you actually are. Why you think the world is getting worse when it might not be. And why the first price you see on a car determines everything. So take a deep breath and let's do our best to get objective. Part one. Let's start with the biases that are designed to protect your ego and your worldview. I'm grouping these up and calling them the ego protection squad to make it easier to remember. The captain of this squad is the one you've likely heard of, confirmation bias. Now, you know how this works. You have a belief. Let's say you believe that, I don't know, waking up at 4 a.m. is the only way to be successful. Because of confirmation bias, you will subconsciously scour the internet and your daily life for any and all information that supports that belief. It also means that you're much more likely to ignore any information that goes counter to that belief. So let's say somebody then challenges you and says, no, you don't have to get up at 4 a.m.
and provides a counterexample of somebody who's very successful that doesn't. You decide to scour the internet to see what it really says. But because of the confirmation bias, as you're doing your research, you're going to subconsciously pick out and internalize every article that supports what you already believe. And at the same time, you'll mostly ignore the hundreds of articles about the importance of sleep, or examples of very effective CEOs who wake up at eight. This automatic filtering of reality, just to prove yourself right, that you're not even doing consciously. That is confirmation bias. There's also a specific flavor of this confirmation bias called the my side bias. While confirmation bias is a bit more general, the my side bias is specifically about how we evaluate arguments. There is a strong consensus across studies showing that we evaluate information differently because of the my side bias, especially when it comes to politics. Now I'm not looking to make a political statement here, and I'm not taking either side. I'm just using politics as an example because it's highly emotionally charged and very polarizing. Based on this scientific consensus, if I put a study in front of you that supports your political belief, you will likely accept it immediately, not questioning inconsistencies or errors or anything that could make it not true. We see this all the time, where smart, analytical people will compliment the study, highlighting its solid sample size and great methodology, overlooking any flaws in the study because it confirms what they already believe. But if I put a study in front of you that contradicts your political view, you're likely to reject it outright and say, well, there must be something wrong with it, and then start hyperfixating on the issues with it and trying to find problems with the study. We then see smart, analytical people do the exact opposite.
And they'll start tearing it apart, saying, Oh, the sample size is too small, and the researchers were biased, even though the earlier study that confirmed their view may have had an identical sample size and methodology. Because of confirmation bias and my side bias, we don't naturally evaluate data based on the data. We evaluate it based on whose side it helps. And in comes another cognitive bias that's basically holding hands with these other two. It's called the desirability bias. This is simply the tendency to believe something is true just because you want it to be true. Think about a relationship that is clearly failing. All the red flags are there. But you want the relationship to work. You want it so badly that your brain actually filters out the red flags. You don't even see them. You predict a positive outcome because the negative outcome is too painful to look at and you just don't want to believe it. If we think back to the last week of our lives, I'm sure we can find some places where any or all of these cognitive biases apply. Where did you accept a fact just because it felt good? That's desirability bias at work. It's comfortable. It keeps us stuck. And many times these compound because we often believe what we want to believe. And once we've identified with a side, all three of these biases are at play at the same time, making it very hard to change a belief, or even just rethink one when new information is provided. This group of biases has an insidious spiral effect, causing us to be less open to new things and restricting our learning to what feels comfortable and what we already believe. By being aware of these things, we can consciously shift our perspective and try to be more open to contradicting viewpoints. This allows us to become much more well-balanced, to think in more shades of gray, to be lifelong learners, to embrace wisdom and wonder through curiosity and critical thinking.
Before I get into the next cluster of cognitive biases, there's one that kind of falls in between them and provides a little bit of a shade of gray with some added complexity by showing that multiple things can be true at the same time, and sometimes those things are contradictory. And with that, we have the spotlight effect. The spotlight effect is a cognitive bias where people overestimate how much others notice their appearance or behavior. This is driven by a form of egocentrism. People place too much weight on their own perspective, which makes them feel like they're under a constant spotlight, when they're often not even the focus of others' attention. This really arises from the overarching egocentric bias, which is basically that we are all caught up in our own perspective about our own life. It makes us naturally imagine things from that perspective and that people would notice the same things about ourselves that we do. Now this can go one of two ways. The most common way is actually a major driver of self-consciousness and insecurity. It's when we hyperfixate on one little part of our hair being wrong, or some specific piece of our body that we really don't like, when in reality most other people don't even notice that, but we assume everyone does, just like we do, because if we see it, how could they not? As a confidence coach, I think a lot about these and how to mitigate these while still teaching people how to have true confidence. True confidence is not about being amazing at everything. True confidence is a combination of confidence and humility, of self-awareness, self-improvement, and self-acceptance, of self-compassion and social awareness. We're trying to build the powerful effects of being confident and happy with who you are. We're trying to fill up your inner cup and help you realize that you are enough and that you do deserve to feel good and have confidence in who you are. 
But that doesn't mean that you're amazing at everything and that everyone thinks you're the shit. Excuse my language. But you're just one person. I'm just one person. That's why we need to balance it with humility. And these competing yet coexisting cognitive biases can drive us to be both overconfident in some areas and insecure in others. Most of the time that you're hyper-fixating on something, it's really just the spotlight effect in effect. And at the same time, the spotlight effect goes the other way too. We often walk into a room and assume that everyone is turning their heads to look. We're not that important, and that's okay. This next group of cognitive biases, I'm gonna call the ego enhancement squad. These cognitive biases can be the drivers of big egos, arrogance, and overconfidence. I mean, after all, I'm the smartest guy in the room. I also happen to be in a room with just me right now. These biases are where we overestimate our own competence. We're gonna start with something called the objectivity illusion, also known as the objectivity bias. This is one of my favorite biases because it's one of the funniest. It is the belief that you are less biased than other people. I've also heard it referred to as the I'm not biased bias. It often leads to a belief that you perceive reality objectively and without bias, and that anyone who disagrees with you must be biased, ignorant, or irrational. It's the I'm the only sane person here feeling. Not that that's never true, but are you really that special? If you walk around thinking that everyone else is crazy, you might be suffering from the objectivity bias. This leads us right into the heavy hitter, the Dunning-Kruger effect. Now you may have seen the memes, but let's really define it. Dunning-Kruger isn't just about dumb people who think that they're smart. It's that when you have very low competence in a specific area, you lack the ability to recognize how bad you are at it.
The Dunning-Kruger effect shows that people with no competence also have no confidence in their ability to do that thing. But as the graph progresses and somebody gains just a little bit of knowledge and competence, enough that they feel like they know significantly more than they did before, their confidence in their own knowledge and ability skyrockets well beyond where their actual competence is. This can often lead very dumb people who know a little about something to think that they're incredibly intelligent and competent at it, hence all the memes. But as the chart goes on and your knowledge increases, you have a much better understanding of the complexity of the thing, of the skill that it actually takes. So your competence increases, but your confidence actually decreases dramatically. You now understand how much you don't know. And then from that point, your competence and confidence slowly grow at a similar rate. But the Dunning-Kruger effect does show that at stage two you are overconfident for your competence, and at stage three you tend to be underconfident for your competence. When you go from knowing nothing to knowing a little, you feel like you know so much more, and that confidence spikes. It's when you have that very low competence in a specific area that you lack the ability to recognize how bad you still are at it. It's the guy who reads two articles on macroeconomics and thinks he can fix the national debt. Yet oftentimes the same guy is struggling with his own finances, and he doesn't know enough to know how much he doesn't know. As you actually get smarter at something, your confidence usually drops because you realize, holy crap, this is way more complicated than I thought. If you are 100% certain about a complex topic, you should probably check yourself. You might be on the peak of what people call Mount Stupid. That's what they call the early spike of confidence in the Dunning-Kruger chart.
So this leads directly into the overconfidence bias and its sub-bias, the illusory superiority effect. The overconfidence bias is the tendency to overestimate your own abilities, your knowledge, or your control, leading to a skewed perception of your own skills and your performance relative to reality. Have you ever done trivia and had something pop into your head that you then decided was the answer, and argued that you knew it with absolute certainty? You heard something about it once, one time, but you were certain, and then it ended up being wrong. That's the overconfidence bias at play. It's a belief that we're better or smarter or more knowledgeable than we actually are, and it's usually characterized by overestimation, overprecision, and finally overplacement, which leads us to the illusory superiority effect. Here's a classic statistic. If you ask a room full of people, are you an above average driver? About eighty to ninety percent of people will raise their hands. Now, mathematically, that's impossible. We can't all be above average, that's literally not how averages work. And we do this with everything. We think we are more ethical, smarter, and more logical than the average person. Now we may not think that we are the smartest, but we've got to be at least above average, right? This illusion of superiority prevents us from asking for help. It prevents us from learning. Because why would you learn if you're already better than everyone else, or at least above average? And not only that, I'm more objective than the average person, too. But by understanding your cognitive biases and becoming more aware of them, including the objectivity bias, you may actually become more objective. But these biases are hardwired. You will never, I repeat never, be fully objective. That is something that we all just have to accept. It's reality, it's human nature. So ask yourself this.
In what area of your life are you coasting because you think you're naturally gifted? When maybe you actually need to put in the work. Alright, we're about halfway through. Let's move on to the cognitive biases of fear and the past. We've already covered the ones surrounding ego and arrogance. Now it's time to talk about fear. While the human brain might want to protect your ego, it is not naturally designed to make you happy. It was designed to keep you alive. And because of that, we have a massive negativity bias. Rick Hanson, a highly respected psychologist, says that the brain is Velcro for bad experiences and Teflon for good ones. You could have ten people compliment your outfit today, and just one person say, Man, you look tired. And what are you going to think about when you're trying to fall asleep tonight? Most likely it's that one negative comment. We give far more weight to negative events and experiences than positive ones. Evolutionarily, this makes sense. Ignoring a sunset is fine. Ignoring a tiger is fatal. But in the modern world, this bias leads to anxiety and stress. Studies have shown that, on average, we will tell about thirty-three different people about a negative event, and maybe three about a positive event of similar weight. There's also data showing that we give twice the amount of attention and weight to negative news compared to neutral, and only half the amount of weight to positive news. Compound this with the fact that 80% of news stories are negative in nature because outlets know these cognitive biases, and you can see how this spirals out of control. Now this feeds into a related but separate bias called the declinism bias. This is the belief that the past was better than the present and that the future is going to be worse. Phrases like kids these days. Ugh, music used to be better. Society is collapsing. We've believed society is collapsing for 2,000 years. The Romans wrote about how the youth were ruining society.
Declinism happens because we forget the pain of the past and we obsess over the problems of the present. This also causes us to project a linear increase in the problems of the present into the future. In most cases, when you look at the data, human well-being and quality of life have only gotten better over time. This is the whole premise behind the book The Rational Optimist. This bias stops us from seeing the opportunities right in front of us because we're too busy looking in the rearview mirror and worrying about the future. Another closely related bias is the conservatism bias. Now I don't mean politics here. In psychology, the conservatism bias is the tendency to insufficiently revise our beliefs when presented with new evidence. It's a form of mental inertia. Let's say you read one article saying that a specific diet is bad. Then three new high-quality studies come out showing that it's actually quite healthy. The conservatism bias is the drag that makes you say, eh, I'm still not sure. We're gonna stick with the old information because new information feels risky. This causes us to subconsciously overweigh prior information and underweigh new evidence. It keeps us living in the past, running old software on this new computer. And when you put them all together, it leaves us living in fear. Alright, we're coming down the home stretch here. I have two more for you, and these are huge for decision making. First, we have the anchoring bias. Somewhat similar to the conservatism bias, it's the tendency to rely too heavily on the very first piece of information you receive, or the anchor. You walk into a store, you see a jacket for $500, and you think, that is insane. Then you see a second jacket for $200, and you think, wow, that's a deal. You only think $200 is a deal because you were anchored by that first $500 price tag. If you had walked in and the first jacket you saw was $50, you'd think the $200 jacket was a ripoff.
We often do this in negotiations, in salary talks, even in how we judge people. First impressions are essentially character anchors. If someone is rude to you the first time you meet, that is the anchor. They then have to work ten times as hard to move you away from that initial data point. And finally, I want to talk about what seems like a contradictory bias, the pessimism aversion bias. Now this sounds like the opposite of the negativity bias and also kind of the declinism bias. But it's different. Negativity bias is noticing threats and giving extra weight to negative information. Pessimism aversion, sometimes called the ostrich effect, is when we actively avoid looking at information that might make us feel pessimistic. When we hear something debilitatingly negative, especially something we have no control over, we protect ourselves from the pain of it by avoiding that negative information, sticking our head in the sand, and pretending it's not real. An example would be hearing that we all might die from something like global warming or the AI alignment problem. Our gut reaction is to reject those claims outright and stick our head in the sand. But the ostrich effect is not just for those incredibly overwhelming examples. It shows up in the everyday ones too. This is not checking your bank account balance because you know it's low. It's not going to the doctor to check out that weird pain because you're afraid of the diagnosis. It's the bias that makes us think that if we don't see the bad news, it doesn't exist. But it's time to face the brutal truth, the facts. You cannot fix what you refuse to be aware of. Okay, so I admit there was a lot there. We basically just ran a diagnostic on your brain's biases, or at least some of them. We talked about how we filter for what we want to hear, confirmation and desirability biases.
We talked about how we think we are smarter than we are, the Dunning-Kruger effect and the overconfidence bias. We talked about how we cling to the past, declinism and conservatism. We talked about how we give extra weight to negative information, negativity bias. We talked about the spotlight effect and how it amplifies some of our biggest insecurities. And finally, we talked about how we consciously avoid negative information when it feels overwhelming, with the ostrich effect, and how we get stuck on the first impression with anchoring. So what do we do with this? How do we actually use it? You cannot just flip a switch and turn these off. You are human. But you can catch them. For the next week, I want you to play a game. I'm gonna play this game with you. I want you to try and catch yourself in just one of these. When you're doom scrolling and thinking that the world is ending, say to yourself, Ah, this is declinism and the negativity bias. When you're absolutely sure you were right in an argument, ask yourself, Am I suffering from confirmation bias, desirability bias, or the objectivity bias? When you see a price tag, ask, hold on, am I being anchored? The moment you name the bias and understand it, you take away its power. The awareness of these biases helps us combat them so that we can be more objective and more aware of the forces shaping our behavior. So that's what I have to offer you today. Awareness. It's not about having a perfect brain. It's about knowing how to drive the imperfect one that you have. If you enjoyed this episode, feel free to share it with somebody else. But try not to use it to call out somebody's bias. That's not gonna work. And remember: you are enough, and you deserve to fill up your inner cup with happiness, true confidence, and resilience. Thank you for listening to the Sage Solutions podcast. Your time is valuable, and I'm so glad that you choose to learn and grow here with me.
If you haven't already, don't forget to subscribe so you don't miss out on more Sage advice. One last thing, the legal language. This podcast is for educational and informational purposes only. No coaching client relationship is formed. It is not intended as a substitute for the personalized advice of a physician, professional coach, psychotherapist, or other qualified professional.