HOW OUR BRAIN IS WIRED TO FOOL US! 

By John M. Dahlen 

It’s not uncommon for us to catch ourselves fooling ourselves from time to time. But did you know our brain is wired to fool us virtually all of the time? It’s been said that our brain is bombarded with millions of bits of sensory input every second of every day. Pause to consider all of the sights, sounds, smells, and physical sensations pouring in to be processed, even as our brain keeps us breathing, keeps our heart beating, and keeps all of our other activities running smoothly, and it’s a miracle just how much the human brain is capable of. And we haven’t even touched on the complexities of social interactions, creative and recreational pursuits, flying into Sun ’n Fun or Oshkosh, and so much more!

How does our brain do it all? Well, it doesn’t! Not yours, not mine, and not anyone else’s either. As magnificent as the human brain is, it is not humanly possible for it to process and accurately use the massive volume of tasks and inputs it is challenged with every minute of every day. So it is designed to cut corners and take shortcuts all the time. One set of shortcuts our brain uses, usually without our even knowing it, is called cognitive biases. Cognitive biases are shortcuts and assumptions our brain makes continually as part of managing all of its responsibilities. These subconscious biases are one way the brain trades accuracy for efficiency in our everyday decision-making. Knowing a bit about how they work can help guide us in our day-to-day decisions, in our aviation activities, and in life in general. In this article, I will introduce three of the most common cognitive biases, explain a bit about how each works, and offer suggestions for managing them instead of letting them manage you. Here we go!

Availability Bias 

One of the ways the brain takes a shortcut in decision-making is to judge the probability of an event or outcome based on whichever memories come to mind most readily. This is called an availability bias. It greatly reduces the brain’s decision-making workload compared to the effort of gathering factual data and calculating the statistical probability of an outcome before every decision. This bias frequently leads us to overestimate the likelihood that an event will occur (or not occur). As a result of this subconscious thinking shortcut, we are likely to decide whether or not to take an action based on inaccurately biased perceptions.

Watching the news media and social media platforms provide saturation coverage of every major aviation crash leads many people to conclude that such events happen far more frequently than they actually do. In response to the handful of tragic major aviation crashes that occurred in rapid succession as 2024 ended and 2025 began, I have come across people who say they simply aren’t going to fly anymore until it’s safe again. While this could be great news for Amtrak in the lower 48 at a time when Amtrak could use some good news, such a personal travel decision is a good example of decision-making driven by an availability bias: the false but usually involuntary perception that it is no longer safe to fly because, based on the most readily available memories at the moment, the perceived risk of being in a fatal crash now exceeds the benefits of flying.

Because most of our readers are intimately engaged in general aviation and other segments of the aviation industry, you already know the error in such logic. But the current public discomfort regarding aviation safety presents us with fresh opportunities to educate and to offer facts and memories that people can use to regain a more reassuring perspective and offset their availability bias about aviation safety. So here are the most current statistics I have found. They follow the same general trends we have all known for a long time, but it is timely to make them into new memories again. A report released by the National Safety Council in mid-March 2025 estimated 44,680 deaths in preventable traffic crashes in the US in 2024. That works out to an average of 122 deaths per day, every day last year, on America’s roads. In comparison, the Aviation Safety Network’s database, provided by the Flight Safety Foundation, documents a total of 317 aviation deaths in the USA in 2024, an average of less than one per day. Frankly, when I was earning my Psychology degree at UAA years ago, statistical math wasn’t my strong suit. (I was much more enthused by my neuro-psych and research classes, in case you can’t tell! But I digress.) The point is that I have not reduced these statistics to a common denominator (e.g., deaths per mile traveled, or per hour traveled) for a true apples-to-apples comparison. Even so, the disparity between the two sets of data amply illustrates the continuing status of aviation as the safest mode of transportation available. And it certainly offers a possible antidote for people currently experiencing an exaggerated availability bias against flying. How might a different subconscious availability bias be influencing your decision-making in your work as a pilot, mechanic, or student, or in other important areas of your life?
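For readers who like to check the arithmetic behind those per-day averages, here is a minimal sketch using the figures quoted above (2024 was a leap year, so I divide by 366; the rounded results match the numbers cited):

```python
# Quick check of the per-day averages quoted above.
# Figures: NSC estimate of US preventable traffic deaths (2024),
# and Aviation Safety Network count of US aviation deaths (2024).
traffic_deaths_2024 = 44_680
aviation_deaths_2024 = 317
days_in_2024 = 366  # 2024 was a leap year

traffic_per_day = traffic_deaths_2024 / days_in_2024
aviation_per_day = aviation_deaths_2024 / days_in_2024

print(f"Traffic deaths per day:  {traffic_per_day:.1f}")   # roughly 122 per day
print(f"Aviation deaths per day: {aviation_per_day:.2f}")  # less than 1 per day
```

As noted above, these are raw per-day counts, not rates normalized per mile or per hour traveled, so they are illustrative rather than a true apples-to-apples comparison.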

Inattentional Bias

Another important cognitive bias to continually manage and minimize is inattentional bias. In its simplest terms, this refers to the brain locking onto one attention-grabbing item and becoming inattentive to everything else. In a crisis, or a suddenly unfamiliar situation with potentially serious outcomes, the human brain simply cannot process the mass of incoming information fast enough or clearly enough, so it subconsciously filters out and ignores much of it. Fortunately, we can train our brain to be more selective about what it chooses to shed when it is load-shedding in response to a crisis. Virtually every one of us pilots was taught from day one that, no matter what, the first task is to keep flying the plane until it’s on the ground. Failure to do so guarantees that you will just be the first witness to arrive at the scene of your own crash! By using familiar mantras like “Aviate, navigate, then communicate,” by practicing stall and spin recoveries repeatedly and periodically, and by staying sharp on all of the other skills and mnemonics we work to permanently etch into memory as we continually improve our flying, we are also disciplining our brain to retain the most critical information during a crisis while it temporarily load-sheds the rest. This is, quite literally, a human survival skill. One of the best-known examples of how quickly an inattentional bias can become deadly to a pilot is the tragic story of the Lockheed L-1011 operating as Eastern Air Lines Flight 401 from JFK to MIA on the night of December 29, 1972. Approaching MIA, the nose landing gear position indicator light failed to illuminate green. The flight was redirected out over the Everglades, where the crew could work in open airspace at 2,000 feet to diagnose and, they hoped, correct the problem. Soon all three crew members became fixated on the problem, and no one remained attentive to flying the plane.
As a result, none of them noticed when the Captain inadvertently bumped the yoke, which disengaged the autopilot’s altitude hold. The L-1011 began an uncommanded slow descent until it crashed into the Everglades, killing 101 of the 176 souls on board. This tragedy, and others in the 1970s, led to the advent of crew resource management (CRM) training. Today there are even CRM training programs for solo pilots, to help them stay organized and prepared to manage available resources, including checklists and passengers, effectively during a crisis. Being mentally prepared with a clear sense of priorities in a crisis is the best antidote to prevent or reduce the impact of inattentional bias.

Confirmation Bias

Since the concept of cognitive biases was first recognized in the 1970s, researchers have identified nearly 200 different kinds of these subconscious ways our brain keeps our workload manageable, if not always precisely accurate. Time, space, and the reader’s attention span preclude me from going into all of them! But one more very common one worth discussing is confirmation bias. The human brain generally leads us to gravitate to, and believe, information that confirms what we already believe. This type of cognitive bias has become far more visible, and far more impactful on society, with the evolution of social media. One need only read the opinions of all the self-anointed experts, and the supporting and opposing responses from other self-anointed experts, in the comments section of any posted topic to see confirmation bias in action. Both sides offer up “facts” they have found to support their point of view, facts that frequently are inaccurate or even blatantly false! The descriptive phrase my Dad frequently used comes to mind: “Often wrong, but never in doubt!” But shift gears now and think carefully about ways your own confirmation biases about how you fly might create serious consequences for you when flight planning, flying, or even just listening to valuable wisdom a flight instructor is trying to share with you. The biggest risk of an unmanaged confirmation bias is that you might someday discard potentially life-changing, or even life-saving, information simply because it doesn’t seem to align with how you see things, without even looking for ways to make it fit. The best antidote I know for managing one’s confirmation bias is to (1) recognize and acknowledge that we all have confirmation biases that help keep our cognitive workload manageable; (2)
actively seek out a more detailed factual understanding of any information that doesn’t mesh well with your confirmation-biased picture of whatever you’re working on (e.g., flight planning, wx briefings, preflight observations, and the sights, sounds, smells, and feel of everything while in flight); and (3) as my Mom used to say when all the pieces of a puzzle in life weren’t fitting together well, “Remember, there’s more than one way to skin a cat!” Manage your confirmation bias rather than allowing it to manage you. Someday, doing so just might allow you to see an overlooked detail that could save your life. Blue skies and tailwinds to all!

John Dahlen is a life member of the Alaska Airmen’s Association, the US/Russia Liaison for the currently suspended AK2RU program, a long-time regular writer for the Transponder, and a long-time volunteer for the Association. Please feel welcome to contact him at OneAlaskanGuy@hotmail.com with any comments on this story and any suggestions for other topics you would like to see covered.
