We tend to think of our brains as infallible logical machines with perfect memory and absolute rationality. We couldn't be more wrong: it doesn't matter how educated or intelligent we are, there are inherent flaws in our brains we just can't get rid of.
This has nothing to do with your education or capacity for rational thinking; it's just the legacy of useful brain shortcuts that helped us survive for most of our evolutionary history.
Things like running away from a branch we mistook for a snake, the need to belong to and protect our tribe, and all sorts of hasty generalizations saved our skins for a really long time.
Some of these reactions work against us in the modern world. They lead us to make wrong assumptions, reach wrong conclusions and jump to the defense of imaginary tribes (EMACS fucking rocks, anyone?), among many other unhelpful behaviors.
We might not be able to get rid of these flaws, but at least we can identify them and take action. For that, the first thing you need is to become familiar with the bugs in your brain.
This article will discuss some of the most basic ones, but there are many books dedicated to this topic, so go and check them too.
Cool, let's get started!
A cognitive bias is a sort of mental bug that affects the way we make decisions, remember or perceive things, and form judgments. They are the result of flaws in our perception or cognitive processes that cause us to make irrational choices.
There are lots and lots of types of cognitive biases, but all of them have something in common: they prevent you from making rational choices or objectively perceiving reality. These are some examples of common cognitive biases:
- Confirmation bias: The tendency to focus on finding information that confirms your perception while ignoring information that contradicts it.
- Authority bias: When you give more weight to the opinion of an authority figure without considering the content of the opinion itself.
- Hindsight bias: The tendency to see events in the past as predictable at the time they happened, or as I like to call it "Everything is obvious in hindsight".
- Not invented here: An aversion to using products, techniques or ideas invented elsewhere.
- IKEA effect: A tendency to give a disproportionate amount of value to things you assembled or built yourself, aka "the software I write is the best".
- Naïve realism: The belief that we see reality as it is, that all rational people will agree with us and those who don't are either irrational or lazy.
- Declinism: The tendency to see the past favorably and the future negatively, or "It was all better back in the day".
- Framing effect: Drawing different conclusions from the same information depending on how it's presented.
- Illusion of control: The tendency to overestimate the control you have over external events.
- False memory: A very common bias where you mistake something you imagined for an actual memory.
You might not be able to get rid of all your cognitive biases. Still, becoming familiar with them might help you realize when you are under the effect of such biases and take corrective actions. Here's a very cool list of cognitive biases. I had a lot of fun reading them and thought you might also enjoy taking a look.
Rare doesn't mean impossible
We tend to grossly underestimate the chances of rare events happening, to the point where we hear the word "unlikely" and replace it with "impossible". The truth is, extremely rare phenomena and events happen every day, and they might happen to you.
If you think something that can go wrong with your system is very unlikely, don't rule it out as impossible. You can safely ignore such scenarios fairly often, but if the consequences are catastrophic you need to write a plan and handle them deliberately.
'Impossible things' happen, and when they do it's usually very, very expensive.
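To see how "unlikely" compounds at scale, here's a minimal back-of-the-envelope sketch. The failure probability and traffic numbers are hypothetical, purely for illustration:

```python
# A "one in a million" per-request failure, assumed for illustration.
p = 1e-6

# Hypothetical service traffic: a million requests per day.
requests_per_day = 1_000_000

# Chance that the failure happens at least once in a day:
# the complement of it never happening across all requests.
p_at_least_once = 1 - (1 - p) ** requests_per_day

print(f"{p_at_least_once:.2%}")  # roughly a 63% chance per day
```

At that volume, the "one in a million" event is more likely than not to hit you every single day, which is why rare-but-catastrophic scenarios deserve an explicit plan.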
Fight your urge for closure
This one is particularly strong with programmers, and it's one of the reasons many movements and techniques (like the Agile movement) were created.
The thing goes like this: you start a new project, and a sudden urge for BUFD (big up-front design) invades your brain. After a lot of time spent designing every single abstraction, class, and method, you start writing code only to find out that... your design doesn't work.
The beginning of a project is the moment when you know the least about the problem you are trying to solve; it's the part with the most uncertainty about actual requirements. Our need for closure usually pushes us to waste a lot of time creating designs we won't ever use.
This doesn't mean that designing at the beginning is wrong, but the design should be used as a guide, not as a rigid plan that drives the project to its end. After the initial work, you will gain valuable knowledge about the problem you are trying to solve. This will let you create an even better design.
So remember, fight your need for closure and delay design decisions until you have enough information about the problem you are trying to solve.
Your memory is imperfect
Our memories are imperfect recollections of how we experienced events (and not the way they happened) and have an unfortunate tendency to change after a while. Often, they are also not real memories but just things we imagined (see False Memory on the list of cognitive biases).
Sometimes you can't really trust your memory. If you need to remember something, write it down in a place you can check later.
Oh, and check this fascinating TED talk by Elizabeth Loftus.
The lizard inside us
Our brain didn't evolve in a single streak. Instead, it evolved outward in a layered fashion.
Our consciousness and fancy human capabilities are just the outer layer, a relatively new addition to our bodies. Beneath it, there are layers and structures that support most of the unconscious functions our brain performs.
Buried beneath our awesome modern ape consciousness lies a vast network of neurons that supports some of our most important primal behaviors and reactions. This is what we call our reptilian brain. It provided our ancestors with useful reactions that helped them survive and thrive.
And yes, as you might imagine, many of these reactions can be counterproductive in modern settings. Fight-or-flight responses, hasty behavior, territoriality, resource hoarding (information, office snacks) and tribalism are all useful in the savannah, but work against you in a professional setting.
Every time you exhibit one of these behaviors ask yourself: Is it me, the handsome Homo Sapiens, or is it just that little lizard inside of me?
Are you sure?
This is one I learned the hard way: if you find yourself too sure about something, test yourself!
When I am too confident about a fact or idea, I try to take a step back and test my assumptions. Often I find that I was pretty close to the right answer, but with alarming frequency I also uncover wrong assumptions or flaky logic.
This is fine: the best moment to correct something is at the beginning, so watching out for mistakes (especially when you feel too sure about something) is a healthy habit.
No free lunch, buddy
There is no free lunch, and almost every choice you make involves a tradeoff. Sometimes we get a bit too enthusiastic about a technology or methodology, so much so that we overlook the tradeoffs involved in using it.
Think of the typical talk that goes around in tech circles:
- SQL vs NoSQL
- Monoliths vs Microservices
- OOP vs FP
- SPAs vs Server-rendered web apps
No matter which of these you pick, there is a right context for every option. Each has advantages and disadvantages that you should weigh when deciding the right fit for your particular problem or project.
There is not much to do here, just remember that there are no free lunches, and almost every choice involves a tradeoff.
It's a matter of awareness
Your brain is a wonderful thing, and with all its quirks, it enables you to build amazing stuff. Being aware of those little bugs will let you identify them and take action, helping you solve problems with much less trouble.
The next article will be the last in the series: a compilation of ideas about the brain that didn't really fit in the previous ones.
Thank you for reading!
What to do next
- Share this article with friends and colleagues. Thank you for helping me reach people who might find this information useful.
- This article is based on Pragmatic Thinking and Learning: Refactor Your Wetware. This and other very helpful books can be found in the recommended reading list.
- Send me an email with questions, comments or suggestions (it's in the About Me page)