It’s hilarious that it’s taken me 32 years to learn just how blissfully unaware I was of my backend.
No, not my ass, but the stuff that goes on in my subconscious.
Between 9th grade (when I started enjoying learning) and completing college, I was living under the impression that soaking up knowledge, binge-watching powerful documentaries and reading important books brought me closer to my higher self.
Little did I know that was only occasionally true.
In his book, Thinking, Fast and Slow, Daniel Kahneman explains that our conscious mind is in charge of analytical thinking.
But it’s also sometimes busy, mostly starving, and often lazy.
And despite believing otherwise, we’re mostly running on autopilot, following the rules made by our subconscious.
After that realisation, you can’t continue living the same way, can you?
Not when you’re determined to get better at #adulting. And I believe that a big part of being an adult is having better judgement.
That’s especially important when you’re responsible for making decisions that can impact the careers of hundreds of people.
Our judgement is seriously fickle.
And remarkably so when based on memories or when we’re under pressure.
It’s safe to say that most management positions check both boxes.
Here are a couple of heuristics that are especially harmful to decision making at work.
Finding causality without evidence
Kahneman explains that our subconscious keeps track of what’s ‘normal’ by linking events, actions and outcomes that mostly happen simultaneously.
I’m allergic to tomatoes, and after several decades of bloating and nausea, I associate tomatoes with digestive issues.
Associative memory (tomatoes = nausea) bypasses critical thinking, leading me to find causality where it doesn’t exist.
It means that I’m more likely to believe that tomato soup is a horrible food without ever having tried it.
How many animals did Moses take into his ark?
Two of each kind, right?
According to Kahneman, most people skip to the answer without noticing the error in the question itself.
Moses didn’t build the ark. Noah did.
But when you know the story and remember that Moses and Noah are biblical characters, your subconscious skips the critical thinking bit.
This happens all the time, and we’re mostly oblivious to it.
We analyse to confirm what we already believe, not to disprove it
We like it when we’re right. And most of the time, we’re only looking for information that proves we’re right.
It’s called Positive Test Strategy, and it’s how our conscious mind works.
When uncertain, we like to bet on an answer that’s most likely to be true.
Our intuition (or gut feeling) is often correct.
And that’s where things get tricky because even the best of us have only limited experiences in life.
So, when making sense of anything new, we work backwards, starting with comparing what we don’t know with what we do know.
Despite our best efforts, our experiences are limited.
And relying on associative memory alone leads us to confirmation bias and the halo effect.
We don’t need much information before jumping to conclusions.
I also learnt of it from Thinking, Fast and Slow.
Here’s Alan and Ben, what do you think of them?
Alan: Intelligent – Industrious – Impulsive – Critical – Stubborn – Envious
Ben: Envious – Stubborn – Critical – Impulsive – Industrious – Intelligent
In most experiments, people believed that Alan’s stubbornness was justified because he’s an intelligent person, but that Ben’s intelligence made him dangerous because he’s envious and stubborn.
Nothing more than an identical list of adjectives, read first in one order and then in another, is enough to convince us that Ben is a Bond villain.
Associative memory creates a story that’s safe and familiar. But it doesn’t care much for the accuracy of data available to us.
And worse, an essential feature of associative memory is that it can only use what we remember. Anything else may as well not exist.
Our shared biases multiply our lapses in judgement
We are our biases. There’s no running away from it.
And our biases multiply when we’re working together.
That’s why police don’t interview witnesses to a crime at the same time in the same place.
As Kahneman explains, “Allowing the observers to influence each other effectively reduces the size of the sample, and with it, the precision of the group estimate.”
We feel confident when we believe in the consistency of the stories we tell ourselves.
And our biases add the illusion of completeness by plastering over the gaps.
But that’s not how we solve problems at work
At work, we solve problems in lengthy meetings with half-distracted people working with incomplete data.
We can’t leave our biases at the door, but we can keep them in check.
One way to do that is by asking people to write down pre-meeting memos.
It can be a summary of their point of view on an important issue before it’s openly discussed.
Asking your colleagues to share their thoughts in a meeting before a decision needs to be made doesn’t remove their personal biases, but it prevents groupthink from hijacking your meeting.