I recently completed the marathon that is “Thinking, Fast and Slow” by Daniel Kahneman, which got me thinking about bias. Throughout the book, Kahneman explores our propensity to rely on thinking shortcuts (heuristics) to make decisions. This habit minimizes effort on the part of our frontal lobe, and he distinguishes this fast, automatic approach from conscious, deliberate executive thinking with the terms “System 1” and “System 2,” respectively.
When presented with a decision, many of us recognize our default thought as, “How can I answer this based on what I already know?” We’ve adopted a preference for efficiency and rely on System 1 to mitigate System 2’s cognitive load. In a world where speed is paramount, we can come to conclusions quickly.
Whether we land on the right conclusion often has more to do with our awareness of the limitations of heuristics – and our willingness to challenge assumptions – than with our good intentions. Kahneman’s book goes on to detail study after study demonstrating how wrong we are when applying System 1 thinking to a System 2 problem.
This is where it gets interesting: When it comes to DEI, we also know that relying on our habitual thoughts really means depending on the messaging and socialized hierarchies of the last few hundred years (aka not our best work). These messages teach us to value maleness, whiteness, able-bodiedness, cisgender straightness, and a whole host of unconscious social norms.
To account for the flawed nature of our System 1 default, we’ve established everything from flight readiness checklists in the cockpit to systematic head-to-toe scans of newborns by multiple nurses. If we agree that we must force System 2 thinking for high-stakes decisions in these rigorous professions, then it should be no insult to apply the same objectivity to matters of diversity and inclusion. Huzzah! Right?
This may be old news, but man, is it worth repeating: We need DEI process reviews, programs, and goals that force us to evaluate our potential blind spots and reduce bias in our everyday decision-making. It’s just too easy to say “the interviewee wasn’t a culture fit” when what we’re actually responding to is our internalized, broken rulers.
I once heard a speaker say, “People are like bent carts,” calling to mind everyday struggles with shopping carts slowed by faulty wheels, flapping wildly and fighting our every effort to go in the intended direction. When it comes to our habits, biases, and System 1 reactions, we have to apply the same sustained correction we would apply to a bent cart if we are ever going to reach our more equitable and inclusive destination.
This work begins by acknowledging our bent towards exclusion, judgment, and all of the faulty, flapping heuristics we apply to each other. And then we’ve got to accept the cognitive load – the friction cost to efficiency and speed – that comes with creating the more just world we desire.