How do we make good decisions when trying to tackle tough social challenges? This has been a central concern of management experts, policy analysts and community developers for nearly 100 years.
A big barrier to good decision-making is managing our cognitive biases. While we may not be familiar with the term, we all have first-hand experience with the mental traps and emotional triggers that shape - and often distort - how we interpret and use data to make sense of the world around us. For example, a community safety committee may devote scarce volunteer hours to patrolling the streets to prevent vandalism in their neighborhood simply because vandalism was on the front page of yesterday's newspaper (i.e. "recency bias"). Or, teachers may be reluctant to let go of an old way of teaching math in order to try a new, as-yet-unproven approach (i.e. "loss aversion" or "status quo bias"). Or, you may recently have experienced the tendency to underestimate the time, energy and difficulty of getting your family in the car and to Easter celebrations at your mother's (i.e. "overconfidence bias"). Wikipedia now lists nearly 150 such biases.
In the last ten years, an impressive number of books have emerged that explore the role of cognitive biases in our day-to-day lives and how we might navigate them productively. Some of my favourites include: Predictably Irrational: The Hidden Forces That Shape Our Decisions by Dan Ariely; Bozo Sapiens by Michael and Ellen Kaplan; How We Decide by Jonah Lehrer; Nudge by Richard Thaler and Cass Sunstein; and, the most comprehensive of them all, Thinking, Fast and Slow by Daniel Kahneman.
I decided that I would write up a resource on the role of cognitive biases in community change efforts after completing my graduate research on evaluating social innovation. But alas, Tanya Beer and Julia Coffman - the prolific and experienced principals of the Center for Evaluation Innovation in Washington, DC - beat me to it.
When Shortcuts Cut Us Short: Cognitive Traps in Philanthropic Decision-Making is an excellent resource that explores the role cognitive biases play in the thinking and decisions of philanthropic organizations that invest in social innovation and community change. The authors begin by exploring why decision-makers of all kinds resort to mental shortcuts. They then describe the cognitive traps most commonly faced by philanthropists, some of which include:
The authors then recommend eleven techniques that funders can employ to counterbalance their own cognitive biases about their strategy and investments:
Learn more about these techniques in the paper.
Like all Center for Evaluation Innovation resources, When Shortcuts Cut Us Short is to the point, easy to read, and full of practical ideas. More importantly, it shines a light on one of the most significant, poorly understood and often deliberately ignored challenges for would-be social innovators: the necessity of slowing down, rather than speeding up, when wrestling with complex issues.