When Shortcuts Cut Us Short: Cognitive Traps in Philanthropic Decision-Making

Posted on April 2, 2015
By Mark Cabaj

How do we make good decisions when trying to tackle tough social challenges? This has been a central concern of management experts, policy analysts and community developers for nearly 100 years.


A big barrier to good decision-making is our cognitive biases. While we may not be familiar with the term, we all have first-hand experience with the mental traps and emotional triggers that shape - and often distort - how we interpret and use data to make sense of the world around us. For example, a community safety committee may devote scarce volunteer hours to patrolling the streets to prevent vandalism in their neighborhood simply because vandalism was on the front page of yesterday's newspaper (i.e. the "recency bias"). Or, teachers may be reluctant to let go of an old way of teaching math in order to try a new, as-yet-unproven approach (i.e. "loss aversion" or "status quo bias"). Or, you may recently have experienced the tendency to underestimate the time, energy and difficulty involved in getting your family into the car and to Easter celebrations at your mother's (i.e. the "overconfidence bias"). Wikipedia now lists nearly 150 such biases.

In the last ten years, an impressive number of books have emerged that explore the role of cognitive biases in our day-to-day lives and how we might navigate these biases productively. Some of my favourites include: Predictably Irrational: The Hidden Forces That Shape Our Decisions by Dan Ariely; Bozo Sapiens by Michael and Ellen Kaplan; How We Decide by Jonah Lehrer; Nudge by Richard Thaler and Cass Sunstein; and, the most comprehensive of them all, Thinking, Fast and Slow by Daniel Kahneman.

I decided that I would write up a resource on the role of cognitive biases in community change efforts after completing my graduate research on the topic of evaluating social innovation. But alas, Tanya Beer and Julia Coffman - the prolific and experienced principals of the Center for Evaluation Innovation in Washington, DC - beat me to it.

When Shortcuts Cut Us Short: Cognitive Traps in Philanthropic Decision-Making is an excellent resource that explores the role that cognitive biases play in the thinking and decisions of philanthropic organizations that invest in social innovation and community change. The authors begin by exploring why decision-makers of all kinds resort to mental shortcuts. They then describe the cognitive traps most commonly faced by philanthropists. Some of these include:

  1. Confirmation Bias - A tendency to seek information that confirms our existing beliefs and opinions while ignoring data that challenges them (e.g. I believe that wrap-around services are the answer to helping at-risk kids finish high school, regardless of the mountain of data that suggests they can address only some of the factors that influence this outcome).
  2. Escalation of Commitment - A continued commitment to an idea, direction or decision even when the data suggest it is no longer worth supporting (e.g. we have already invested $500,000 in this project, so let's keep going to see if we can't make it work).
  3. Availability Bias - A pattern of recalling immediate or easy-to-remember examples or incidents that relate to a discussion or decision (e.g. I once attended a board meeting where the members discussed how much time staff spent reporting to different funders for little pots of money - that is why I think we really should find a way to align our reporting requirements with other funders).
  4. Groupthink - When the urge for harmony and consensus in a group limits its members' authentic appraisal of alternative ideas or viewpoints (e.g. I suspect that our shift to funding a few larger mainstream agencies to tackle this problem will cause a lot of friction and trouble for smaller - but more culturally-responsive - agencies, but everyone seems to be ok with this and I don't want to rock the boat with my colleagues).

The authors then recommend eleven techniques that funders can use to keep their own cognitive biases in check when developing strategy and making investments:

  1. Use Devil's Advocacy
  2. Invite an Outsider's Perspective
  3. Look for Disconfirming Evidence and Ask for the Bad News
  4. Focus on Trends Rather than Individual Experiences
  5. Remind Yourself What You Do Not Know
  6. Play Out Alternative Perspectives and Solutions
  7. Build Forward Estimations into Processes
  8. Encourage Course Corrections
  9. Develop Decision Teams That Include More Than the Original Decision-Makers
  10. Build Earlier Check-ins into the Strategy Approval Process
  11. Reduce Upfront Strategy Planning Time in Favor of Ongoing Strategy Development

Learn more about these techniques in the paper.

Like all Center for Evaluation Innovation resources, When Shortcuts Cut Us Short is to the point, easy to read, and full of practical ideas. More importantly, it shines a light on one of the most significant, poorly understood and often deliberately ignored challenges for would-be social innovators: the necessity of slowing down, rather than speeding up, when wrestling with complex issues.

Topics:
Community Innovation



By Mark Cabaj

Mark is President of the consulting company From Here to There and an Associate of Tamarack. Mark has first-hand knowledge of using evaluation as a policy maker, philanthropist, and activist, and has played a big role in promoting the emerging practice of developmental evaluation in Canada. Mark is currently focused on how diverse organizations and communities work together to tackle complex issues, on social innovation as a "sub-scene" of community change work, and on strategic learning and evaluation.
