"An emphasis on using 'evidence-based practices' is stifling experimentation." This was the statement I posed in a poll within my last blog, back in February 2015, just before I got sucked into a vortex of Federal grant writing from which I am only now extracting myself. The results are in, and a full 77% of respondents agreed or strongly agreed that the statement is true.
Before we run out and create an "evidence-based practice" of wild experimentation on the basis of this finding, however, it is important to keep two things in mind. First, this was a highly unscientific poll that was not intended to ground a new discipline but only to stimulate dialogue...which it did. Second, I am not really a "best practice" or "evidence-based" curmudgeon, but neither am I an uncritical fan. On some days, I may even be more critic than fan.
In fact, we need "best practices" and "evidence-based practices." I was particularly taken by the comments of my friend Andy Penziner who offered this defense of evidence-based practices in a comment on my blog at www.nonprofitgp.com:
First, evidence-based solutions/best practices would seem preferable to pet solutions or random practices. Second, context and generalizability should always be acknowledged and considered. Third, a creative, open mind should never be stifled in favor of blind deference to whatever the best practice d'jour might be; they can coexist. Finally, as for pleasing funders or conforming to their priorities...well, it's kind of a fact of life, eh!
I would like to add two additional points to these. One is that there are some situations in which "evidence-based practices" are the best and only practices you absolutely want. For example, do you want to see a doctor who is not using evidence-based medicine in providing your care? Probably not. Do you want to live in a high-rise building that has not been built to the standards of evidence-based architecture and construction? No way! Do you want to fly down the highway in heavy traffic inside an automobile that has not been built to evidence-based standards and carefully tested? Absolutely not.

Keep in mind that my previous blog was a bit of a rant about using "best practices" and "evidence-based practices" to address complex social problems. A complex social problem is one that eludes solutions proposed by "best practices" and "evidence-based" solutions because it shares the characteristics of a complex adaptive system: it is dynamic; it has many interdependent agents or factors; one change in the system effects changes throughout the whole system; and it is robust in its ability to do all of these things. Within complex social problems, there may be a place to use some "best practice" or "evidence-based" interventions for very specific purposes. However, to believe that one or two or even three or four "evidence-based" interventions can solve the whole of the problem is just wrong thinking. It is also to commit the error Andy warns about: failing to acknowledge the role of context.
The other thing I would like to add to Andy's list is that "best practices" and "evidence-based practices" also have useful historical value. They tell us what did and did not work well in the past, which may have value for our current situation. Considered in this light, "best practices" and "evidence-based practices" can suggest to us "better practices that may work," though they offer no guarantees of working in our situation. I bristle against "best practices" and "evidence-based practices" only when they are presented as the "solution" regardless of the context, which, in the case of social problems, is usually complex.
I have become increasingly fond of the idea of "better practices that may work." This allows me to feel comfortable standing in both the worlds of "evidence-based" practice and "what if" experimentation. On the one hand, it allows me to consider the evidence of proven and best practices. On the other, as Andy indicates, it helps me to keep a creative, open mind; always consider the context; and avoid uncritically adopting the evidence-based practice of the moment.
The key word in the phrase "better practices that may work" is "may." "May" does not offer the guarantees of "will." To say something may work is to say just as clearly that it may not work, which is a loaded proposition for many folks.
It is loaded with the risk of failure. It is loaded with the humility required to admit that one does not have all the answers. It is loaded with the requirement to engage in the uncertainty, angst, and, some would say, joy and excitement, of "what if" experimentation.
Over the past few months I have been compiling "what if" experiments related to community engagement on complex social problems, and I have been discussing them and exploring their implications with increasing regularity with my clients. If you work with communities to address such problems, here are a few of my questions to help you think of your own:
- What if...people with lived experience of the social problem we are trying to address were really welcomed into our coalitions, leadership teams, and other planning groups? (As my friend Tommy Ross has said, "There is a big difference between an invitation and a welcome.")
- What if...that welcome included having the same decision-making power as the rest of us?
- What if...we valued and prioritized relationship building and social networking as community engagement strategies more than using social media and marketing?
- What if...we focused more on creating community ownership of change than "buy in" to the change?
- What if...we used principles to guide our work rather than checklists, protocols, and performance measures?
- What if...we were to build trust before trying to change things?
- What if...we shared the leadership and did not insist on being out front?
- What if...we were to conduct evaluation that is focused on developing a better effort rather than measuring achievement of outcomes?
- What if...we were to embrace the risk of "better practices that may work"?
Email Tom: firstname.lastname@example.org
Visit Tom's website: www.tomklaus.net