“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”—Mark Twain*
When philanthropists are passionate about the causes they support, it is no surprise that they tend to partner with organizations that share their values and goals. While this alignment can produce strong partnerships and collaborations, it also opens the door to confirmation bias: wanting things to be true even when the evidence suggests otherwise. As philanthropy shifts its focus from individual grants and programs to investing in broader systemic change, addressing confirmation bias becomes all the more urgent, because the tools, practices, and cultures for evaluating evidence and learning have not kept pace with this shift in scale. Compared with other sectors, philanthropy lacks robust mechanisms for tackling these cognitive biases and for attending to evidence that a program or intervention may not be achieving its intended impact or generating the desired outcomes.
To put it simply, “disconfirming evidence” is any information, data, or findings that challenge or contradict the assumptions or theories used to design a program or make an investment. It exists on a spectrum: sometimes a piece of evidence obviously and fundamentally challenges core assumptions; sometimes it is difficult to interpret in the moment (though still important not to dismiss); and sometimes it is a subtle, weak signal that things are not progressing as you thought they might, and therefore easy to ignore.
If such evidence is acknowledged and acted upon, it helps guard against confirmation bias and leads to better-designed, better-funded programs. But philanthropy has few established practices for attending to, much less acting on, disconfirming evidence. Indeed, there are often incentives to ignore it, ranging from risk aversion and over-investment in a particular hypothesis to a desire to maintain established relationships, or simply a passion for an outcome.
The issue is not unique to philanthropy, but in other sectors the cost of being wrong is often much higher, and the harms tend to be specific, identifiable, and consequential: ignoring disconfirming evidence can lead to a wrongful conviction, a misdiagnosed patient, an inaccurate newspaper story, or a software failure. In philanthropy, by contrast, causes and effects can be hard to establish, so the costs of disregarding disconfirming evidence are often less clear. But even if the harms are ambiguous, they are no less consequential.
In other sectors – from medicine to the law, from business to scientific research – practices, structures, and cultures have been established to confront the challenge of disconfirming evidence explicitly:
Practices: In scientific research, well-established practices like peer review, together with external assessment mechanisms like replication studies, provide a critical foundation for ensuring the accuracy and reliability of findings. Training in statistical and research methods helps establish a baseline for what counts as robust scientific evidence. Journalists have professional norms for managing confirmation bias, with clearly established best practices for fact-checking, sourcing, transparency, and fairness captured in codes of practice and editorial guidelines. And in medicine, the practice of differential diagnosis helps professionals systematically identify the correct condition by considering and ruling out multiple potential diagnoses based on gathered evidence. This guards against confirmation bias by ensuring that all possible causes are evaluated objectively rather than relying on initial impressions or assumptions.
Structure: The legal system is a good example of a system designed to consider all forms of evidence and to judge, on balance, which evidence should be relied upon. Experts present their arguments within an explicitly adversarial system, and the whole structure of the law is designed to evaluate which evidence is most compelling. Mechanisms like appeal processes mitigate the risk of errors in practice and behavior, providing formal avenues for reevaluating decisions and challenging initial conclusions. In the finance sector, particularly in venture capital, the difficulty of managing confirmation bias is acknowledged and tackled at the portfolio level: investment committees rigorously stress-test individual decisions until the collective is satisfied that the risks are acknowledged and manageable. This process ensures that even if individual investments are risky, the overall portfolio remains robust and capable of weathering potential failures. In a system where confirmation bias may be unavoidable (say, in uncertain environments like early-stage funding), portfolios are set up in recognition of that risk.
Culture: Fostering inquisitiveness and openness to challenging long-held beliefs is essential in business, where competition punishes those who fall prey to confirmation bias. Businesses succeed in competitive markets when they view disconfirming evidence positively, as a tool for improvement, and are open to diverse perspectives and new information. In the law, confirmation bias is identified as a risk early in legal training, and cultural norms are explicitly set down, especially for the judiciary. Similarly, in medicine, the standards of evidence-based medicine and ongoing professional development challenge practitioners to enrich their knowledge and update their core assumptions.
None of these systems is perfect—witness the debates over the replication crisis in science or the fights over professional journalistic standards—but in each case, attention to disconfirming evidence is explicitly built into the profession, and confirmation bias is treated as a problem that needs to be actively managed and addressed.
What might philanthropy’s version of peer review, double-sourcing, judicial appeal, or a cross-disciplinary investment committee be? What might be the equivalent practices, structures, and cultures?
We see three opportunities to tackle disconfirming evidence in philanthropy: learning, change, and culture.
1. Learning: We have to equip teams with the incentives, tools, and frameworks to identify and discuss disconfirming evidence. Teams need to take steps to encourage open dialogue and critical thinking, fostering a culture of continuous improvement. This is not just about considering the evidence, but about where the evidence came from.
2. Change: We must be more explicit about implementing organizational strategies that prioritize acknowledging and integrating disconfirming evidence. This requires a shift in mindset and a commitment to transparency and accountability. The more we think about systems change, the more we need to think about disconfirming evidence at the level of the whole portfolio rather than the individual grant.
3. Culture: We need to foster a culture that values humility and diverse perspectives and embraces the uncertainty of working in complex systems. Our work requires experimentation, and our environments should treat disconfirming evidence as an opportunity for growth rather than a threat.
This kind of thinking is new for our sector. We have seen isolated examples of good practice, but we do not have a rich supply of case studies or established norms to draw upon. However, we can build on best practice in other sectors. Below we suggest some practical experiments to start making disconfirming evidence a more central part of philanthropic work.
1. Learning. Equip teams with tools and frameworks that encourage open dialogue and critical thinking.
2. Change. Build processes and structures that acknowledge the importance of disconfirming evidence.
3. Culture. Foster humility and embrace uncertainty, building a culture that treats disconfirming evidence as an opportunity for growth.
Grantmaking always involves strategic uncertainty: when working in complex systems, we must recognize that our mental models may be incomplete or flawed. The work often involves experimentation, and by definition, some experiments will not yield the expected results. In this context, we should view disconfirming evidence not as a setback but as a valuable opportunity to learn, to test whether our hypotheses are still relevant, and to refine our approaches. Embracing this uncertainty and the iterative nature of our work can lead to more robust strategies and, ultimately, greater impact.
This is a serious sectoral challenge. Disconfirming evidence in other professions is addressed by the profession as a whole: the norms in law and science are more effective for being set and understood across the sector. We need to think more about training, norms, and commonly understood practices across philanthropy as a whole.
*This quote is widely attributed to Mark Twain. It has been used, among other places, in the introduction to An Inconvenient Truth. It captures the sentiment of this article beautifully. However, there’s no evidence to suggest that Mark Twain said it. In fact, there’s plenty of disconfirming evidence to suggest he did not say it. So what do we do? Not use the quote? Not attribute it? Put it in the footnotes? Build the article around it? This is the problem of disconfirming evidence, illustrated.
Read more stories by Kecia Bertermann & Andy Martin.