There’s an old saying, “If you don’t stand for something, you’ll fall for anything.” Unfortunately, it’s probably got things exactly backward. The more you stand for and the more beliefs you strongly hold, the more likely you are to fall for anything—anything that confirms your existing beliefs.

There’s only one way to fight back, to not fall for anything: skepticism. Skepticism about everything—what you “know,” what you believe, even your own motives—is one of the most critical habits of mind to cultivate (I believe, but take that with a grain of salt).

The critical importance of skepticism as a habit of mind has been brought home to me in countless ways over the last few months. First, as I read through Thinking, Fast and Slow, Daniel Kahneman’s summary of a lifetime of work on cognitive biases, I have to constantly remind myself that these biases apply to me. Even as I read, my mind is twisting the facts into a state more palatable for my own self-image.

Second, as I was reading and preparing to review Resilience for the fall issue of Stanford Social Innovation Review, I read a spate of reviews of similar “pop” science books. Almost all of the reviews, no matter the book or the author, carried common complaints that complexities were oversimplified, contrary evidence was ignored, and facts were overstated. I realized that the true problem the critics were identifying was not a common fault of character in the authors of pop science books, but a fault of character in the readers of pop science (and everything else): We carry the same suspension of disbelief, the same suspension of skepticism, into our reading that we carry into summer movie theaters. As a result, we allow ourselves to be swayed far too easily by the simple explanations and overstated facts we encounter.


Third, as the final shoe has dropped on the Jonah Lehrer saga in the last week (he massaged and potentially made up quotes to support his thesis in his recent book Imagine), I began to wonder what would lead anyone—including me—to play fast and loose with the facts. While I can’t speak for Lehrer, as a writer it’s easy to recognize the temptation to make your case just a bit stronger when you are convinced of the soundness of your core point. Even worse, it’s entirely possible to hear something different from what is actually being said. For instance, in the recent past I cited something I learned from a terrific book on entrepreneurship by Scott Shane—except that my mind had skipped over the “not” in one of Shane’s points. And so I—believing I was right and sharing facts—was in fact writing the exact opposite. Thankfully a reader caught my error, or I would still be living with a wrong fact in my head.

This all matters very much to leaders in the social sector because we are ultimately trying to get people to change—to change their minds, to change their circumstances, to change their actions—and thereby change the world. As Duncan Watts points out, social change isn’t rocket science. It’s actually much harder, with less data and fewer constants.

That’s why it’s so important to remain skeptical, to reexamine assumptions, to consider alternatives. If we are going to make progress, we have to be willing to acknowledge and confront our cognitive biases: to question what we are doing and why, how we are doing it and where, who we are working with and when. We have to lose the courage of our convictions and be open to being wrong.

Remaining skeptical doesn’t mean that you have to become a cynic. Nor does it mean that you give up on core convictions. It just means being willing to examine and test those convictions and assumptions, and to confirm that they are worth holding.

