Sam McNerney:

But then something unexpected occurred. Suddenly and paradoxically, I saw the world only through the lens of this research. I made inaccurate judgments, drew illogical conclusions, and was irrational about irrationality, because I filtered my beliefs through the literature on decision-making: the same literature, I remind you, that warns against the power of latching onto beliefs. Meanwhile, I naively believed that knowledge of cognitive biases made me epistemically superior to my peers (just as knowing Sartre had made me more authentic).

Only later did I realize that learning about decision-making gives rise to what I term the confirmation bias bias: the tendency to form a narrow (and pessimistic) conception of the mind after reading literature on how narrowly the mind thinks. Biases restrict our thinking, but learning about them should do the opposite. Yet you’d be surprised how many decision-making graduate students and overzealous Kahneman, Ariely, and Gilbert enthusiasts do not understand this. Knowledge of cognitive biases, perhaps the most important thing to learn for good thinking, actually increases ignorance in some cases. This is the Sartre Fallacy: we think worse after learning how to think better.

I can’t recall any particular examples off the top of my head, but this theme is prevalent in a lot of writing about Zen, too: if you think you’ve become enlightened, it’s a pretty good sign you haven’t. But if you think that denying you’re enlightened therefore means you are, you’re wrong as well.