Author Abstract
A common critique of models of mistaken beliefs is that people should recognize their errors after making observations they thought were unlikely. This paper develops a framework for assessing when a given error is likely to be discovered, in the sense that the error-maker will deem her mistaken theory implausible. The central premise of our approach is that people channel their attention through the lens of their mistaken theory: a person may ignore or discard information her mistaken theory leads her to consider unimportant. We propose solution concepts embedding such channeled attention that predict when a mistaken theory will persist in the long run even with negligible costs of attention, and we use this framework to study the “attentional stability” of common errors and psychological biases. While many costly errors are prone to persist, in some situations a person will recognize her mistakes via “incidental learning”: the data she values, given her mistaken theory, happen also to tell her how unlikely her theory is. We investigate which combinations of errors, situations, and preferences tend to induce such incidental learning and which instead render erroneous beliefs stable. We show, for example, that a person may never realize her self-control problem even when it leads to damaging behavior, and may never notice the correlation in others’ advice even when that failure leads her to follow repetitive advice too heavily. More generally, we show that for every error there exists an environment in which the error persists and is costly. Uncertainty about the optimal action paves the way for incidental learning, while dogmatism about it creates a barrier.
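The channeled-attention mechanism can be made concrete with a small simulation. The sketch below is purely illustrative and not from the paper: a toy agent whose mistaken theory holds that a coin is fair when it is actually biased. Under full attention the retained flips eventually make her theory look implausible; under channeled attention her theory marks the flips as unimportant, so she never retains the evidence that would reveal her error. All names and parameter values are hypothetical.

```python
# Illustrative sketch only (not the paper's model): channeled attention vs.
# full attention for an agent with a mistaken "fair coin" theory.
import random

random.seed(0)

TRUE_HEADS_PROB = 0.8     # actual data-generating process (assumed for illustration)
THEORY_HEADS_PROB = 0.5   # the agent's mistaken theory: "the coin is fair"
N_PERIODS = 200
REJECTION_MARGIN = 0.2    # deviation at which the theory is deemed implausible


def deems_theory_implausible(attends_to_flips: bool) -> bool:
    """Simulate N_PERIODS flips and report whether the agent rejects her theory."""
    heads_seen = 0
    flips_retained = 0
    for _ in range(N_PERIODS):
        heads = random.random() < TRUE_HEADS_PROB
        if not attends_to_flips:
            # Channeled attention: the theory says the flip is irrelevant to
            # the decision at hand, so it is discarded without examination.
            continue
        flips_retained += 1
        heads_seen += heads
    if flips_retained == 0:
        return False  # no retained data can ever contradict the theory
    observed_freq = heads_seen / flips_retained
    return abs(observed_freq - THEORY_HEADS_PROB) > REJECTION_MARGIN


print("Full attention rejects the theory:", deems_theory_implausible(True))
print("Channeled attention rejects the theory:", deems_theory_implausible(False))
```

In this toy setup the full-attention agent rejects the fair-coin theory, while the channeled-attention agent never does, which mirrors the abstract's contrast between incidental learning and attentionally stable errors.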
Paper Information
- Working Paper Publication Date: June 2018
- HBS Working Paper Number:
- Faculty Unit(s): Negotiation, Organizations & Markets