From a post by Scott Alexander: Toward a Bayesian Theory of Willpower
This is something that matters to us all a thousand times a day, right? So I can’t say that I immediately know what a “Bayesian theory” would be, except something that has to do with statistics and probability, but I’m certainly interested in what Scott has to say about this.
He starts this way:
Five years ago, I reviewed Baumeister and Tierney’s book on the subject. They tentatively concluded it’s a way of rationing brain glucose. But their key results have failed to replicate, and people who know more about glucose physiology say it makes no theoretical sense. … Robert Kurzban, one of the most on-point critics of the glucose theory, gives his own model of willpower: it’s a way of minimizing opportunity costs. But how come my brain is convinced that playing Civilization for ten hours has no opportunity cost, but spending five seconds putting away dishes has such immense opportunity costs that it will probably leave me permanently destitute? I can’t find any correlation between the subjective phenomenon of willpower or effort-needingness and real opportunity costs at all.
Dishes I don’t mind, but do you realize that if you put off folding and putting away clean laundry, eventually you’ll wear all those socks again and can throw them back in the laundry basket without ever having to fold them at all? So actually the opportunity cost attaches to folding socks rather than to leaving them in a heap on top of the dryer. I’m just saying.
Scott goes on for a bit, and then, having summarized several other theories, concludes:
I’ve come to disagree with all of these perspectives. I think willpower is best thought of as a Bayesian process, ie an attempt to add up different kinds of evidence. … At the deepest level, the brain … is an inference engine, a machine for weighing evidence and coming to conclusions. Your perceptual systems are like this – they weigh different kinds of evidence to determine what you’re seeing or hearing. Your cognitive systems are like this, they weigh different kinds of evidence to discover what beliefs are true or false. Dopamine affects all these systems in predictable ways. My theory of willpower asserts that it affects decision-making in the same way – it’s representing the amount of evidence for a hypothesis.
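To make the "adding up different kinds of evidence" idea concrete, here is a minimal sketch of Bayesian evidence-combination, where independent pieces of evidence simply add in log-odds space. This is my own illustration, not anything from Scott's post: the specific probabilities, and the framing of "intellectual" versus "instinctive" evidence for the hypothesis "doing the dishes now is the best action," are invented for the example.

```python
import math

def log_odds(p):
    """Convert a probability into log-odds (additive evidence units)."""
    return math.log(p / (1 - p))

def posterior(prior, evidence_log_odds):
    """Bayesian update: independent evidence adds in log-odds space."""
    total = log_odds(prior) + sum(evidence_log_odds)
    return 1 / (1 + math.exp(-total))

# Hypothesis: "doing the dishes right now is the best action."
prior = 0.5                    # no initial preference either way
intellectual = log_odds(0.9)   # reasoned case: strongly in favor
instinctive = log_odds(0.2)    # gut feeling: mildly against

p = posterior(prior, [intellectual, instinctive])  # ~0.69, weakly in favor
```

On this toy picture, "low willpower" would correspond to the intellectual term arriving with too small a gain relative to the instinctive one, so that reasoned evidence moves the total less than it should.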
A long post follows, at the end of which Scott concludes:
I think the most immediate gain to having a coherent theory of willpower is to be able to more effectively rebut positions that assume willpower doesn’t exist, like Bryan Caplan’s theory of mental illness. If I’m right, lack of willpower should be thought of as an imbalance between two brain regions that decreases the rate at which intellectual evidence produces action. This isn’t a trivial problem to fix!
[Scott's post includes an optical-illusion image here: straight blue lines that appear bent. His caption:]
The lines here are perfectly straight – feel free to check with a ruler. Can you force yourself to perceive them that way? If not, it sounds like you can’t always make your intellectual/logical system overrule your instincts, which might make you more sympathetic to people with low willpower.
Those blue lines really are straight. I actually did get out a ruler and measure. I’m not sure this is a reasonable way to think about “making intellectual systems override instincts,” because in some ways instincts do seem like perception, but in other ways they really do not, and this is straight-up a visual illusion. But it’s still a fun illusion.