A repost, but I figure you people will appreciate this:
Anosognosia, a condition in which extremely sick patients mysteriously deny their sickness, occurs during right-sided brain injury but not left-sided brain injury. It can be extraordinarily strange: for example, in one case, a woman whose left arm was paralyzed insisted she could move her left arm just fine, and when her doctor pointed out her immobile arm, she claimed that was her daughter's arm even though it was obviously attached to her own shoulder. Anosognosia can be temporarily alleviated by squirting cold water into the patient's left ear canal, after which the patient suddenly realizes her condition but later loses awareness again and reverts to the bizarre excuses and confabulations.
Anosognosia is but one of the monothematic delusions in the class Yvain is addressing; another is the Capgras delusion.
In the Capgras delusion, the patient, usually a victim of brain injury but sometimes a schizophrenic, believes that one or more people close to her have been replaced by identical imposters. For example, one male patient expressed the worry that his wife was actually someone else, who had somehow contrived to exactly copy his wife's appearance and mannerisms.
What do these types of delusions (and some forms of schizophrenia) have in common? Generally, damage to the right frontal lobe of the brain. Beyond that, several different classes of theories have been advanced. There is, for example, a "single factor" theory based on "belief shift" for anosognosia.
Ramachandran suggested that the left brain is an "apologist", trying to justify existing theories, and the right brain is a "revolutionary" which changes existing theories when conditions warrant. If the right brain is damaged, patients are unable to change their beliefs; so when a patient's arm works fine until a right-brain stroke, the patient cannot discard the hypothesis that their arm is functional, and can only use the left brain to try to fit the facts to their belief.
Alas, a "belief shift" theory of this sort, while plausible in the case of anosognosia, doesn't seem to work well for other, similar monothematic delusions with similar patterns of brain damage. One might plausibly have difficulty shifting from the belief that one's leg is healthy to the belief that one's leg is unhealthy based on a single type of damage. But what about the belief that one's spouse is a body-snatcher type alien? It's an unreasonable belief on its face. It's not simply that the patient might have damage that makes them unable to update their beliefs to fit a new reality; an overarching theory must also explain why the patient gives any sort of credence to beliefs that fit no reality.
Enter "two factor" theories. These posit that there is both damage that prevents belief updating (e.g. nerve damage that prevents the brain from feeling emotion on seeing one's spouse) and a second sort of damage that forces a fairly wacky explanation for the sensory input produced by the first sort of damage (e.g. I feel nothing, so my wife must actually be a really clever imposter!). This gets around the fact that "single factor" theories, while viable for some simple delusions, would in other cases require the patient to be (to quote Yvain) "really stupid". After all, if confronted with a lack of emotional or physical response, you'd have to be pretty dumb/gullible to leap to the idea that "my wife is actually a disguised spy" or "my hand is being controlled by aliens". Given that the distribution of intelligence in these patients seems to be an otherwise normal one, this is unreasonable. Additionally, many patients with similar forms of brain damage do not develop delusions at all, but simply soldier on with a handicap like not being able to feel their arm.
The difference appears to lie in the presence or absence of another, more specific form of damage, to the right dorsolateral prefrontal cortex (RDPC). It is only those with damage to the RDPC who appear to develop not just sensory issues, but wacky theories to explain those sensory issues to themselves. The earlier theory then takes the RDPC to be part of a belief evaluation module in the brain, and posits that damage to it makes it difficult to update one's beliefs.
In his first papers on the subject, Coltheart vaguely refers to the RDPC as a "belief evaluation" center. Later, he gets more specific and talks about its role in Bayesian updating. In his chronology, a person damages the connection between face recognition and emotion, and "rationally" concludes the Capgras hypothesis. In his model, even if there's only a 1% prior of your spouse being an imposter, if you're 1000 times more likely to feel nothing toward an imposter than toward your real spouse, you can "rationally" come to believe in the delusion.
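You can check that arithmetic with a quick odds-form Bayes calculation. (A sketch only: the 1% prior and the 1000:1 likelihood ratio are the figures from the paragraph above; the function name and everything else is just illustration.)

```python
def posterior(prior, likelihood_ratio):
    """Posterior probability via Bayes' rule in odds form.

    likelihood_ratio = P(evidence | hypothesis) / P(evidence | alternative)
    """
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Coltheart's numbers: a 1% prior on "my spouse is an imposter",
# and "no emotional response" taken as 1000x more likely under the
# imposter hypothesis than under the real-spouse hypothesis.
p = posterior(0.01, 1000)
print(round(p, 2))  # 0.91
```

So on Coltheart's numbers the "rational" posterior really does land above 90% — a single extreme likelihood ratio swamps a modest prior.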
But as Yvain notes, this is still problematic. People don't walk around randomly thinking that their wife is a spy or their arm belongs to an alien, so you're still left with the puzzle of why such an unlikely belief might get into position in the first place even if you can explain why it can't be taken out later.
Enter the McKay theory, which posits that the problem is with the patients' ability to use Bayesian priors. For those of you unfamiliar: in simplified fashion, the idea in Bayesian statistics is that current evidence should be taken not in isolation, but as modifying your prior beliefs about the situation. So if you happened to get a bit of evidence that, say, the President was born in Kenya or Mitt Romney never paid any taxes, you should (unless you had already developed very strong priors in that direction based on other evidence) modify your belief in the likelihood of such a thing only a little. But our RDPC patients apparently can't do that, which results in what Yvain calls the Super Base Rate Fallacy.
For them, the only important criterion for a belief is explanatory adequacy. So when they notice their spouse's face no longer elicits any emotion, they decide that their spouse is not really their spouse at all. This does a great job of explaining the observed data - maybe the best job it's possible for an explanation to do. Its only minor problem is that it has a stupendously low prior, but this doesn't matter to the patient, who is no longer able to take priors into account.

This also explains why the delusional belief is impervious to new evidence. Suppose the patient's spouse recounts personal details of their honeymoon that no one else could possibly know. There are several possible explanations: the patient's spouse really is the patient's spouse, or (says the left-brain Apologist) the patient's spouse is an alien who was able to telepathically extract the relevant details from the patient's mind. The telepathic alien imposter hypothesis has great explanatory adequacy: it explains why the person looks like the spouse (the alien is a very good imposter), why the spouse produces no emotional response (it's not the spouse at all), and why the spouse knows the details of the honeymoon (the alien is telepathic). The "it's really your spouse" explanation only explains the first and the third observations.
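The pattern McKay describes can be sketched as two ways of picking the "best" hypothesis: by likelihood alone (explanatory adequacy only) versus by prior-weighted likelihood. Every number below is made up purely for illustration; nothing is taken from the literature.

```python
# Hypotheses for the evidence "face looks right, no emotional
# response, knows the honeymoon details". Illustrative numbers only.
hypotheses = {
    "real spouse":               {"prior": 0.99, "likelihood": 0.001},
    "telepathic alien imposter": {"prior": 1e-9, "likelihood": 0.9},
}

def best_by_likelihood(hs):
    # The damaged evaluator: judge only how well each hypothesis
    # explains the data, ignoring priors entirely.
    return max(hs, key=lambda h: hs[h]["likelihood"])

def best_by_posterior(hs):
    # The intact Bayesian: rank by prior * likelihood
    # (the unnormalized posterior).
    return max(hs, key=lambda h: hs[h]["prior"] * hs[h]["likelihood"])

print(best_by_likelihood(hypotheses))  # telepathic alien imposter
print(best_by_posterior(hypotheses))   # real spouse
```

The two scoring rules agree on everything except the one thing that matters: without the prior term, the hypothesis that explains the data best wins no matter how absurd it is.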
I think this theory has a certain elegance to it. Additionally, it begins (if correct) to isolate part of the brain as being involved in higher-order functions like prediction and decision making under uncertain conditions. Very cool stuff!