are delusions pathologies of belief?



I've not previously devoted much time to cognitive neuropsychiatric theories of delusion, and I really need to put this right. What follows considers the cognitive neuropsychiatric theorisation of delusion offered by Anne Aimola Davies & Martin Davies (henceforth: Davies) in their 'Explaining Pathologies of Belief', which appears as chapter 15 of Matthew Broome & Lisa Bortolotti's (eds) Psychiatry as Cognitive Neuroscience - Mary Warnock's 'book of the year' in 2009. A value of this paper is that it aims to provide a general theory, or at least theoretical framework, for delusion in psychiatric disorders, rather than restrict itself, as much of this tradition has done, to the somewhat obscure monothematic delusions found in rather rare conditions such as Capgras or Cotard syndromes.

Davies' approach is firmly rooted in the methods and assumptions of cognitive neuroscience, and as such belongs to a tradition (call it the 'cognitive tradition') which aims to build the psychological theory required to understand psychopathological phenomena by drawing both on general psychological models of cognition and on empirical investigation of specific neurological impairments, so that this rather rigorously developed theory can then be applied to clinical cases. One could say that it's an attempt to grasp the psychopathological phenomena 'from the outside', and is of a piece with the 'scientist-practitioner' model in clinical psychology.

This can be contrasted with (call it) the 'clinical tradition' which I would say has been generated mainly by clinicians out of an immersion in psychopathological phenomena, with an attempt to do justice to the phenomenology and dynamics of the phenomena they encounter there. This often rather less rigorously developed tradition has flowered intellectually principally in phenomenological and psychoanalytical forms, but mainly ticks along in more intuitive and less articulated forms in much clinical psychiatric practice. At times the clinical tradition uses theory and science to help refine or validate its own deliverances, but it is primarily experience- rather than science- led. In the UK I think it's fair to say that the cognitive tradition has a place in certain universities (in London and Oxford for example) and to some degree in their associated clinics, and the clinical tradition more or less ticks along all over the place, although fairly often in rather impoverished (e.g. in excessively procedural or medicalised or legalised or target-obsessed) forms that sap out of it much of its wisdom.

At any rate, I ought to own that I have so far found my happy home in what I'm calling the clinical rather than the cognitive tradition. (En passant: Risks of the clinical tradition: bad science, introversion, confirmation bias, failure to distinguish expert clinical judgement from inherited prejudice. Risks of the cognitive tradition: unwittingly attending to the wrong phenomena, lack of a feel for the material, theorisation unconstrained by intuition born of engaged encounters, importing of dubious epistemological and metapsychological assumptions from cognitive science into psychopathology, etc.) And so what follows is something of an attempt to peer over the garden wall at the rather different flowers growing on the other side.

Davies' first sentence has it that 'In a case of delusion, belief goes wrong.' They continue: 'Delusions are pathologies of belief.' The rest of their investigation - and, we might note, the investigations of other cognitive-tradition delusion theorists - follows from this point. I think it is fair to say that they take it completely for granted. The thing is, though, that the clinical tradition really hasn't taken this for granted at all. I myself don't take it for granted; in fact I am at present intuitively inclined to disagree with it.

To be sure, everyone would agree that delusions are at least sometimes (e.g. when they're not moods or perceptions) pathological forms of belief (well - excepting those who find delusions so wanting by way of the allegedly normal and essential properties of belief (Davidson's constitutive principle of rationality etc.) that they are reluctant to label them as such - although even in these cases they would, one imagines, be happy to talk about 'delusional [i.e. in some way totally knackered] beliefs'). But just because delusions are pathological beliefs does not mean that they are pathologies of belief. It does not follow, that is, that delusion must be understood as consequent on, or as maintained by, disturbances in the processes by which beliefs are normally formed and maintained. Maybe delusions often have their home in a quite different set of mental functions, functions that are not geared up to making sense of the world around one, and maybe what drops out of the end of these functions merely masquerades as a belief of the normal sort. Perhaps, for example, delusions are ersatz beliefs formed by dreaming, and the delusional mind has become unable, in some specific domain or topic, to distinguish dreaming from thinking. I'm not sure that this would be quite right, but you get my drift: it doesn't have to be about wonky attempted grasps of the world, wonky attempted sense-makings of, or uncritical uptake of, one's also possibly wonky experience. I've no doubt that having various cognitive deficits in the faculties which support normal veridical belief formation and maintenance may be a big help to the delusional subject aiming, as it were, to cleave to their delusion. Whether the origin of your typical schizophrenic delusion can be understood in such general cognitive terms is, however, a different matter.

Let me just provide a nutshell summary of Davies' theory. It is a two-factor, three-stage theory. Or rather it is a theoretical framework, since it is suggested that different delusions will have different causes - they will find their own home somewhere in this two-factor, three-stage account. The first factor: how do delusions come about? The second: how are they maintained? (The consideration here is that there will be a different explanation in the two cases.) The first stage: experience. The second: hypothesis generation about what is experienced. The third: endorsement of a hypothesis and creation of a belief. Delusions may involve the straightforward endorsing uptake of abnormal experience, or they may represent manqué attempts to explain abnormal experience. At the second stage, attributional biases, jumping to conclusions, a failure of pre-existing beliefs to constrain the uncritical uptake of abnormal experiences, the ignoring of alternative explanations, and so on are offered as explanations of what the first factor could consist in. These are largely presented as personal-level phenomena, but Davies also accept that subpersonal processes can find a place in the same (now really rather broad) framework. The second factor includes the following: 'Some patients may fail to reject their false belief because they do not make proper use of available disconfirming evidence, others because they do not take proper account of the belief's implausibility' (15.5.1). A Capgras patient (spouse-impostor delusion) fails to notice their belief's implausibility; a Cotard patient ignores what we would normally think of as evidence of life. Working memory and inhibitory executive processes are required for the evaluation of beliefs. An anosognosic patient, for example, may continue to have illusions of movement in their paralysed arm. If they also have right frontal damage then they may struggle to hold on to the idea that this really is illusory, and be left with delusional beliefs about their capacity to move it.


Now this is all intriguing stuff. My first (psychological) thought, however, is that it would be nice to see how it applies to typical (i.e. schizophrenic, manic and depressive) delusions, and not just to the 'more neurological' cases. As yet I'm also more persuaded by the second factor than by the first. My second (philosophical) thought is that I'm somewhat unsettled by the epistemology that appears to be embedded in this framework. What I have in mind, in particular, is the rather 'empiricist-theory-of-the-mind-ish' set of ideas that maps various human capacities onto stages of an inner process. The notion, for example, that forming perceptual beliefs involves the having of experiences, followed by the entertaining of various hypotheses, and then the plumping for one or other of these hypotheses in the creation of a belief, strikes me as phenomenologically rather implausible. I don't, for example, think it happens to me very often. It is of course always open to the cognitivist to say: well, the hypothesis formation etc. are either descriptively unconscious personal-level phenomena or subpersonal phenomena. But then it is surely incumbent on them to provide the distinct criteria for such unconscious hypothesis formation - and it is my sense from the literature, and from this chapter, that this provision of criteria very rarely happens. (Options such as 'well, it's just like conscious hypothesis formation, only unconscious' ('an imaginary egg is just like a real egg except there's no egg there': thank you so much), or 'the model is an inference to the best explanation' (but we're wanting to understand what the explananda are so that we can then understand the explanation), or 'the causal relations between neurological states map onto the inferential relations in the model' (according to which mapping rules, and why those?) are, I suspect, unlikely to convince many of us these days.)

Davies do say that they are happy with the idea that a patient may simply endorse rather than explain their experiences, and they also say that, for them, 'seeing is believing'. But this, it turns out, is not to be taken literally. Rather, what we are offered is the suggestion that there is a 'prepotent doxastic response of treating a perceptual experience as veridical' (293). Perception and belief are still positioned as stages of a process with cognitive mechanisms intervening. (To clarify: it is not that we are being offered a rule of grammar which says that someone is simply to be said to believe what they see unless they are baulking. Rather we are offered an empirical proposition of the form: perception tends automatically to give rise to belief unless inhibitory mechanisms are working.) There is a 'processing stage that leads from experience to belief' (15.4.4). So: 'A delusion is a belief, but having a deficit or experience is not yet having a belief; it is not even having a hypothesis that could be adopted as a belief. A complete answer to the question will have to appeal to a processing stage that leads from deficit or experience to belief. This is the idea that the two-factor framework is also a three-stage framework' (290).

Again, in my experience I don't tend to treat my perceptual experiences as veridical, since I don't really treat them at all. I am instead, in my perceptual experience, open to parts of the world, (hopefully) taking in the facts. I am not, apart from in some of my more as-it-were schizoid moments, set back from the world inspecting the deliverances of perception and either admitting them or not. Perception, it seems to me, really is believing (when what we have to do with are perceptual beliefs!) - or better, it is usually knowing (which I guess isn't obviously to be taken as a species of, or as implying the presence of, belief).

Anyway, let's move back from the epistemology to the psychology. Within the clinical tradition the failure in 'reality contact' or 'reality testing' which is seen to be manifest in primary (core schizophrenic) delusion is not typically understood as a disturbance in belief formation. The concept of 'reality testing', one could say, is just not the concept of 'hypothesis testing'. Rather what has happened is that a 'part of the mind' has become 'autistic' (in Bleuler's rather than Kanner's sense) or 'psychotic'. In this lifeworld-retreated part of the mind the distinction between fantasy and reality, imagination and world-directed thought, has broken down. And delusion crystallises. Which crystallisation, to anticipate, is not explanation...

Davies consider cases in which psychotic experiences less than fully encode the content of the delusion (e.g. perhaps the experience is just a sense of unfamiliarity or diffuse threat). In these cases, we are told, 'the processing stage that leads from experience to belief must involve substantive explanatory processes of hypothesis generation and confirmation.' (290) From the standpoint of the clinical tradition I don't see why this must be the case. Let us imagine that someone shifts from a state of prodromal tréma to one of delusional stability. 'Something is going on, I don't know what' becomes 'The wardens are planning to irradiate the hospital'. On the cognitivist hypothesis what has happened is that the patient has made sense of why they felt the way they did. On the clinical hypothesis what has happened is that a more manageable belief has been substituted for a less manageable terror, and is maintained because of its powerful restorative function. Diffuse threat becomes focal and thereby thinkable. Thinkable terrors are more bearable, because we get some of our agency back. When Freud wrote that delusions were 'patches over rents in the ego', his writing was not, it seems to me, unnecessarily poetic. Instead he was talking about the restoring of the patient's self-identity, their recovery of their going-on-being, through their 'invention', by projection of threat into the environment, of the delusional belief. (The rub is that then you have to live with the persecuting irradiating wardens, but this is still better than falling apart.) Once again, what seems to me to be happening is akin to an insurgence of fantasy to bind over ruptures in the delusional subject's self-identity - and it is not at all obvious that this 'must involve substantive explanatory processes of hypothesis generation and confirmation.' Far from it, I want to suggest: it rather seems to me that the patient is playing a different game altogether.

What I've been calling the clinical tradition also has at least some resources for explaining some of the other moves of this different game - such as double bookkeeping. The cognitive tradition will unsurprisingly have little to say about this: if delusions are best understood as pathologies of belief formation and maintenance, then it is hard to see how the patient can both succeed and fail at these tasks at one and the same time. The clinical tradition, on the other hand, sees delusion as growing out of, and being sustained by, quite different soil from belief - as different as dream and unconscious phantasy are from our formation of beliefs (and even more so from hypothesis testing). I can after all have a fantasy about a horse and see a horse at the same time. The autistic enclave is not only governed by different rules, but is also a quite different regime.
