Neural Conspiracy Theories

Last month, a paper quietly appeared in The Journal of Neuroscience to little fanfare and scant media attention (with these exceptions). The study revolved around a clever and almost diabolical premise: that using perceptual trickery and outright deception, its authors could plant a delusion-like belief in the heads of healthy subjects. Before you call the ethics police, I should mention that the belief wasn’t a delusion in the formal sense of the word. It didn’t cause the subjects any distress and was limited to the unique materials used in the study. Still, it provided a model delusion that scientists Katharina Schmack, Philipp Sterzer, and colleagues could use to investigate the interplay of perception and belief in healthy subjects. The experiment is quite involved, so I’ll stick to the coolest and most relevant details.

As I mentioned in my last post, delusions are not exclusive to people suffering from psychosis. Many people who are free of any diagnosable mental illness still have a tendency to develop them, although the frequency and severity of these delusions differ across individuals. There are some good reasons to conduct studies like this one on healthy people rather than psychiatric patients. Healthy subjects are a heck of a lot easier to recruit, easier to work with, and less affected by confounding factors like medication and stress.

Schmack, Sterzer, and colleagues designed their experiment to test the idea that delusions arise from two distinct but related processes. First, a person experiences perceptual disturbances. According to the group’s model, these disturbances actually reflect poor expectation signals as the brain processes information from the senses. In theory, these poor signals would make irrelevant or commonplace sights, sounds, and sensations seem surprising and important. Without an explanation for this unexpected weirdness, the individual comes up with a delusion to make sense of it all. Once the delusion is in place, so-called higher areas of the brain (those that do more complex things like ponder, theorize, and believe) generate new expectation signals based on the delusion. These signals feed back on so-called lower sensory areas and actually bias the person’s perception of the outside world based on the delusion. According to the authors, this would explain why people become so convinced of their delusions: they are constantly perceiving confirmatory evidence. Strangely enough, this model sounds like a paranoid delusion in its own right. Various regions of your brain may be colluding to fool your senses into making you believe a lie!
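To make the logic of that model concrete, here is a toy sketch of my own (not anything from the paper): when the sensory evidence is genuinely ambiguous, a Bayesian observer's percept is driven almost entirely by the prior belief, which is exactly the feedback effect the authors propose. The numbers are made up purely for illustration.

```python
# Toy illustration (my own sketch, not the authors' model): how a prior
# belief can bias the interpretation of ambiguous sensory evidence.
# Bayes' rule: P(left | data) is proportional to P(data | left) * P(left).
likelihood_left, likelihood_right = 0.5, 0.5   # a truly ambiguous stimulus
for prior_left in (0.5, 0.6, 0.8):             # increasingly strong belief in "left"
    post_left = likelihood_left * prior_left
    post_right = likelihood_right * (1 - prior_left)
    post_left /= (post_left + post_right)      # normalize
    print(f"prior {prior_left:.1f} -> perceived 'left' probability {post_left:.2f}")
```

With perfectly ambiguous input, the posterior simply equals the prior: the stronger the belief, the more the "perception" confirms it.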

To test the idea, the experimenters first had to toy with their subjects’ senses. They did so by capitalizing on a quirk of the visual system: when people are shown two conflicting images, one to each eye, they don’t perceive both images at once. Instead, perception alternates between the two. In the first part of this experiment, the two images were movies of moving dots that appeared to form a 3-D sphere spinning either to the left (for one eye) or to the right (for the other). In this ambiguous visual condition, subjects were equally likely to see a sphere spinning to the right or to the left at any given moment, and the perceived direction switched periodically.

Now the experimenters went about planting the fake belief. They gave the subjects a pair of transparent glasses and told them that the lenses contained polarizing filters that would make the sphere appear to spin more in one of the two directions. In fact, the lenses were made of simple plastic and could do no such thing. Once the subjects had the glasses on, the experimenters began showing the same movie to both eyes. While this change allowed the scientists to control exactly what the subjects saw, the subjects had no idea that the visual setup had changed. In this unambiguous condition, all subjects saw a sphere that alternated direction (just as the ambiguous sphere had done), except that this sphere spun far more in one of the two directions. This visual trick, paired with the story about polarized lenses, was meant to make subjects believe that the glasses caused the change in perception.

After that clever setup, the scientists were ready to see how the model delusion would affect each subject’s actual perception. While the subjects continued to wear the glasses, they were again shown the two original, conflicting movies, one to each eye. In the first part of the experiment, this ambiguous condition had caused subjects to see a rotating sphere that alternated equally between spinning to the left and to the right. But if their new belief about the glasses biased their perception of the spinning sphere, they would now report seeing the sphere spin more often in the belief-consistent direction.

What happened? Subjects did see the sphere spin more in the belief-consistent direction. The effect was small, but it is impressive that a planted belief could bias perception at all, considering the simplicity of the images. The experimenters also found that each subject’s delusional conviction score (how convinced they were by their delusional thoughts in everyday life) correlated with this effect. The more a subject believed her real-life delusional thoughts, the more her belief about the glasses affected her perception of the ambiguous spinning sphere.
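For readers who like to see the logic of such an analysis spelled out, here is a minimal sketch of how one might quantify a belief-consistent bias and correlate it with conviction scores. Everything here is hypothetical and simulated; it is not the authors' actual analysis pipeline, and the variable names and numbers are my own.

```python
# Hypothetical sketch (not the authors' actual analysis): quantify each
# subject's belief-consistent perceptual bias and correlate it with their
# delusional conviction score.
import numpy as np
from scipy.stats import pearsonr

def belief_consistent_bias(reports, belief_direction):
    """Fraction of ambiguous trials reported in the belief-consistent
    direction, minus 0.5 (so 0 means no bias)."""
    reports = np.asarray(reports)
    return np.mean(reports == belief_direction) - 0.5

# Toy data: per-subject trial reports (+1 / -1 spin direction), the
# direction each subject was led to expect, and conviction scores.
rng = np.random.default_rng(0)
n_subjects = 20
conviction = rng.uniform(0, 100, n_subjects)      # questionnaire scores
expected_dir = rng.choice([-1, 1], n_subjects)    # "glasses" direction
biases = []
for s in range(n_subjects):
    # Simulate a weak bias that grows with conviction (illustration only).
    p = 0.5 + 0.002 * conviction[s]
    reports = rng.choice([expected_dir[s], -expected_dir[s]],
                         size=80, p=[p, 1 - p])
    biases.append(belief_consistent_bias(reports, expected_dir[s]))

r, p_val = pearsonr(conviction, biases)
print(f"correlation between conviction and bias: r = {r:.2f}, p = {p_val:.3f}")
```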

But there’s a hitch. What if subjects reported the motion bias because they thought it was what they were supposed to see, and not because they actually saw it? To answer this question, the experimenters recruited a new batch of participants and ran the experiment again while scanning their brains with fMRI.

Since the subjects’ task hinged on motion perception, Sterzer and colleagues first looked at the activity in a brain area called MT that processes visual motion. By analyzing the patterns of fMRI activity in this area, the scientists confirmed that subjects were accurately reporting the motion they perceived. That may sound far-fetched, but this kind of ‘mind reading’ with fMRI has been done quite successfully for basic visual properties like motion.
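If you're curious what this kind of decoding looks like in practice, here is a bare-bones sketch of the general technique (multivariate pattern analysis): train a classifier to tell left from right motion using voxel activity patterns. The data are simulated and this is not the authors' pipeline, just an illustration of the principle.

```python
# Minimal sketch of fMRI 'mind reading' (multivariate pattern analysis):
# decode perceived motion direction from simulated voxel activity patterns.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_voxels = 200, 50
labels = rng.choice([0, 1], n_trials)     # 0 = leftward, 1 = rightward motion
# Simulated MT voxel patterns: noise plus a small direction-dependent signal.
X = rng.normal(size=(n_trials, n_voxels))
X[labels == 1, :10] += 0.5                # weak signal in a handful of voxels

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, labels, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} (chance = 0.50)")
```

If the classifier's cross-validated accuracy on real MT data tracks what subjects report, that is good evidence the reports reflect genuine perception rather than compliance.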

The group also studied activity throughout the brain while their glasses-wearing subjects learned the false belief (unambiguous condition) and then let that belief bias their perception to varying degrees (ambiguous condition). They found that belief-based perceptual bias correlated with activity in the left orbitofrontal cortex, a region just behind the eyes that is involved in decision-making and expectation. In essence, subjects with more activity in this region during both conditions also tended to report lopsided spin directions that confirmed their expectations during the ambiguous condition. And here’s the cherry on top: subjects with higher delusional conviction scores appeared to have greater communication between the left orbitofrontal cortex and motion-processing area MT during the ambiguous visual condition. Although fMRI can’t directly measure communication between areas and can’t tell us the direction of communication, this pattern suggests that the left orbitofrontal cortex may be directly responsible for biasing motion perception in delusion-prone subjects.
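As a rough sketch of what "greater communication" means in fMRI terms: coupling between two regions is often estimated as the correlation of their activity time courses, which can then be compared across subjects. The following is a simulated, hypothetical illustration of that idea, not the authors' method.

```python
# Hypothetical sketch (not the authors' method): estimate OFC-MT coupling as
# the correlation of their fMRI time courses, then ask whether that coupling
# tracks delusional conviction across subjects. All data are simulated.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n_subjects, n_timepoints = 20, 300
conviction = rng.uniform(0, 100, n_subjects)

coupling = []
for s in range(n_subjects):
    shared = rng.normal(size=n_timepoints)
    w = conviction[s] / 100        # simulate stronger shared signal with conviction
    ofc = w * shared + rng.normal(size=n_timepoints)
    mt = w * shared + rng.normal(size=n_timepoints)
    coupling.append(pearsonr(ofc, mt)[0])

r, p = pearsonr(conviction, coupling)
print(f"conviction vs. OFC-MT coupling: r = {r:.2f}, p = {p:.3f}")
```

Note that such a correlation, like fMRI connectivity generally, says nothing about which region is driving which.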

All told, the results of the experiment seem to tell a neat story that fits the authors’ model of delusions. Yet there are a couple of caveats worth mentioning. First, the key finding of the study – that a person’s delusional conviction score correlates with his or her belief-based motion perception bias – is built upon a quirky and unnatural aspect of human vision that may or may not reflect more typical sensory processes. Second, it’s hard to say how clinically relevant the results are. No one knows for certain whether delusions arise by the same neural mechanisms in the general population as they do in patients with illnesses like schizophrenia. It has been argued that they probably do, because the same risk factors pop up for patients as for non-psychotic people with delusions: unemployment, social difficulties, urban surroundings, mood disturbances, and drug or alcohol abuse. Then again, this group is probably also at the highest risk for getting hit by a bus, dying from a curable disease, or suffering any number of misfortunes that disproportionately affect people in vulnerable circumstances. So the jury is still out on the clinical applicability of these results.

Despite the study’s limitations, it was brilliantly designed and tells a compelling tale about how the brain conspires to manipulate perception based on beliefs. It also implicates a culprit in this neural conspiracy. Dare I say ringleader? Mastermind? Somebody cue the close-up of the orbitofrontal cortex cackling and stroking a cat.

_____

Photo credit: Daniel Horacio Agostini (dhammza) on Flickr, used through Creative Commons license

Schmack K, Gómez-Carrillo de Castro A, Rothkirch M, Sekutowicz M, Rössler H, Haynes JD, Heinz A, Petrovic P, & Sterzer P (2013). Delusions and the role of beliefs in perceptual inference. The Journal of Neuroscience, 33(34), 13701-13712. PMID: 23966692

Delusions: Making Sense of Mistaken Senses


For a common affliction that strikes people of every culture and walk of life, schizophrenia has remained something of an enigma. Scientists talk about dopamine and glutamate, nicotinic receptors and hippocampal atrophy, but they’ve made little progress in explaining psychosis as it unfolds on the level of thoughts, beliefs, and experiences. Approximately one percent of the world’s population suffers from schizophrenia. Add to that the comparable numbers of people who suffer from affective psychoses (certain types of bipolar disorder and depression) or psychosis from neurodegenerative disorders like Alzheimer’s disease. All told, upwards of 3% of the population have known psychosis first-hand. These individuals have experienced how it transformed their sensations, emotions, and beliefs. Why hasn’t science made more progress explaining this level of the illness? What have those slouches at the National Institute of Mental Health been up to?

There are several reasons why psychosis has proved a tough nut to crack. First and foremost, neuroscience is still struggling to understand the biology of complex phenomena like thoughts and memories in the healthy brain. Add to that the incredible diversity of psychosis: how one psychotic patient might be silent and unresponsive while another is excitable and talking up a storm. Finally, a host of confounding factors plague most studies of psychosis. Let’s say a scientist discovers that a particular brain area tends to be smaller in patients with schizophrenia than healthy controls. The difference might have played a role in causing the illness in these patients, it might be a direct result of the illness, or it might be the result of anti-psychotic medications, chronic stress, substance abuse, poor nutrition, or other factors that disproportionately affect patients.

So what’s a well-meaning neuroscientist to do? One intriguing approach is to study psychosis in healthy people. They don’t have the litany of confounding experiences and exposures that make patients such problematic subjects. Yet at first glance, the approach seems to have a fatal flaw. How can you study psychosis in people who don’t have it? It sounds as crazy as studying malaria in someone who’s never had the bug.

In fact, this approach is possible because schizophrenia is a very different illness from malaria or HIV. Unlike communicable diseases, it is a developmental illness triggered by both genetic and environmental factors. These factors affect us all to varying degrees and cause all of us – clinically psychotic or not – to land somewhere on a spectrum of psychotic traits. Just as people who don’t suffer from anxiety disorders can still differ in their tendency to be anxious, nonpsychotic individuals can differ in their tendency to develop delusions or have perceptual disturbances. One review estimates that 1 to 3% of nonpsychotic people harbor major delusional beliefs, while another 5 to 6% have less severe delusions. An additional 10 to 15% of the general population may experience milder delusional thoughts on a regular basis.

Delusions are a common symptom of schizophrenia and were once thought to reflect the poor reasoning abilities of a broken brain. More recently, a growing number of physicians and scientists have opted for a different explanation. According to this model, patients first experience the surprising and mysterious perceptual disturbances that result from their illness. These could be full-blown hallucinations or they could be subtler abnormalities, like the inability to ignore a persistent noise. Patients then adopt delusions in a natural (if misguided) attempt to explain their odd experiences.

An intriguing study from the early 1960s illustrates how rapidly delusions can develop in healthy subjects when expectations and perceptions inexplicably conflict. The study, run on twenty college students at the University of Copenhagen, involved a version of the trick now known as the rubber hand illusion. Each subject was instructed to trace a straight line while his or her hand was inside a box with a secret mirror. For several trials, the subject watched his or her own hand trace the line correctly. Then the experimenters surreptitiously changed the mirror position so that the subject was now watching someone else’s hand trace the straight line – until the sham hand unexpectedly veered off to the right! All of the subjects experienced the visible (sham) hand as their own and felt that an involuntary movement had sent it off course. After several trials with this misbehaving hand, the subjects offered explanations for the deviation. Some chalked it up to their own fatigue or inattention while others came up with wilder, tech-based explanations:

 . . . five subjects described that they felt something strange and queer outside themselves, which pressed their hand to the right or resisted their free mobility. They suggested that ‘magnets’, ‘unidentified forces’, ‘invisible traces under the paper’, or the like, could be the cause.

In other words, delusions may be a normal reaction to the unexpected and inexplicable. Under strange enough circumstances, anyone might develop them – but some of us are more likely to than others.

My next post will describe a clever experiment that planted a delusion-like belief in the heads of healthy subjects and used trickery and fMRI to see how it influenced some more than others. So stay tuned. In the meantime, you may want to ask yourself which members of your family and friends are prone to delusional thinking. Or ask yourself honestly: could it be you?

_______

Photo credit: MiniTar on Flickr, available through Creative Commons
