Delusions: Making Sense of Mistaken Senses


For a common affliction that strikes people of every culture and walk of life, schizophrenia has remained something of an enigma. Scientists talk about dopamine and glutamate, nicotinic receptors and hippocampal atrophy, but they’ve made little progress in explaining psychosis as it unfolds on the level of thoughts, beliefs, and experiences. Approximately one percent of the world’s population suffers from schizophrenia. Add to that the comparable numbers of people who suffer from affective psychoses (certain types of bipolar disorder and depression) or psychosis from neurodegenerative disorders like Alzheimer’s disease. All told, upwards of 3% of the population have known psychosis first-hand. These individuals have experienced how it transformed their sensations, emotions, and beliefs. Why hasn’t science made more progress explaining this level of the illness? What have those slouches at the National Institute of Mental Health been up to?

There are several reasons why psychosis has proved a tough nut to crack. First and foremost, neuroscience is still struggling to understand the biology of complex phenomena like thoughts and memories in the healthy brain. Add to that the incredible diversity of psychosis: how one psychotic patient might be silent and unresponsive while another is excitable and talking up a storm. Finally, a host of confounding factors plague most studies of psychosis. Let’s say a scientist discovers that a particular brain area tends to be smaller in patients with schizophrenia than healthy controls. The difference might have played a role in causing the illness in these patients, it might be a direct result of the illness, or it might be the result of anti-psychotic medications, chronic stress, substance abuse, poor nutrition, or other factors that disproportionately affect patients.

So what’s a well-meaning neuroscientist to do? One intriguing approach is to study psychosis in healthy people. They don’t have the litany of confounding experiences and exposures that make patients such problematic subjects. Yet at first glance, the approach seems to have a fatal flaw. How can you study psychosis in people who don’t have it? It sounds as crazy as studying malaria in someone who’s never had the bug.

In fact, this approach is possible because schizophrenia is a very different illness from malaria or HIV. Unlike communicable diseases, it is a developmental illness triggered by both genetic and environmental factors. These factors affect us all to varying degrees and cause all of us – clinically psychotic or not – to land somewhere on a spectrum of psychotic traits. Just as people who don’t suffer from anxiety disorders can still differ in their tendency to be anxious, nonpsychotic individuals can differ in their tendency to develop delusions or have perceptual disturbances. One review estimates that 1 to 3% of nonpsychotic people harbor major delusional beliefs, while another 5 to 6% have less severe delusions. An additional 10 to 15% of the general population may experience milder delusional thoughts on a regular basis.

Delusions are a common symptom of schizophrenia and were once thought to reflect the poor reasoning abilities of a broken brain. More recently, a growing number of physicians and scientists have opted for a different explanation. According to this model, patients first experience the surprising and mysterious perceptual disturbances that result from their illness. These could be full-blown hallucinations or they could be subtler abnormalities, like the inability to ignore a persistent noise. Patients then adopt delusions in a natural (if misguided) attempt to explain their odd experiences.

An intriguing study from the early 1960s illustrates how rapidly delusions can develop in healthy subjects when expectations and perceptions inexplicably conflict. The study, run on twenty college students at the University of Copenhagen, involved a version of the trick now known as the rubber hand illusion. Each subject was instructed to trace a straight line while his or her hand was inside a box with a secret mirror. For several trials, the subject watched his or her own hand trace the line correctly. Then the experimenters surreptitiously changed the mirror position so that the subject was now watching someone else’s hand trace the straight line – until the sham hand unexpectedly veered off to the right! All of the subjects experienced the visible (sham) hand as their own and felt that an involuntary movement had sent it off course. After several trials with this misbehaving hand, the subjects offered explanations for the deviation. Some chalked it up to their own fatigue or inattention while others came up with wilder, tech-based explanations:

 . . . five subjects described that they felt something strange and queer outside themselves, which pressed their hand to the right or resisted their free mobility. They suggested that ‘magnets’, ‘unidentified forces’, ‘invisible traces under the paper’, or the like, could be the cause.

In other words, delusions may be a normal reaction to the unexpected and inexplicable. Under strange enough circumstances, anyone might develop them – but some of us are more prone to them than others.

My next post will describe a clever experiment that planted a delusion-like belief in the heads of healthy subjects and used trickery and fMRI to see how it influenced some more than others. So stay tuned. In the meantime, you may want to ask yourself which members of your family and friends are prone to delusional thinking. Or ask yourself honestly: could it be you?

_______

Photo credit: MiniTar on Flickr, available through Creative Commons

Modernity, Madness, and the History of Neuroscience


I recently read a wonderful piece in Aeon Magazine about how technology shapes psychotic delusions. As the author, Mike Jay, explains:

Persecutory delusions, for example, can be found throughout history and across cultures; but within this category a desert nomad is more likely to believe that he is being buried alive in sand by a djinn, and an urban American that he has been implanted with a microchip and is being monitored by the CIA.

While delusional people of the past may have fretted over spirits, witches, demons and ghouls, today they often worry about wireless signals controlling their minds or hidden cameras recording their lives for a reality TV show. Indeed, reality TV is ubiquitous in our culture and experiments in remote mind-control (albeit on a limited scale) have been popping up recently in the news. As psychiatrist Joel Gold of NYU and philosopher Ian Gold of McGill University wrote in 2012: “For an illness that is often characterized as a break with reality, psychosis keeps remarkably up to date.”

Whatever the time or the place, new technologies are pervasive and salient. They are on the tips of our tongues and, eventually, at the tips of our fingers. Psychotic or not, we are all captivated by technological advances. They provide us with new analogies and new ways of explaining the all-but-unexplainable. And where else do we attempt to explain the mysteries of the world, if not through science?

As I read Jay’s piece on psychosis, it struck me that science has historically had the same habit of co-opting modern technologies for explanatory purposes. In the case of neuroscience, scientists and physicians across cultures and ages have invoked the innovations of their day to explain the mind’s mysteries. For instance, the science of antiquity was rooted in the physical properties of matter and the mechanical interactions between them. Around the 7th century BC, empires began constructing great aqueducts to bring water to their growing cities. The engineering challenge of the day was to control and guide the flow of water across long distances. It was in this scientific milieu that the ancient Greeks devised a model for the workings of the mind. They believed that a person’s thoughts, feelings, intellect and soul were physical stuff: specifically, an invisible, weightless fluid called psychic pneuma. Around 200 AD, Galen, a physician and scientist of the Roman Empire (itself known for its masterful aqueducts), revised and clarified the theory. Galen believed that pneuma filled the brain cavities called ventricles and circulated through white matter pathways in the brain and nerves in the body just as water flows through a tube. As psychic pneuma traveled throughout the body, it carried sensation and movement to the extremities. Although the idea may sound farfetched to us today, this model of the brain persisted for more than a millennium and influenced Renaissance thinkers including Descartes.

By the 18th century, however, the science world was abuzz with two strange new forces: electricity and magnetism. At the same time, physicians and anatomists began to think of the brain itself as the stuff that gives rise to thought and feeling, rather than a maze of vats and tunnels that move fluid around. In the 1790s, Luigi Galvani’s experiments zapping frog legs showed that nerves communicate with muscles using electricity. So in the 19th century, just as inventors were harnessing electricity to run motors and light up the darkness, scientists reconceived the brain as an organ of electricity. It was a wise innovation and one supported by experiments, but also driven by the technical advances of the day.

Science was revolutionized once again with the advent of modern computers in the 1940s and ‘50s. The new technology soon sparked a surge of research and theories that used the computer as an analogy for the brain. Psychologists began to treat mental events like computer processes, which can be broken up and analyzed as a set of discrete steps. They equated brain areas to processors and neural activity in these areas to the computations carried out by computers. Just as computers rule our modern technological world, this way of thinking about the brain still profoundly influences how neuroscience and psychology research is carried out and interpreted. Today, some labs cut out the middleman (the brain) entirely: results from computer models of the brain are regularly published in neuroscience journals, sometimes without any data from an actual physical brain.

I’m sure there are other examples from the history of neuroscience in general and certainly from the history of science as a whole. Please comment and share any other ways that technology has shaped the models, themes, and analogies of science!

Additional sources:

Crivellato E & Ribatti D (2007) Soul, mind, brain: Greek philosophy and the birth of neuroscience. Brain Research Bulletin 71:327-336.

Karenberg A (2009) Cerebral Localization in the Eighteenth Century – An Overview. Journal of the History of the Neurosciences, 18:248-253.

_________

Photo Credit: dominiqueb on Flickr, available through Creative Commons

We Got the Beat


It is both amusing and enlightening to hear my 21-month-old daughter sing the alphabet song. The song is her favorite, though she is years from grasping how symbols represent sound, not to mention the concept of alphabetical order. Still, if you start singing the song she will chime in. Before you think that’s impressive, keep in mind that her version of the song is more or less this: “CD . . . G . . . I . . . No P . . . S . . . V . . . Dub X . . . Z.”

Her alphabet song adds up to little more than a Scrabble hand, yet it is a surprising feat of memory all the same. My daughter doesn’t know her last name, can’t read or write, and has been known to mistake stickers for food. It turns out that her memory for the alphabet has far less to do with letters than lyrics. From Wheels on the Bus to Don’t Stop Believin’, she sings along to all of her favorite songs, piping up with every word and vowel she remembers. Her performance has nothing to do with comprehension; she has never seen or heard about a locker, yet she sings the word at just the right time in her rendition of the Glee song Loser like Me. (Go ahead and judge me. I judge myself.)

My daughter’s knack for learning lyrics is not unique to her or to toddlers in general. Adults are also far better at remembering words set to song than other strings of verbal material. That’s why college students have used music to memorize subjects from human anatomy to U.S. presidents. It’s why advertisers inundate you with catchy snippets of song. Who can forget a good jingle? To this day, I remember the phone number for a carpet company I saw advertised decades ago.

But what is it about music that helps us remember? And how does it work?

It turns out that rhythm, rather than melody, is the crucial component in remembering lyrics. In a 2008 study, subjects remembered unfamiliar lyrics far better if they heard them sung to a familiar melody (Scarborough Fair) than if they heard them sung to an unfamiliar song or merely spoken without music. But they remembered the lyrics better still if they heard the lines spoken to a rhythmic drummed arrangement of Scarborough Fair. Even an unfamiliar drummed rhythm boosted later memory for the words. But why should any of these conditions improve memory? According to the prevailing theory, lyrics have a structural framework that helps you learn and recall them. They are set to a particular melody through a process called textsetting, which matches the natural beat and meter of the music and words. Composers, lyricists, and musicians do this by aligning the stressed syllables of words with strong beats in the music as much as possible. Music is also made up of musical phrases; lyrics naturally break down into lines, or “chunks,” based on these phrase boundaries. And just in case you missed those boundaries, lyricists often emphasize the ends of these lines with a rhyming scheme.

Rhythm, along with rhyme and chunking, may be enough to explain the human knack for learning lyrics. Let’s say you begin singing that old classic, Twinkle, Twinkle, Little Star. You make it to “How I wonder,” but what’s next? Since the meter of the song is BUM bah BUM bah and you ended on bah, you know that the next words must have the stress pattern BUM bah. This helps limit your mental search for these words. (Oh yeah: WHAT you!) The final word in the line is a breeze, as it has to rhyme with “star.” And there you have it. Rhythm, along with rhyme and chunking, provides a sturdy scaffold for your memory of words.

For a more personal example of rhythm and memory, consider your own experience when you remember the alphabet. It’s worth noting that the alphabet song is set to a familiar melody (the same as Twinkle, Twinkle, Little Star and Baa, Baa, Black Sheep), a fact that surely helped you learn the alphabet lyrics in the first place. Now that you know them, ask yourself this: which comes first, the letter O or L? If you’re like me, you have to mentally run through the first half of the song to figure it out. Yet this mental rendition lacks a melody. Instead, you list the letters according to the song’s rhythm. Your list probably pauses after G and again after P and V, which each mark the end of a line in the song. The letters L, M, N, and O each last half as long as the average letter, while S sprawls out across twice the average. Centuries ago, a musician managed to squeeze the letters of the alphabet into the rhythm of an old French folk song. Today, the idiosyncratic pairing he devised remains alive – not just in kindergarten classrooms, but in the recesses of your brain. Its longevity, across generations and across the lifespan, illustrates how word and beat can be entwined in human memory.

While a rhythm-and-rhyme framework could explain the human aptitude for learning lyrics, there may be more to the story. As a 2011 study published in the Journal of Neuroscience shows, beat and meter have special representations in the brain. Participants in the study listened to a pure tone with brief beats at a rate of 144 per minute, or 2.4 Hz. Some of the participants were told to imagine one of two meters on top of the beat: either a binary meter (a march: BUM bah BUM bah BUM) or a ternary meter (a waltz: BUM bah bah BUM bah bah BUM). These meters divided the interval between beats into two or three evenly spaced intervals, respectively. A third group performed a control task that ensured subjects were paying attention to the sound without imagining a meter. All the while, the scientists recorded traces of neural activity that could be detected at the scalp with EEG.

The results were remarkable. Brain waves synchronized with the audible beat and with the imagined meters. This figure from the paper shows the combined and averaged data from the three experimental groups. The subjects in the control group (blue) heard the beat without imagining a meter; their EEGs showed strong brain waves at the frequency of the beat, 2.4 Hz. Both the march (red) and waltz (green) groups showed this 2.4 Hz rhythm plus increased brain waves at the frequency of their imagined meters (1.2 Hz and 0.8 Hz, respectively). The waltz group also showed another small peak of waves at 1.6 Hz, or twice the frequency of their imagined meter, a curiosity that may have as much to do with the mechanics of brain waves as the perception of meter and beat.
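To make the frequency-tagging logic concrete, here is a minimal Python sketch. It is not the authors’ analysis pipeline; the sampling rate, duration, amplitudes, and noise level are all invented for illustration. It simulates a signal that entrains to the 2.4 Hz beat plus an imagined binary meter at 1.2 Hz, then shows that a simple Fourier spectrum recovers peaks at those frequencies:

```python
import numpy as np

fs = 256                          # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)      # 60 s of simulated scalp signal

beat_hz = 144 / 60                # 144 beats per minute = 2.4 Hz
meter_hz = beat_hz / 2            # imagined binary meter (a march) = 1.2 Hz

rng = np.random.default_rng(0)
eeg = (np.sin(2 * np.pi * beat_hz * t)            # entrainment to the audible beat
       + 0.5 * np.sin(2 * np.pi * meter_hz * t)   # weaker entrainment to the imagined meter
       + 0.3 * rng.standard_normal(t.size))       # background noise

freqs = np.fft.rfftfreq(t.size, 1 / fs)           # bins spaced 1/60 Hz apart
amps = np.abs(np.fft.rfft(eeg)) / t.size          # amplitude spectrum

# The two largest spectral peaks below 5 Hz should sit at the
# meter and beat frequencies despite the noise.
low = freqs < 5
top = np.sort(freqs[low][np.argsort(amps[low])[-2:]])
print(top)                        # peaks near 1.2 Hz and 2.4 Hz
```

With a 60-second window the frequency resolution is 1/60 Hz, so 0.8, 1.2, and 2.4 Hz each land on exact FFT bins, which is one reason slow, steady stimulation rates suit this method.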

In essence, these results show that beat and meter have a profound effect on the brain. They alter the waves of activity that are constantly circulating through your brain, but more remarkably, they do so in a way that syncs activity with sound (be it real or imagined). This phenomenon, called neural entrainment, may help you perceive rhythm by making you more receptive to sounds at the very moment when the next beat is due. It can also be a powerful tool for learning and memory. So far, only one group has tried to link brain waves to the benefits of learning words with music. Their papers have been flawed and inconclusive. Hopefully some intrepid scientist will forge ahead with this line of research. Until then, stay tuned. (Or should I say metered?)

Whatever the ultimate explanation, the cozy relationship between rhythm and memory may have left its mark on our cultural inheritance. Poetry predated the written word and once served the purpose of conveying epic tales across distances and generations. Singer-poets had to memorize a harrowing amount of verbal material. (Just imagine: the Iliad and Odyssey began as oral recitations and were only written down centuries later.) Scholars think poetic conventions like meter and rhyme arose out of necessity; how else could a person remember hours of text? The conventions persisted in poetry, song, and theater even after the written word became more widespread. No one can say why. But whatever the reason, Shakespeare’s actors would have learned their lines more quickly because of his clever rhymes and iambic pentameter. Mozart’s opera stars would have learned their libretti more easily because of his remarkable music. And centuries later you can sing along to Cyndi Lauper or locate Fifty Shades of Grey in the library stacks – all thanks to the rhythms of music and speech.

__________

Photo credits: David Martyn Hunt on Flickr and Nozaradan, Peretz, Missal & Mouraux via The Journal of Neuroscience

Nozaradan, S., Peretz, I., Missal, M., & Mouraux, A. (2011). Tagging the Neuronal Entrainment to Beat and Meter. The Journal of Neuroscience. DOI: 10.1523/JNEUROSCI.0411-11.2011

Memory: Up in Smoke?

I recently joined a memory lab at Wayne State University. The timing seems fitting, as I’ve been doing a little memory experiment of my own of late. My father died ten years ago today and I’ve found myself wondering how my memory of him has fared over the decade. Which parts of him do I remember and which have I lost? They say we live on after we die, if nowhere else than in the memories of those we leave behind. Is it true, or does my father die a little each day as my brain cells age and adjust the strengths of their tiny connections?

I do, at least, remember how my father looked. Certain small details stick out in my memory – the wart beside his nose, his dulled gold wedding band beside a broad, flat knuckle, the remarkable definition of his calf muscles (thanks to his marathon bike rides). I can still see how he brushed his hair back from his face and how he crossed his legs – ankle to knee – and mopped up his sweat with a paper towel after a long ride. But are those the memories that matter? Do I remember how it felt to hug him? Do I remember all of the stories from his youth or any particular instance (of the many) when he said that he loved me? Not really. Not well enough to save him from oblivion.

I imagine I’m not the first person to experience the guilt of forgetting.

Unfortunately, memory loss picks up speed with the passage of time and the brain changes associated with old age. We will only ever have more to feel guilty about. But sometimes, on rare and bittersweet occasions, a chance encounter can trigger a memory we didn’t know we had. It is the psychological equivalent of finding coins wedged between the cushions of the couch, and it happened to me a couple of years back.

I was walking home from work when I smelled something. It was an odor I couldn’t identify, one that didn’t seem familiar, and yet it filled me with a sense of well-being. I stopped walking and inhaled deeply through my nose. What on earth was this compound? I spotted a man walking half a block ahead of me. He was a professor type with long white hair, a briefcase, and a trail of smoke fanning out behind him. The smell had to be coming from him, yet it was nothing like cigarette smoke.

I started walking again and then picked up the pace to get closer to the man. I’m not proud to say it, but I started to follow him, inhaling as I went. When he turned a corner I caught him in profile and saw that he was smoking a pipe. The intriguing smell was that of pipe smoke. For a moment I was confused. I didn’t recall having ever smelled someone smoking a pipe before and I find both cigar and cigarette smoke aversive.

Then I remembered hearing stories about my dad’s pipe. A professor type himself, my father smoked a pipe for many years and only gave up the habit after a triple bypass surgery. I was three years old at the time. Thanks to childhood amnesia, I don’t remember seeing or smelling my father with his pipe. Yet the memory of that smell, and the comfort I once associated it with, have been buried in my brain all these years like lost coins.

In theory, the memory isn’t a positive one. The secondhand smoke my brother and I inhaled early in life may have had something to do with the asthma we developed later in childhood. Still, my reaction to that stranger’s pipe smoke feels positive. Precious, even. I’d like to think it reflects how I felt in those early years when I sat in my father’s lap or wrapped my fingers around those broad, flat knuckles. Contented and safe. And as a mother, I’d like to think that I’m planting the same warm feelings in my young daughter. Maybe someday after I’m gone an association will unearth them and she can revisit that innocent comfort all over again.


Even after I solved the mystery of the scent I followed the smoking stranger for a couple more blocks, inhaling and even closing my eyes as I experienced something of my father that I never knew I knew. It was hard to turn back for home. I didn’t want to lose him quite yet. I wasn’t ready. But then again no one ever is.

___

Photo credits: Sally Frye Schwarzlose

My Body or Yours?


Today we’re talking bodies. Not how they look in skinny jeans or whether they can win a Tour de France without steroids. We’re talking about how it feels to have a body of your own, one that is (or seems to be) conveniently connected to your head and neck.

I’ve written about body ownership before in the context of pregnancy. Although I focused on how I dreamt of my body during sleep, I also mentioned that my ballooning physical dimensions affected my coordination. I’d bump into countertops or doorways with my big belly and sometimes struggled to locate my center of gravity. Yet as strange as my new body was, it always felt like it belonged to me. This was an enormous blessing, of course, but it’s somewhat surprising as well. After all, before my pregnancy I’d lived with the same body since puberty. After more than a decade and a half of experience with that body, I suddenly had to adjust to my new body in a matter of months. Or rather days, because that new body kept growing larger still. Although my belly would feel surreal at times, overall I had remarkably little trouble adjusting to my metamorphosis. The body was still mine in all its lumpy glory.

I was reminded of this experience recently when I came across a scientific paper about body swapping. I know it sounds as if the only science in something called body swapping must come from science fiction. Actually, body swapping is a remarkable perceptual illusion that requires nothing more than a second person, a set of head-mounted cameras, and a set of head-mounted displays. Someone facing you wears the cameras mounted on a helmet and you wear the visual displays (which are presented to your two eyes like goggles as part of a virtual reality-style headset). The camera footage, filmed from the visual perspective of the second person, is fed directly into your visual display. Thus, you see your own body from the second person’s perspective.

But we haven’t made it to Freaky Friday just yet. The illusion requires something more. You and the other person take each other’s hands and begin squeezing them simultaneously. Nothing fancy. But in the words of the write up by Valeria Petkova and Henrik Ehrsson, this simple setup alone “. . . evoked a vivid illusion that the experimenter’s arm was the participant’s own arm and that the participants could sense their entire body just behind this arm. Most remarkably, the participants’ sensations of the tactile and muscular stimulation elicited by the squeezing of the hands seemed to originate from the experimenter’s hand, and not from their own clearly visible hand.”

So after a lifetime in your own body, it only takes a video feed and a few hand squeezes for you to make yourself at home in someone else’s arms and legs. If this setup sounds familiar, it is a more impressive incarnation of the classic rubber hand illusion. And a new and remarkable twist on the illusion just appeared in the news: scientists in the same lab have made people feel as if they have an invisible hand. (For a great discussion of this new illusion, read this.)

In science, we tend to think about human perception in general and illusions in particular in terms of adaptations and optimizations. Lots of visual illusions are based on the statistical probability of objects and events in our environment. Our brains learn to predict and extrapolate information about our surroundings by jumping to the likeliest conclusions. In this way illusions, while technically errors, often reveal clever shortcuts our brain takes to help us understand or parse our surroundings faster, better, or at less of an energy cost.

But what about the body swap? Since we never actually swap bodies, why should we mentally be able to do it? What’s the advantage? Well, the advantage seems to come down to the very fact that we never actually swap bodies. In our ever-changing world, a rare given is that you will have the same body tomorrow that you had today and yesterday. So why should your brain waste precious time or energy soliciting proof from every finger and toe, curve and joint, flex and bend? Take a smidge of visual evidence (in this case, the video display) and a dab of tactile confirmation (hand squeezing) and you have a recipe for body ownership. How often in the natural world would this recipe ever lead you astray?

So in essence you only think that you feel that you own your body. In truth, your brain is creating that sensation on the fly all the time. You could think of it as a philosophical conundrum or cause for an existential crisis. I prefer to think of it as good news for pregnant ladies everywhere.

_____

Photo credit: Elizabeth Tan


Petkova, V.I., & Ehrsson, H.H. (2008). If I Were You: Perceptual Illusion of Body Swapping. PLoS ONE. DOI: 10.1371/journal.pone.0003832

The End of History

I just read a wonderful little article about how we think about ourselves. The paper, which came out in January, opens with a tantalizing paragraph that I simply have to share:

“At every stage of life, people make decisions that profoundly influence the lives of the people they will become—and when they finally become those people, they aren’t always thrilled about it. Young adults pay to remove the tattoos that teenagers paid to get, middle-aged adults rush to divorce the people whom young adults rushed to marry, and older adults visit health spas to lose what middle-aged adults visited restaurants to gain. Why do people so often make decisions that their future selves regret?”

To answer this question, the study’s authors recruited nearly 20,000 participants from the website of “a popular television show.” (I personally think they should have told us which one. I’d imagine there are differences between the people who flock to the websites for Oprah, The Nightly News, or, say, Jersey Shore.)

The study subjects ranged in age from 18 to 68. For the experiment, they had to fill out an online questionnaire about their current personality, core values, or personal preferences (such as favorite food). Half of the subjects—those in the reporter group—were then asked to report how they would have filled out the questionnaire ten years prior, while the other half—those in the predictor group—were asked to predict how they would fill it out ten years hence. For each subject, the authors computed the difference between the subject’s responses for his current self and those for his reported past self or predicted future self. And here’s the clever part: they could compare participants across ages. For example, they could compare how an 18-year-old’s prediction of his 28-year-old future self differed from a 28-year-old’s report of his 18-year-old self. It sounds crazy, but they did some great follow-up studies to make sure the comparison was valid.
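The cross-age comparison can be sketched with a toy example. The numbers below are invented purely for illustration; the study derived change scores from questionnaire responses, not from these values:

```python
# Hypothetical change scores: the difference between answers for the
# current self and for the past/future self on the same questionnaire.
predicted_change = {18: 0.9, 28: 0.7, 38: 0.5}   # predictors: next ten years
reported_change = {28: 1.6, 38: 1.2, 48: 0.9}    # reporters: past ten years

# Match each predictor cohort with the reporter cohort one decade older,
# so both groups describe the same stretch of life.
for age, forecast in predicted_change.items():
    hindsight = reported_change[age + 10]
    print(f"ages {age}-{age + 10}: predicted {forecast}, reported {hindsight}")
```

In the actual data, reported change exceeded predicted change at every matched decade, which is the signature of the End of History Illusion.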

The results show a remarkable pattern. People believe that they have changed considerably in the past, even while they expect to change little in the future. And while they tend to be pretty accurate in their assessments of how much they’ve changed in years past, they grossly underestimate how much they will change in the coming years. The authors call this effect the End of History Illusion. And it’s not just found in shortsighted teenagers or twenty-somethings. While the study showed that older people do change less than younger people, they still underestimate how much they will continue to change in the decade to come.

The End of History Illusion is interesting in its own right. Why are we so illogical when reasoning about ourselves – and particularly, our own minds? We all understand that we will change physically as we age, both in how well our bodies function and how they look to others. Yet we deny the continued evolution (or devolution) of our traits, values, and preferences. We live each day as though we have finally achieved our ultimate selves. It is, in some ways, a depressing outlook. As much as we may like ourselves now, wouldn’t it be more heartening to believe that we will keep growing and improving as human beings?

The End of History Illusion also comes with a cost. We are constantly making flawed decisions for our future selves. As the paper’s opening paragraph illustrated, we take actions today under the assumption that our future desires and needs won’t change. In a follow-up study, the authors even demonstrated this effect by showing that people would pay an average of $129 now to see their current favorite band perform in ten years, while they would pay only an average of $80 to see the band that was their favorite ten years ago perform today. Here, the illusion only costs us money. In real life, it could cost us our health, our families, our future well-being.

This study reminded me of a book I read a while back called Stumbling on Happiness (written, it turns out, by the second author on this paper). The book’s central thesis is that we are bad at predicting what will make us happy, and the whole thing is written in the delightful style of this paper’s opening paragraph. For those of you with the time, it’s worth a read. For those of you without time, I can only hope you’ll have more time in the future. With any luck we’ll all have more – more insight, more compassion, more happiness – in the decade to come.

____

Photo credit: Darla Hueske

ResearchBlogging.org

Quoidbach J, Gilbert DT, & Wilson TD (2013). The End of History Illusion. Science, 339 (6115), 96-98. DOI: 10.1126/science.1229294

Feeling Invisible Light

In my last post, I wrote about whether we can imagine experiencing a sense that we don’t possess (such as a trout’s sense of magnetic fields). Since then, a study has come out that adds a new twist to our little thought experiment. And for that we can thank six trailblazing rats in North Carolina.

Like us, rats see only a sliver of the full electromagnetic spectrum. They can perceive red light with wavelengths as long as about 650 nanometers, but radiation with longer wavelengths (known as infrared, or IR, radiation) is invisible to them. Or it was before a group of researchers at Duke began their experiment. They first trained the rats to indicate with a nose poke where they saw a visible light turned on. Then the researchers mounted an IR detector to each rat’s head and surgically implanted tiny electrodes into the part of its brain that processes tactile sensations from its whiskers.

After these sci-fi surgeries, each rat was trained to do the same light detection task again – only this time it had to detect infrared instead of visible light. Whenever the IR detectors on the animal’s head picked up IR radiation, the electrodes stimulated the tactile whisker-responsive area of its brain. So while the rat’s eyes could not detect the IR lights, a part of its brain was still receiving information about them.
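The coupling between detector and electrodes amounts to a simple sensory substitution rule: translate IR intensity into a pattern of cortical microstimulation. Here is a toy sketch of that mapping; the function name, threshold, and rate ceiling are all hypothetical, not values from the study:

```python
def stimulation_rate(ir_reading, threshold=0.1, max_rate_hz=400.0):
    """Map a normalized IR detector reading (0 to 1) to a microstimulation
    pulse rate for electrodes in the whisker-responsive cortex.
    Below threshold there is no stimulation; above it, the pulse rate
    scales linearly with IR intensity."""
    if ir_reading < threshold:
        return 0.0
    return max_rate_hz * (ir_reading - threshold) / (1.0 - threshold)

# A faint IR signal produces no stimulation; a strong one drives
# the electrodes at the maximum pulse rate.
print(stimulation_rate(0.05), stimulation_rate(1.0))  # → 0.0 400.0
```

Scaling pulse rate with intensity matters for the behavior: it gives the rat a graded signal that grows as it turns toward the IR source, which is what makes the head-sweeping search strategy possible.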

Could they do the new task? Not very well at first. But within a month, these adult rats learned to do the IR detection task quite well. They even developed new strategies to accomplish it; as these videos show, they learned to sweep their heads back and forth to detect and localize the infrared sources.

Overall, this study shows us that the adult brain is capable of acquiring a new or expanded sense. But it doesn’t tell us how the rats experienced this new sense. Two details from the study suggest that the rats experienced IR radiation as a tactile sensation. First, the post-surgical rats scratched at their faces when first exposed to IR radiation, just as they might if they initially interpreted the IR-related brain activity as something brushing against their whiskers. Second, when the scientists studied the activity of the touch neurons receiving IR-linked stimulation after extensive IR training, they found that the majority responded to both touch and infrared light. At least to some degree, the senses of touch and of infrared vision were integrated within the individual neurons themselves.

In my last post, I found that I was only able to imagine magnetosensation by analogy to my sense of touch. Using some fancy technology, the scientists at Duke were able to turn this exercise in imagination into a reality. The rats were truly able to experience a new sense by piggybacking on an existing sense. The findings demonstrate the remarkable plasticity of the adult brain – a comforting thought as we all barrel toward our later years – but they also provide us with a glimpse of future possibilities. Someday we might be able to follow up on our thought experiment with an actual experiment. With a little brain surgery, we may someday be able to ‘see’ infrared or ultraviolet light. Or we might just hook ourselves up to a magnificent compass and have a taste (or feel or smell or sight or sound) of magnetosensation after all.

____

Photo credit: Novartis AG

ResearchBlogging.org

Thomson EE, Carra R, & Nicolelis MA (2013). Perceiving invisible light through a somatosensory cortical prosthesis. Nature Communications, 4. PMID: 23403583
