My Body or Yours?


Today we’re talking bodies. Not how they look in skinny jeans or whether they can win a Tour de France without steroids. We’re talking about how it feels to have a body of your own, one that is (or seems to be) conveniently connected to your head and neck.

I’ve written about body ownership before in the context of pregnancy. Although I focused on how I dreamt of my body during sleep, I also mentioned that my ballooning physical dimensions affected my coordination. I’d bump into countertops or doorways with my big belly and sometimes struggled to locate my center of gravity. Yet as strange as my new body was, it always felt like it belonged to me. This was an enormous blessing, of course, but it’s somewhat surprising as well. After all, before my pregnancy I’d lived with the same body since puberty. After more than a decade and a half of experience with that body, I suddenly had to adjust to my new body in a matter of months. Or rather days, because that new body kept growing larger still. Although my belly would feel surreal at times, overall I had remarkably little trouble adjusting to my metamorphosis. The body was still mine in all its lumpy glory.

I was reminded of this experience recently when I came across a scientific paper about body swapping. I know it sounds as if the only science in something called body swapping must come from science fiction. Actually, body swapping is a remarkable perceptual illusion that requires nothing more than a second person, a set of head-mounted cameras, and a set of head-mounted displays. Someone facing you wears the cameras mounted on a helmet and you wear the visual displays (which are presented to your two eyes like goggles as part of a virtual reality-style headset). The camera footage, filmed from the visual perspective of the second person, is fed directly into your visual display. Thus, you see your own body from the second person’s perspective.

But we haven’t made it to Freaky Friday just yet. The illusion requires something more. You and the other person take each other’s hands and begin squeezing them simultaneously. Nothing fancy. But in the words of the write-up by Valeria Petkova and Henrik Ehrsson, this simple setup alone “. . . evoked a vivid illusion that the experimenter’s arm was the participant’s own arm and that the participants could sense their entire body just behind this arm. Most remarkably, the participants’ sensations of the tactile and muscular stimulation elicited by the squeezing of the hands seemed to originate from the experimenter’s hand, and not from their own clearly visible hand.”

So after a lifetime in your own body, it only takes a video feed and a few hand squeezes for you to make yourself at home in someone else’s arms and legs. If this setup sounds familiar, that’s because it is a more impressive incarnation of the classic rubber hand illusion. And a new and remarkable twist on the illusion just appeared in the news: scientists in the same lab have made people feel as if they have an invisible hand. (For a great discussion of this new illusion, read this.)

In science, we tend to think about human perception in general and illusions in particular in terms of adaptations and optimizations. Lots of visual illusions are based on the statistical probability of objects and events in our environment. Our brains learn to predict and extrapolate information about our settings because they jump to the likeliest conclusions. In this way illusions, while technically errors, often reveal clever shortcuts our brain takes to help us understand or parse our surroundings faster, better, or at less of an energy cost.

But what about the body swap? Since we never actually swap bodies, why should we mentally be able to do it? What’s the advantage? Well, the advantage seems to come down to the very fact that we never actually swap bodies. In our ever-changing world, a rare given is that you will have the same body tomorrow that you had today and yesterday. So why should your brain waste precious time or energy soliciting proof from every finger and toe, curve and joint, flex and bend? Take a smidge of visual evidence (in this case, the video display) and a dab of tactile confirmation (hand squeezing) and you have a recipe for body ownership. How often in the natural world would this recipe ever lead you astray?

So, in essence, you only think that you feel you own your body. In truth, your brain is creating that sensation on the fly all the time. You could think of it as a philosophical conundrum or cause for an existential crisis. I prefer to think of it as good news for pregnant ladies everywhere.

_____

Photo credit: Elizabeth Tan


Petkova VI, & Ehrsson HH (2008). If I Were You: Perceptual Illusion of Body Swapping. PLoS ONE. DOI: 10.1371/journal.pone.0003832

Pb on the Brain


I’ve got lead on my mind. Lead the element, not the verb; the toxic metal that used to grace every gas tank and paint can in this grand country of ours. For the most part we’ve stopped spewing lead into our environment, but the lead of prior generations doesn’t go away. It lingers on the walls and windows of older buildings, on floors as dust, and in the soil. These days it lingers in my thoughts as well.

I started worrying about lead when my daughter became a toddler and began putting everything in her mouth. I fretted more when I learned that lead is far more damaging to young children than was previously thought. Even a tiny amount of it can irreversibly harm a child’s developing brain, leading to lower IQs, attention problems and behavioral disorders. You may never even see the culprit; lead can sit around as microscopic dust, waiting to be inhaled or sucked off of an infant’s fingers.

Public health programs use blood lead levels (BLLs) to evaluate the amount of lead in a child’s system and decide whether to take preventative or medical action. In the 1960s, only BLLs above 60 μg/dL were considered toxic in children. That number has been creeping downward ever since. In 1985 the CDC’s stated blood lead level of concern became 25 μg/dL and in 1991 it went down to 10 μg/dL. But last year the CDC moved the cutoff down to 5 μg/dL and got rid of the term “level of concern.” That’s because scientists now believe that any amount of lead is toxic. In fact, it seems as if lead’s neurotoxic effects are most potent at BLLs below 5 μg/dL. In other words, a disproportionately large amount of the brain damage occurs at the lowest doses. Recent studies have shown subtle intellectual impairments in kids with BLLs as low as 2 μg/dL (which is roughly the mean BLL of American preschoolers today). All great reasons for parents to worry about even tiny exposures to lead, no?

Yes. Absolutely. Parents never want to handicap their children, even if only by an IQ point or two. But here’s what’s crazy: nearly every American in their fifties, forties, or late thirties today would have clocked in well over the CDC’s current cutoff when they were little. The average BLL of American preschoolers in the late ‘70s was 15 μg/dL – and 88% had BLLs greater than 10 μg/dL.

These stats made me wonder if whole generations of Americans are cognitively and behaviorally impaired from lead poisoning as children. Have we been blaming our intellectually underwhelming workforce on a mismanaged education system, cultural complacency, or the rise of television and video games when we should have been blaming a toxic metal element?

I was sure I wasn’t the first person to wonder about the upshot of poisoning generations of Americans. And lo and behold, a quick Google search led me to this brilliant article on Mother Jones from January. The piece chronicles a rise in urban crime that began in the ‘60s and fell off precipitously in the early-to-mid ‘90s nationwide. The author, Kevin Drum, walks readers through very real evidence that lead fumes from leaded gasoline were a major cause of the rise in crime (and that increased regulation restricting lead in gasoline could be credited for the sudden drop-off).

The idea certainly sounds far-fetched: generations of city-dwellers were more prone to violence as adults because they breathed high levels of lead fumes when they were kids. It doesn’t seem possible. But when you put the pieces together it’s hard to imagine any other outcome. We know that children of the ‘50s, ‘60s, and ‘70s had BLLs high enough to cause irreversible IQ deficits and behavioral problems (of which aggression and poor impulse control are particularly common). Why is it so hard to imagine that more of these children behaved violently when they became adults?

In the end, this terrible human experiment in mass poisoning has left me pondering two particular questions. First, what does it mean for generations of children to be, in a sense, retroactively damaged by lead? At the time, our levels were considered harmless, but now we know better. Does knowing this now explain anything about recent history and current events? Does it explain the remarkable intransigence of certain politicians or the bellicosity of certain talk show hosts, athletes, or drivers with a road rage problem? Aside from the crime wave, what other sweeping societal trends might be credited to the poisoning of children past? How might history have played out differently if we had all been in our right minds?

Finally, I’ve been thinking a lot about the leads and asbestoses and thalidomides of today. Pesticides? Bisphenol A? Flame retardants? What is my daughter licking off of those toys of hers and how is it going to harm her twenty years down the line? This is not just a question for parents. Think crime waves. Think lost productivity and innovation. Today’s children grow up to be tomorrow’s adults. Someday when we are old and convalescing they’ll take the reins of our society and drive it heaven-knows-where. That makes child health and safety an issue for us all. We may never even know how much we stand to lose.

_____

Photo credit: Zara Evens

Cuddling Up with a Scimoir

You might call it a Frankenstein genre – two quite different literary genres stitched together and brought to life. For the moment, I am calling it the scimoir. The rare science memoir can be found tucked away in the Science section, in Memoir or Biography, even sometimes in Health, Psychology, or Self Help. It defies categorization, flummoxing librarians and booksellers alike. Science and memoir, memoir and science. It just doesn’t seem right.

At first glance the two genres seem incompatible. Science is the study of the immutable and absolute while memoir is the most personal and subjective of all genres. Yet somehow they can go together, and when done well, they resonate with honesty and relevance. They tame each other. Memoir reminds us that the whirring mechanics of science play out on the scale of our individual lives, while science reminds us that the memoirists’ struggles and stories reflect something of the universal. Moreover, the drama of memoir adds the narrative kick that science writing so desperately needs. It’s a match made in genre heaven.

Why am I waxing poetic about a literary genre? I suppose because I recently discovered that I’m drawn to this combination, both as a blogger and as a reader. The majority of my posts are amalgamations of personal experience and scientific theory. This was never my intent; somehow the combination fell out of my interests and whatever spark motivated me to write about a given topic. I’ve also discovered that I’ve read and enjoyed a number of scimoirs, even though I didn’t consciously seek them out and scimoirs are none too common.

In point of fact, I shouldn’t be surprised that book-length scimoirs are relatively rare. To write a compelling one, an author generally has to be a scientist or science writer who has also personally experienced something dramatic that is relevant to the topic. You might be both a leading researcher and lifelong sufferer of a particular illness, like Kay Redfield Jamison in An Unquiet Mind. You might be the researcher behind an infamous experiment, like Philip Zimbardo in The Lucifer Effect. Or you might be able to approach the topic through your experience with ailing relatives. In Mapping Fate, Alice Wexler wrote about her mother’s battle with Huntington’s disease and her sister’s scientific quest to isolate the culprit gene. In Acquainted with the Night, the science writer Paul Raeburn documented his children’s struggles with mental illness in the context of the current state of juvenile psychiatric knowledge and treatment.

I am on a quest to identify other books in this wonderful Franken-genre and I need your help. Here are the other scimoirs I can think of that I’ve already read (aside from those listed above): My Stroke of Insight by Jill Bolte Taylor, The Double Helix by James Watson, A Primate’s Memoir by Robert Sapolsky, and several of Oliver Sacks’s books. I’ve come across a few more that I plan to read: Memoirs of an Addicted Brain by Marc Lewis, Moonwalking with Einstein by Joshua Foer, and What Mad Pursuit by Francis Crick.

Please let me know what other scimoirs you’ve read, want to read, or simply know are out there. And do share any other ideas for naming the genre. Scimoir sounds like a half-android, half-alien monster, and who wants to cuddle up with that?

______

Photo credit: Karoly Czifra

The End of History

I just read a wonderful little article about how we think about ourselves. The paper, which came out in January, opens with a tantalizing paragraph that I simply have to share:

“At every stage of life, people make decisions that profoundly influence the lives of the people they will become—and when they finally become those people, they aren’t always thrilled about it. Young adults pay to remove the tattoos that teenagers paid to get, middle-aged adults rush to divorce the people whom young adults rushed to marry, and older adults visit health spas to lose what middle-aged adults visited restaurants to gain. Why do people so often make decisions that their future selves regret?”

To answer this question, the study’s authors recruited nearly 20,000 participants from the website of “a popular television show.” (I personally think they should have told us which one. I’d imagine there are differences between the people who flock to the websites for Oprah, The Nightly News, or, say, Jersey Shore.)

The study subjects ranged in age from 18 to 68. For the experiment, they had to fill out an online questionnaire about their current personality, core values, or personal preferences (such as favorite food). Half of the subjects—those in the reporter group—were then asked to report how they would have filled out the questionnaire ten years prior, while the other half—those in the predictor group—were asked to predict how they will fill it out ten years hence. For each subject, the authors computed the difference between the subject’s responses for his current self and those for his reported past self or predicted future self. And here’s the clever part: they could compare participants across ages. For example, they could compare how an 18-year-old’s prediction of his 28-year-old future self differed from a 28-year-old’s report of his 18-year-old self. It sounds crazy, but they did some great follow-up studies to make sure the comparison was valid.
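To make the design concrete, here’s a toy sketch of that cross-age comparison in Python. The scoring function, variable names, and made-up questionnaire numbers are all my own illustration of the general idea, not the authors’ actual analysis code or data.

```python
# Toy illustration of the cross-age comparison (hypothetical numbers,
# not the study's real data). Each questionnaire is a list of
# Likert-style ratings; "change" is the mean absolute difference
# between two fillings of the same questionnaire.

def change_score(responses_a, responses_b):
    """Mean absolute difference between two questionnaire responses."""
    return sum(abs(a - b) for a, b in zip(responses_a, responses_b)) / len(responses_a)

# A 28-year-old in the reporter group rates her current self and her
# remembered 18-year-old self...
reporter_now  = [5, 3, 4, 2]
reporter_past = [2, 4, 5, 1]

# ...while an 18-year-old in the predictor group rates his current self
# and his predicted 28-year-old self.
predictor_now    = [4, 3, 2, 5]
predictor_future = [4, 3, 3, 5]

# Both scores cover the same stretch of life (ages 18 to 28), one
# looking back and one looking ahead.
reported_change  = change_score(reporter_now, reporter_past)
predicted_change = change_score(predictor_now, predictor_future)

# The End of History pattern: predicted change comes out smaller than
# reported change over the very same decade.
print(reported_change, predicted_change)
```

In this toy example the looking-back score works out larger than the looking-ahead score, which is the shape of the result the study reports at scale across thousands of participants.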

The results show a remarkable pattern. People believe that they have changed considerably in the past, even while they expect to change little in the future. And while they tend to be pretty accurate in their assessment of how much they’ve changed in years past, they grossly underestimate how much they will change in the coming years. The authors call this effect the End of History Illusion. And it’s not just found in shortsighted teenagers or twenty-somethings. While the study showed that older people do change less than younger people, they still underestimate how much they will continue to change in the decade to come.

The End of History Illusion is interesting in its own right. Why are we so illogical when reasoning about ourselves – and particularly, our own minds? We all understand that we will change physically as we age, both in how well our bodies function and how they look to others. Yet we deny the continued evolution (or devolution) of our traits, values, and preferences. We live each day as though we have finally achieved our ultimate selves. It is, in some ways, a depressing outlook. As much as we may like ourselves now, wouldn’t it be more heartening to believe that we will keep growing and improving as human beings?

The End of History Illusion also comes with a cost. We are constantly making flawed decisions for our future selves. As the paper’s opening paragraph illustrated, we take actions today under the assumption that our future desires and needs won’t change. In a follow-up study, the authors even demonstrate this effect by showing that people would be willing to pay an average of $129 now to see a concert by their favorite band in ten years, while they would only be willing to pay an average of $80 now to see a concert by their favorite band from ten years back. Here, the illusion will only cost us money. In real life, it could cost us our health, our families, our future well-being.

This study reminded me of a book I read a while back called Stumbling on Happiness (written, it turns out, by the second author on this paper). The book’s central thesis is that we are bad at predicting what will make us happy and the whole thing is written in the delightful style of this paper’s opening paragraph. For those of you with the time, it’s worth a read. For those of you without time, I can only hope you’ll have more time in the future. With any luck we’ll all have more – more insight, more compassion, more happiness—in the decade to come.

____

Photo credit: Darla Hueske


Quoidbach J, Gilbert DT, & Wilson TD (2013). The End of History Illusion. Science. DOI: 10.1126/science.1229294

Feeling Invisible Light

In my last post, I wrote about whether we can imagine experiencing a sense that we don’t possess (such as a trout’s sense of magnetic fields). Since then a study has come out that adds a new twist to our little thought experiment. And for that we can thank six trailblazing rats in North Carolina.

Like us, rats see only a sliver of the full electromagnetic spectrum. They can perceive red light with wavelengths as long as about 650 nanometers, but radiation with longer wavelengths (known as infrared, or IR, radiation) is invisible to them. Or it was before a group of researchers at Duke began their experiment. They first trained the rats to indicate with a nose poke where they saw a visible light turned on. Then the researchers mounted an IR detector to each rat’s head and surgically implanted tiny electrodes into the part of its brain that processes tactile sensations from its whiskers.

After these sci-fi surgeries, each rat was trained to do the same light detection task again – only this time it had to detect infrared instead of visible light. Whenever the IR detectors on the animal’s head picked up IR radiation, the electrodes stimulated the tactile whisker-responsive area of its brain. So while the rat’s eyes could not detect the IR lights, a part of its brain was still receiving information about them.

Could they do the new task? Not very well at first. But within a month, these adult rats learned to do the IR detection task quite well. They even developed new strategies to accomplish their new task; as these videos show, they learned to sweep their heads back and forth to detect and localize the infrared sources.

Overall, this study shows us that the adult brain is capable of acquiring a new or expanded sense. But it doesn’t tell us how the rats experienced this new sense. Two details from the study suggest that the rats experienced IR radiation as a tactile sensation. First, the post-surgical rats scratched at their faces when first exposed to IR radiation, just as they might if they initially interpreted the IR-related brain activity as something brushing against their whiskers. Second, when the scientists studied the activity of the touch neurons receiving IR-linked stimulation after extensive IR training, they found that the majority responded to both touch and infrared light. At least to some degree, the senses of touch and of infrared vision were integrated within the individual neurons themselves.

In my last post, I found that I was only able to imagine magnetosensation by analogy to my sense of touch. Using some fancy technology, the scientists at Duke were able to turn this exercise in imagination into a reality. The rats were truly able to experience a new sense by piggybacking on an existing sense. The findings demonstrate the remarkable plasticity of the adult brain – a comforting thought as we all barrel toward our later years – but they also provide us with a glimpse of future possibilities. Someday we might be able to follow up on our thought experiment with an actual experiment. With a little brain surgery, we may someday be able to ‘see’ infrared or ultraviolet light. Or we might just hook ourselves up to a magnificent compass and have a taste (or feel or smell or sight or sound) of magnetosensation after all.

____

Photo credit: Novartis AG


Thomson EE, Carra R, & Nicolelis MA (2013). Perceiving invisible light through a somatosensory cortical prosthesis. Nature Communications, 4. PMID: 23403583