Cuddling Up with a Scimoir

You might call it a Frankenstein genre – two quite different literary genres stitched together and brought to life. For the moment, I am calling it the scimoir. The rare science memoir can be found tucked away in the Science section, in Memoir or Biography, even sometimes in Health, Psychology, or Self Help. It defies categorization, flummoxing librarians and booksellers alike. Science and memoir, memoir and science. It just doesn’t seem right.

At first glance the two genres seem incompatible. Science is the study of the immutable and absolute while memoir is the most personal and subjective of all genres. Yet somehow they can go together, and when done well, they resonate with honesty and relevance. They tame each other. Memoir reminds us that the whirring mechanics of science play out on the scale of our individual lives, while science reminds us that the memoirists’ struggles and stories reflect something of the universal. Moreover, the drama of memoir adds the narrative kick that science writing so desperately needs. It’s a match made in genre heaven.

Why am I waxing poetic about a literary genre? I suppose because I recently discovered that I’m drawn to this combination, both as a blogger and as a reader. The majority of my posts are amalgamations of personal experience and scientific theory. This was never my intent; somehow the combination fell out of my interests and whatever spark motivated me to write about a given topic. I’ve also discovered that I’ve read and enjoyed a number of scimoirs, even though I didn’t consciously seek them out and scimoirs are none too common.

In point of fact, I shouldn’t be surprised that book-length scimoirs are relatively rare. To write a compelling one, an author generally has to be a scientist or science writer who has also personally experienced something dramatic that is relevant to the topic. You might be both a leading researcher and lifelong sufferer of a particular illness, like Kay Redfield Jamison in An Unquiet Mind. You might be the researcher behind an infamous experiment, like Philip Zimbardo in The Lucifer Effect. Or you might be able to approach the topic through your experience with ailing relatives. In Mapping Fate, Alice Wexler wrote about her mother’s battle with Huntington’s disease and her sister’s scientific quest to isolate the culprit gene. In Acquainted with the Night, the science writer Paul Raeburn documented his children’s struggles with mental illness in the context of the current state of juvenile psychiatric knowledge and treatment.

I am on a quest to identify other books in this wonderful Franken-genre and I need your help. Here are the other scimoirs I can think of that I’ve already read (aside from those listed above): My Stroke of Insight by Jill Bolte Taylor, The Double Helix by James Watson, A Primate’s Memoir by Robert Sapolsky, and several of Oliver Sacks’s books. I’ve come across a few more that I plan to read: Memoirs of an Addicted Brain by Marc Lewis, Moonwalking with Einstein by Joshua Foer, and What Mad Pursuit by Francis Crick.

Please let me know what other scimoirs you’ve read, want to read, or simply know are out there. And do share any other ideas for naming the genre. Scimoir sounds like a half-android, half-alien monster, and who wants to cuddle up with that?

______

Photo credit: Karoly Czifra

The End of History

I just read a wonderful little article about how we think about ourselves. The paper, which came out in January, opens with a tantalizing paragraph that I simply have to share:

“At every stage of life, people make decisions that profoundly influence the lives of the people they will become—and when they finally become those people, they aren’t always thrilled about it. Young adults pay to remove the tattoos that teenagers paid to get, middle-aged adults rush to divorce the people whom young adults rushed to marry, and older adults visit health spas to lose what middle-aged adults visited restaurants to gain. Why do people so often make decisions that their future selves regret?”

To answer this question, the study’s authors recruited nearly 20,000 participants from the website of “a popular television show.” (I personally think they should have told us which one. I’d imagine there are differences between the people who flock to the websites for Oprah, The Nightly News, or, say, Jersey Shore.)

The study subjects ranged in age from 18 to 68. For the experiment, they had to fill out an online questionnaire about their current personality, core values, or personal preferences (such as favorite food). Half of the subjects—those in the reporter group—were then asked to report how they would have filled out the questionnaire ten years prior, while the other half—those in the predictor group—were asked to predict how they would fill it out ten years hence. For each subject, the authors computed the difference between the subject’s responses for his current self and those for his reported past self or predicted future self. And here’s the clever part: they could compare participants across ages. For example, they could compare how an 18-year-old’s prediction of his 28-year-old future self differed from a 28-year-old’s report of his 18-year-old self. It sounds crazy, but they ran some careful follow-up studies to make sure the comparison was valid.
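For readers who like to see the logic laid bare, here is a tiny sketch of that comparison. The numbers and the simple mean-absolute-difference score are my own invention for illustration; they are not the paper’s actual data or scoring method.

```python
# Toy illustration of the study's cross-cohort comparison.
# Ratings are on a 1-5 scale for a handful of personality items.

def change_score(ratings_a, ratings_b):
    """Mean absolute difference between two sets of ratings."""
    return sum(abs(a - b) for a, b in zip(ratings_a, ratings_b)) / len(ratings_a)

# An 18-year-old PREDICTOR: current self vs. predicted self at 28.
predictor_current = [4, 2, 5, 3]
predicted_future  = [4, 3, 5, 3]   # expects almost no change

# A 28-year-old REPORTER: current self vs. remembered self at 18.
reporter_current = [3, 4, 2, 4]
reported_past    = [4, 2, 5, 3]    # recalls substantial change

predicted_change = change_score(predictor_current, predicted_future)
reported_change  = change_score(reporter_current, reported_past)

# The End of History pattern: change predicted for the NEXT decade
# is smaller than the change reporters say happened over the SAME decade.
print(predicted_change)  # 0.25
print(reported_change)   # 1.75
```

In this made-up example, the predictor anticipates almost no change over the coming decade, while the reporter recalls considerable change over the very same stretch of life.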

The results show a remarkable pattern. People believe that they have changed considerably in the past, even while they expect to change little in the future. And while they tend to be pretty accurate in assessing how much they’ve changed in years past, they grossly underestimate how much they will change in the coming years. The authors call this effect the End of History Illusion. And it’s not just found in shortsighted teenagers or twenty-somethings. While the study showed that older people do change less than younger people, they still underestimate how much they will continue to change in the decade to come.

The End of History Illusion is interesting in its own right. Why are we so illogical when reasoning about ourselves – and particularly, our own minds? We all understand that we will change physically as we age, both in how well our bodies function and how they look to others. Yet we deny the continued evolution (or devolution) of our traits, values, and preferences. We live each day as though we have finally achieved our ultimate selves. It is, in some ways, a depressing outlook. As much as we may like ourselves now, wouldn’t it be more heartening to believe that we will keep growing and improving as human beings?

The End of History Illusion also comes with a cost. We are constantly making flawed decisions for our future selves. As the paper’s opening paragraph illustrated, we take actions today under the assumption that our future desires and needs won’t change. In a follow-up study, the authors even demonstrated this effect by showing that people would be willing to pay an average of $129 now to see their current favorite band perform in ten years, while they would be willing to pay only an average of $80 now to see their favorite band from ten years ago. Here, the illusion will only cost us money. In real life, it could cost us our health, our families, our future well-being.

This study reminded me of a book I read a while back called Stumbling on Happiness (written, it turns out, by the second author on this paper). The book’s central thesis is that we are bad at predicting what will make us happy and the whole thing is written in the delightful style of this paper’s opening paragraph. For those of you with the time, it’s worth a read. For those of you without time, I can only hope you’ll have more time in the future. With any luck we’ll all have more – more insight, more compassion, more happiness—in the decade to come.

____

Photo credit: Darla Hueske


Quoidbach J, Gilbert DT, & Wilson TD (2013). The End of History Illusion. Science, 339(6115), 96–98. DOI: 10.1126/science.1229294

Locked Away

The results are in. The ultrasound was conclusive. And despite my previously described hunch that our growing baby is a boy, she turned out to be a girl. We are, of course, ecstatic. A healthy baby and a girl to boot! As everyone tells us, girls are simply more fun.

As I was reading in my pregnancy book the other day, I came across an interesting bit of trivia about baby girls. At this point in my pregnancy (nearly 6 months in), our baby’s ovaries contain all the eggs she’ll have for her entire life. As I mentioned in a prior post, the fact that a female fetus develops her lifetime supply of eggs in utero represents a remarkable transgenerational link. In essence, half of the genetic material that makes up my growing baby already existed inside my mother when she was pregnant. And now, inside me, exists half of the genetic material that will become all of the grandchildren I will ever have. This is the kind of link that seems to mix science and spirituality, that reminds us that, though we are a mere cluster of cells, there’s a poetry to the language of biology and Life.

But after stumbling upon this factoid about our baby’s eggs, I was also struck by a sense that somewhere someone seemed to have his or her priorities mixed up. If our baby were born today, she would have a slim chance of surviving. Her intestines, cerebral blood vessels, and retinas are immature and not ready for life outside the womb. Worse still, the only shot her lungs would have at functioning is with the aid of extreme medical intervention. The order of it all seems crazy. My baby is equipped with everything she’ll need to reproduce decades in the future, yet she lacks the lung development to make it five minutes in the outside world. What was biology thinking?

Then I remembered two delightful popular science books I’d read recently, The Red Queen by Matt Ridley and Life Ascending by Nick Lane. Both described the Red Queen Hypothesis of the evolution of sex, which states that the reason so much of the animal kingdom reproduces sexually (rather than just making clones of itself) is to ‘outwit’ parasites. In short, if each generation of humans were the same as the next, parasites large and microbial could evolve to overtake us. By mixing up our genetic makeup through sexual reproduction, we make it harder for illnesses to wipe us out. Like the Red Queen from Lewis Carroll’s classic, we keep running in order to stay in the same place (which is one step ahead of parasites and disease).

Just as there are parasitic organisms and bacteria, one might say that there are parasitic genes. For example, mutations in the DNA of our own replicating cells can cause cancer, which is essentially a self-made, genetic parasite. Moreover, retroviruses like HIV are essentially bits of genetic material that invade our bodies and can insert themselves into the DNA of our cells. And the ultimate road to immortality for a parasitic gene would be to hitch a ride on the back of reproduction. Imagine what an easy life that would be! If a retrovirus could invade the eggs in the ovaries, it would be passed on from one generation to the next without doing one iota of work. It’s the holy grail of parasitic invasion – get thee to the ovaries! According to Matt Ridley in another of his books, The Origins of Virtue, the human germ line is segregated from the rest of the growing embryo by 56 days after fertilization. Within two months of conception, the cells that will give rise to all of the embryo’s eggs (or sperm, in males) are already cordoned off. They are kept safe until they are needed many years in the future.

So perhaps my little baby’s development isn’t as backwards as it seemed at first. Yes, lungs are important. But when you’ve got something of value to others, it makes practical sense to hurry up and lock it away.

Good Morning, Sleepyhead

A few weeks ago, I passed out. One moment I was standing by the door to our apartment, wishing my departing husband a good day at work. The next, my eyes had rolled back in my head and I fell face-first into the wall. My forehead struck the lower hinges of the door; I bruised my cheek and arm and knee, nothing badly. My husband, who was halfway out the door when I fell, rushed to gather me up. He held me and said, “Are you all right? Are you okay?” And that was how I awoke, as if from a long dreamless sleep, on the floor beside our front door.

I was only out a few seconds, but it felt like it could have been hours. I remembered the minutes leading up to my dramatic tumble, but they felt like long ago. A bit ethereal, and separated from the present by a gap that didn’t feel odd to me in the slightest.

I’ve always tended toward low blood pressure and often felt dizzy when standing up. After the fall, doctors checked me out and said I was fine. (My prescriptions are to drink more water and maybe eat more salt.) Still, the experience got me thinking about memory and how it’s a strange and elusive creature. How we always think we’ve caught it but we never have.

Back in my grad school days, we studied the case of H.M., the famous amnesic patient who was unable to form new memories. We learned that his journal was filled with descriptions of waking up as if for the first time and having no recollection of writing any of the prior journal entries, nor of how he came to be where he was. I wonder if the feeling was something like my contradictory experience on the floor, when I lacked memory of the preceding moments and yet felt as if nothing were missing. Time felt continuous, despite the fact that my memory was not.

The experience also reminded me of a dramatic story I read in the nonfiction book Soul Made Flesh. In 1650, a young British servant named Anne Green was seduced by her master’s grandson and gave birth to a stillborn baby. Thanks to the social mores of the time, she was tried and convicted of infanticide and sentenced to death. She proclaimed her innocence to the crowd that gathered in the courtyard of Oxford Castle to watch her hanging. After her speech, the executioner kicked the ladder out from under her and she hanged for almost half an hour before they cut her down and sent her body down the street to be dissected for science. Her designated dissectors were Drs. William Petty and Thomas Willis (of the Circle of Willis). But when they opened the coffin, they heard a rattle in her throat and managed to revive her with water, heat, and herbs.

When Anne Green came to, she began reciting the speech she’d delivered at the gallows. She didn’t remember leaving the prison, climbing the ladder, or giving the speech, much less (thankfully) hanging. A pamphlet later circulated about the event described her memory as “a clock whose weights had been taken off a while and afterward hung on again.” The incident illustrated the machine-like quality of memory. Today we describe it as flipping a switch. Anne Green’s memory had been turned off and then turned on again.

As strange as the stories of H.M. and Anne Green sound, their wild memory lapses aren’t so different from what happens to us every day. We all experience time as continuous and ongoing, even though our memory is often shot through with holes. We spend a full third of our lives in unconscious slumber and remember little of our dreams. Even our waking lives are terribly preserved in the vault of our memory. How many of your breakfasts can you recall? How many birthday parties and drives to work? How many classroom lectures and airplane rides and showers can you individually call to mind?

Our recollections are mere fragments. They pepper the timeline of our past just enough to form a narrative – one’s life story. This story may feel solid and unbroken, but don’t kid yourself. Your memory is not. We are all amnesic, all a little untethered from the passing moments of our lives. We are continually rediscovering and resurrecting our past to move forward in the present. In one way or another, we have all roused from our coffin reciting a speech from the gallows or come to on the floor with a sore face and an astonished husband. We are all perpetually in the process of waking up for the very first time.

Me, You, and Lucifer


Are we all capable of doing truly evil things? This was a question posed by my latest nonfiction read, The Lucifer Effect. The author, Philip Zimbardo, is the psychologist who created the infamous Stanford Prison Experiment in 1971.

For his prison experiment, Zimbardo randomly assigned young college men to play the roles of prisoners or guards in a mock prison set up in the basement of the Stanford Psychology building. The study took place in an era of anti-war protests, when college students were being arrested and thrown into jail. Zimbardo conceived the experiment with the purpose of examining the prisoners’ mentalities and their attempts to organize and rebel. Ultimately, however, the most fascinating aspect of the experiment was the behavior of the subjects who played guards. Well-behaved, emotionally stable college boys became cruel guards. They inflicted shocking degradations on the prisoners, causing more than one of their detainees to suffer an emotional breakdown.

This unethical but fascinating experiment, and the other psychology studies and world events that Zimbardo chronicles in his book, demonstrate how we are all actors. We conform easily to the roles in which we are cast, even if these roles involve harming others or allowing others to harm us. And yet we cling to the concept of our unique and personal identity, particularly in Western cultures where individualism is highly prized.

This human failing, our malleability to fit social norms, is a consequence of one of our greatest attributes, our resourcefulness. We have evolved to be successful in complex environments and to tailor our behavior to the people and circumstances that surround us. Think about how you behave when you attend a party where you know no one compared with a night out with your oldest friends. Think of how you conduct yourself at a job interview, at a sporting event, babysitting a child, or when you’re alone. We manage to be very different people from moment to moment. We have to be; our complex social world demands it. So why is it so hard to imagine that, when plunged into an extreme role under extreme circumstances, we might do something we’d never have thought ourselves capable of? Something truly inhumane?

In his book, Zimbardo argues that the suicide bombers of modern religious extremism, the torturers in the Abu Ghraib prison, and the executioners of the Holocaust were normal people subjected to extreme pressures and circumstances. There is a long history of psychology experiments, including Zimbardo’s study and the infamous Milgram experiments, that demonstrate how stable, well-meaning Americans will commit terrible acts when influenced by authority or anonymity. And past events have shown time and again that we are capable of standing by rather than intervening to help those in need. The case of Kitty Genovese is a dramatic example of this common occurrence, the basis for the so-called bystander effect. Zimbardo calls it the evil of inaction. But should it come as a surprise? Doesn’t society teach us to mind our own business?

These questions about complicity and inaction reminded me of a night from my childhood. I was seven years old when my parents first took me to the Chicago Symphony Orchestra. I remember how people with shiny shoes, lush coats, and clutch purses poured in from nearby parking structures to gather outside Orchestra Hall. Amid the symphony goers, I noticed a crumpled man sitting on the ground with his head resting on his knees. I remember that his jacket was far too thin for the biting cold and that the plastic cup by his feet was empty. All of the well-dressed concert goers swept past him, including us. I felt ill throughout the first half of the performance, knowing I could have asked my parents for money and given him a dollar or two. I worried that he’d starve to death because I hadn’t done anything to help him. At intermission, I dragged my parents out to the sidewalk, but by then the man was gone.

As an adult, I’ve lived in big cities and have become accustomed to sharing sidewalks and street corners with the homeless. People could not function in urban settings if they became as paralyzed by the sight of suffering as my seven-year-old self was. In the case of homelessness, it’s hard to know what to do or how to help; giving money to people on the street is not necessarily the best thing. But I do think our reaction to homelessness offers clues to how experience and social norms allow us to ignore or overlook suffering. To live and function in a city, we have to curb our empathy and compassion. Even if we aren’t cruel, even if we mean well, we manage to turn off our humanity on a daily basis. And the fact that we can turn it off a little should serve as a reminder that we all have the capability to turn it off a lot.

There are things we can do to help address the problem of homelessness, such as volunteering or donating money to a local homeless shelter. Charitable donations and philanthropic acts improve our world and reduce suffering, of course, and we should do both – as often as we can. But we should also remember that good deeds don’t change our capacity for evil. The best way to prevent ourselves from being cruel or complicit in cruelty is to believe that we are capable of it. If we remember that we are susceptible to the pressures of authority, group norms, and social roles, we can be vigilant. We can stop ourselves. We can speak up. We can act.

Jaded

In December 2008, I stared up at one of the great marvels of the world, the gleaming Taj Mahal. And I felt – nothing. Curiosity about its fabled history, yes. But other than that, all I felt was ambivalence about posing for pictures in its imposing foreground and a certain reluctance to leave my shoes unattended as I toured the palace itself.

I should have been awestruck. The Taj Mahal is stunning, a brilliant feat of engineering and craftsmanship, design and artistic grandeur. But the problem was, this wasn’t the first time I’d seen it, or even the second. Over the years, I’d seen the iconic structure in countless photographs, documentaries, and movies. By 2008, I’d encountered the great edifice so many times from the comfort of my couch that now, having traveled halfway around the world to gaze upon it, I was wondering what we would have for lunch.

It’s shameful, I know. But I suspect I’m not the only guilty one.

Recently, a friend told me why she couldn’t stand modern literature. “I hate the descriptions,” she said. “They’re flowery and overblown and just plain weird.” Although I enjoy contemporary fiction, I knew what she was referring to. While authors of the past could devote full paragraphs to describing fields in bloom or dank urban alleys, they generally used concrete, sensible words. Contemporary writers tend to rely heavily on metaphors, or else they describe things in odd, non-literal ways. In her novel A Gate at the Stairs, Lorrie Moore uses the phrase “a papery caramel of leaves” to describe the wet waste that lined the roads. Whoever thought of soggy, caked leaves as caramel? And yet I think the description gives us something – a sense of color, of texture, and a fresh perspective.

It occurred to me that modern writers are faced with an interesting challenge, namely jaded readers who have seen (if not experienced) it all. Readers like me who can look upon the Taj Mahal without being awestruck. Not only are we better traveled than in days of yore, but we’re exposed to places all over the world by way of screens large and small. In movies and through television we have seen rainforests and polar expeditions, villages from Scotland to Africa to Guatemala, Texas rodeos, Manhattan sex clubs, Roman amphitheaters, ocean floors, mountain peaks, and even the surface of the moon. No wonder we’re jaded. And no wonder fiction writers today have to sweat and toil to describe the world in a different way if we are to take note of it at all.

I’m torn about the vicarious exposure we get to our world through TV and movies. It’s a strange sort of life without living, experience that is like reality without actually being real. On the one hand, it gives us access to other places, times, and ways of life, showing us things we may never otherwise see. It can educate us, but I think it also steals something from us – the freshness and newness of discovery. I don’t want to be jaded, so I’m going to take this as a challenge. I’m going to push myself to experience each new surrounding fully, to open my eyes and look. More than that, I’m going to challenge myself to touch, taste, and smell the world around me. As yet, technology doesn’t stimulate those senses in our living rooms and movie theaters, which means the real world has got that market cornered.

Science and Literature: Strange Bedfellows?

Something I’ve been thinking about lately: what do science and literature have in common? On the face of it, nothing. One is dedicated to making stuff up; the other is all about not making stuff up. I would have abandoned the question, or probably wouldn’t have asked it at all, if it weren’t for the fact that these two fields have been the intellectual passions of my life. Am I a splintered human being, or is there something that unites them?

Science is fundamentally a list of rules, like a lengthy version of the Ten Commandments. However, these rules dictate, down to the most minute of scales, how our universe IS. And as boring or unintuitive as each rule may be, their interplay and repercussions are stunning. I think scientists are drawn to the field because they appreciate this beauty and because they want to uncover a new, equally beautiful truth that has never been known before. Maybe every scientist is Moses; certainly there are some who think they are.

Moses Complex aside, I believe that fiction tugs on authors for the same reason that science lures scientists. And in some ways, the fields serve the same purpose. I know, I know – pipe down, you scientists. It’s true. Good literature should put us inside thoughts and situations we haven’t imagined and provide perspectives that reality doesn’t afford. In doing so, it should reveal its own beautiful truths. Why? Because we don’t always see truth through truth; sometimes it takes fiction to make us understand.
