Good Morning, Sleepyhead

A few weeks ago, I passed out. One moment I was standing by the door to our apartment, wishing my departing husband a good day at work. The next, my eyes had rolled back in my head and I fell face-first into the wall. My forehead struck the lower hinges of the door; I bruised my cheek and arm and knee, nothing badly. My husband, who was halfway out the door when I fell, rushed to gather me up. He held me and said, “Are you all right? Are you okay?” And that was how I awoke, as if from a long dreamless sleep, on the floor beside our front door.

I was only out for a few seconds, but it felt as if it could have been hours. I remembered the minutes leading up to my dramatic tumble, but they felt long ago: a bit ethereal, and separated from the present by a gap that didn’t seem odd to me in the slightest.

I’ve always tended toward low blood pressure and often felt dizzy when standing up. After the fall, doctors checked me out and said I was fine. (My prescriptions are to drink more water and maybe eat more salt.) Still, the experience got me thinking about memory and how it’s a strange and elusive creature. How we always think we’ve caught it but we never have.

Back in my grad school days, we studied the case of H.M., the famous amnesic patient who was unable to form new memories. We learned that his journal was filled with descriptions of waking up as if for the first time and having no recollection of writing any of the prior journal entries, nor of how he came to be where he was. I wonder if the feeling was something like my contradictory experience on the floor, when I lacked memory of the preceding moments and yet felt as if nothing were missing. Time felt continuous, despite the fact that my memory was not.

The experience also reminded me of a dramatic story I read in the nonfiction book Soul Made Flesh. In 1650, a young British servant named Anne Green was seduced by her master’s grandson and gave birth to a stillborn baby. Thanks to the social mores of the time, she was tried and convicted of infanticide and sentenced to death. She proclaimed her innocence to the crowd that gathered in the courtyard of Oxford Castle to watch her hanging. After her speech, the executioner kicked the ladder out from under her and she hung for almost half an hour before they cut her down and sent her body down the street to be dissected for science. Her designated dissectors were Drs. William Petty and Thomas Willis (of the Circle of Willis). But when they opened the coffin, they heard a rattle in her throat and managed to revive her with water, heat, and herbs.

When Anne Green came to, she began reciting the speech she’d delivered at the gallows. She didn’t remember leaving the prison, climbing the ladder, or giving the speech, much less (thankfully) hanging. A pamphlet that circulated after the event described her memory as “a clock whose weights had been taken off a while and afterward hung on again.” The incident illustrated the machine-like quality of memory. Today we would describe it as flipping a switch: Anne Green’s memory had been turned off and then turned on again.

As strange as the stories of H.M. and Anne Green sound, their wild memory lapses aren’t so different from what happens to us every day. We all experience time as continuous and ongoing, even though our memory is often shot through with holes. We spend a full third of our lives in unconscious slumber and remember little of our dreams. Even our waking lives are terribly preserved in the vault of our memory. How many of your breakfasts can you recall? How many birthday parties and drives to work? How many classroom lectures and airplane rides and showers can you individually call to mind?

Our recollections are mere fragments. They pepper the timeline of our past just enough to form a narrative – one’s life story. This story may feel solid and unbroken, but don’t kid yourself. Your memory is not. We are all amnesic, all a little untethered from the passing moments of our lives. We are continually rediscovering and resurrecting our past to move forward in the present. In one way or another, we have all roused from our coffin reciting a speech from the gallows or come to on the floor with a sore face and an astonished husband. We are all perpetually in the process of waking up for the very first time.

How the Giraffe Didn’t Get His Long Neck

It’s the early 19th century, before Darwin’s Origin of Species. Before Mendel’s peas and Watson and Crick’s double helix. Scientists are struggling with the big questions of inheritance and reproduction without the aid of modern scientific methods. In this vacuum of concrete information, odd theories gain traction – some based on racial or social agendas, others on intuition or supposition.

Lamarckism, or soft inheritance, was one of the more pervasive of these ideas. According to the theory, organisms can inherit acquired traits. In the days before Darwin’s evolutionary theory, Lamarckism helped explain why organisms were so well adapted to their environments. Take the example of the giraffe’s long neck. A giraffe of yore (when giraffes had shorter necks) had to stretch its neck to reach the luscious leaves further up on tree branches. All that stretching lengthened its neck a little, and this longer neck was passed on to its offspring, who in turn stretched their necks and sired offspring who could reach even higher and munch the choicest leaves. It went on like this until giraffes were tall enough that they didn’t have to strain to reach leaves anymore.

It was a neat explanation that appealed to many 19th century scientists; even Darwin occasionally made use of it. But the theory had a nasty side as well. People applied it to humans and used it to explain differences between races or socioeconomic classes, calling the phenomenon degeneration. The mental and physical effects of years spent boozing and behaving badly would be passed down from father to son to grandson, each successively worse than his predecessor as the collective sum of each reckless lifetime added up. There was a technical term for the poor souls who wound up literally inheriting the sins of their fathers: degenerates. Certain scientists (or pseudoscientists) of the era, such as Benedict Morel and Cesare Lombroso, used the ideas of soft inheritance and degeneration to explain how violence, poverty, and criminality were heritable and could be categorized and studied.

Lamarckism, in the hands of Morel and others, offered a credible explanation of why the son of an alcoholic was more likely to be an alcoholic himself. But it did so by implying that the poor, the miserable, the suffering were inherently inferior to those with better, healthier (and probably wealthier) lifestyles. The poor were genetically degenerate, and they had no one to blame but themselves.

Thank god, thank god, Lamarckism and its corollary, degeneration, were debunked. By the 20th century, scientists knew that inheritance didn’t work that way. Our genetic information isn’t changed by what we do during our lifetimes. Besides, our sex cells are segregated from the other cells in our bodies. We don’t descend from our mothers’ bodies and brains, subject as they were to all those stresses, strains, and yes, even boozing. Instead, we descend from the cells in their ovaries. And thankfully, those cells are well protected.

Only there’s a catch. In the last few decades, we’ve learned that while Lamarckism isn’t correct, it isn’t entirely wrong either. We’ve learned this through the field of epigenetics (literally, above genetics). This burgeoning field has helped us understand why the causes of so many heritable diseases still elude us, nearly a decade after we sequenced the human genome. Epigenetics adds untold complexity to an already complex genome. Some of its mechanisms are transient, others last a lifetime, but they all regulate gene expression and are necessary for normal growth and development. Thanks to them, females inactivate one of their X chromosomes (so women don’t get a double dose of proteins from that set of genes). Epigenetic mechanisms also oversee cellular differentiation, the process by which embryonic cells containing identical genetic material become skin cells, hepatocytes, neurons, and every other diverse cell type in the human body.

It now appears that epigenetic factors play an enormous role in human health. And what we do in our lives, the choices we make, affect our epigenome. Exposure to chemicals, stressors, or dietary changes can cause long-lasting tags to sit on our DNA or chromatin, controlling which genes are read and transcribed into proteins. For example, chronic cocaine use causes lasting epigenetic changes in the nucleus accumbens, a brain area linked to addiction. These changes boost plasticity and drug-related gene expression, which in turn probably contribute to the reinforcing effects of the drug.
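
To make the tag metaphor concrete, here is a deliberately cartoonish sketch in Python – my own illustration, not anything from the research literature, with made-up gene entries and a single true/false flag standing in for what is really a rich chemistry of methyl groups and chromatin marks:

```python
# A cartoon of epigenetic gating: tags sit on top of the DNA and
# decide whether a gene is read, without rewriting the sequence.
# (Illustrative only; the gene names and boolean "tag" are made up.)

gene_sequences = {"GENE_A": "ATG...", "GENE_B": "ATG..."}

# Epigenetic tags: True means the gene's promoter is tagged off.
tags = {"GENE_A": True, "GENE_B": False}

def transcribed(genes, tags):
    """Return the genes that would be read and expressed."""
    return [name for name in genes if not tags.get(name, False)]

print(transcribed(gene_sequences, tags))   # ['GENE_B']

# An exposure (a drug, a stressor, a diet) can flip a tag, changing
# which genes are expressed while the DNA itself stays untouched.
tags["GENE_B"] = True
print(transcribed(gene_sequences, tags))   # []
```

The point of the cartoon is the asymmetry: the sequence dictionary never changes; only the layer of tags above it does.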

But that’s not all. Epigenetic effects can span generations. No, the hardships of your parents’ lifetimes aren’t literally passed on to you in a cumulative fashion, giving you that longer neck or boozier disposition that Lamarckism might predict. Nonetheless, what your parents (and even your grandmother) did before you were born can be affecting your epigenome today.

It’s pretty wild stuff. Even if you’ve never met your maternal grandmother, even if she died long before your birth, her experiences and behavior could be affecting your health. First of all, the prenatal environment your mother experienced can have epigenetic effects on her that then propagate on to the next generation (you). Moreover, all the eggs a female will ever make have already formed in her ovaries by the time she’s born. They may not be mature, but they are there, DNA and all. I think that’s a pretty amazing transgenerational link. It means that half the strands of DNA that wound up becoming you were initially made inside your grandmother’s body. As science reveals the power of the prenatal environment, evidence is mounting that even what your grandmother ate during your mother’s gestational period and whether she suffered hardships like famine can alter your own risk for heart disease or diabetes.

Luckily, epigenetic gene regulation is softer and less absolute than its cousin Lamarckism. It is reversible and it can’t accumulate, generation upon generation, to create a degenerate class. The science of today is more humane than the old guys predicted, but it doesn’t let us off the hook. Epigenetics should remind us that we must be thoughtful in how we live. Our choices matter, for ourselves and for our offspring. We don’t yet understand how epigenetic mechanisms control our health and longevity, but that isn’t stopping our bodies from making us pay for what we do now.

Who Am I Again?

Thanks to a recent kerfuffle over the Earth’s precession and its effect on our astrological signs, many people have spent this week questioning their personality traits. I went from being a lifelong Gemini (changeable, duplicitous) to a possible Taurus (stubborn, steady), neither of which, I think, describes me. I’ve never believed in astrological signs, but many people do, and this week must have been a confusing one for them.

The whole thing got me thinking about how we look outward for explanations and definitions of our inner selves. No one has a better vantage point than we do to observe our own personal thoughts, feelings, attitudes, and behaviors. How funny that we once looked to the stars in order to understand ourselves! Those of us who consider ourselves scientific and modern are no better. Although we scoff at sun signs and palm readings, increasingly we are turning to our brains and our DNA for answers that they simply can’t give.

In the 1800s, the Phrenological Fowlers (later Fowlers and Wells) founded a nationwide industry on reading people’s personalities based on the bumps on their heads. They published extensively and sent emissaries to small towns throughout the U.S. so that, for a small fee, the masses might come to know themselves better. The company and its methods were an unrivaled success. America was obsessed with phrenology. Sometime in the 1860s, a curious Mark Twain visited Fowler’s office under an assumed name. Fowler read his head and said that his skull dipped in at a particular point where it should have bulged out – a sure sign that Twain, the preeminent American humorist, utterly lacked a sense of humor.

Nowadays, many still look to their brains for answers. When I used to scan participants in fMRI experiments, they would often ask what I could tell them about their brains. I couldn’t tell them anything; all the analysis took place later, back at the lab. But as a frequent subject in pilot experiments for my own and colleagues’ studies, I’ve had unfettered access to data from my own brain. I know that I have a large and robust fusiform face area (a region thought to be critical for face recognition) and a rather dinky visual word form area (implicated in identifying letters of the alphabet). What does that mean, when I am an avid reader and often embarrass myself with my poor ability to recognize faces?

While people still look to the stars and to brains (if not skulls) in order to understand themselves, the next big thing has arrived. The age of personal genomics is upon us, and countless startups out there are eager to swap a check and a swab of our cells for a glimpse into our futures and ourselves. I have to admit, I fantasize sometimes about having my genome read. I would love the chance to pore over details about my ancestral line or learn what type of diseases I am predisposed to developing. But the biggest draw is to learn about myself. What forms of the anxiety genes do I have? What about genes linked to mental illness, intelligence, novelty-seeking? As a scientist, I know that complex traits are determined by a mixture of environment and numerous genes, many of which we haven’t yet discovered. Beyond that, epigenetic factors influence the expression of our genes in ways we don’t yet understand. Yet I still find myself wishing someone would hand me that printout with the secrets to myself.
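
When I indulge that fantasy, the scientist in me can sketch exactly why the printout would disappoint. Here is a toy simulation in Python of a “complex trait”; every number in it (the variant count, the effect sizes, the even split between genes and environment) is invented for illustration and comes from no real study:

```python
# Toy polygenic trait: many tiny genetic effects plus environment.
# All parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_people, n_variants = 10_000, 500

# Each person carries 0, 1, or 2 copies of each (hypothetical) variant.
genotypes = rng.binomial(2, 0.3, size=(n_people, n_variants))

# Tiny random per-variant effects sum into a genetic score...
effects = rng.normal(0, 0.05, size=n_variants)
genetic_score = genotypes @ effects

# ...and environment contributes a comparable amount of variance.
environment = rng.normal(0, genetic_score.std(), size=n_people)
trait = genetic_score + environment

# Even with every variant known, genes explain only part of the trait.
r = np.corrcoef(genetic_score, trait)[0, 1]
print(f"genes-vs-trait: r = {r:.2f}, variance explained = {r*r:.0%}")
```

In this best-case toy, where every variant is known and measured perfectly, the genetic score still explains only about half the variance in the trait. Real traits are murkier: unknown variants, gene-gene interactions, and epigenetic factors all dilute the printout’s predictive power.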

The cognitive scientist Steven Pinker wrote a wonderful essay wading through his results when his own genome was sequenced. In it, he struggles with the discrepancies. His genome says he should be sensitive to bitter flavors, yet he enjoys beer, broccoli, and brussels sprouts. His genome says he has a high risk of baldness, yet he is known for his thick mane of overflowing, curly hair. Other results he believes or would like to believe. What is a person seeking direction and self-wisdom to do?

So at the end of this astrologically confusing week, I find myself at a loss. Why do we crave external guidance to help us understand our internal selves? It may be because we are less static and more changeable than we like to believe. As I alluded to in my post about our potential to do evil, psychology experiments (and history) have shown that human beings are heavily influenced by their circumstances. Because we are adaptable, we behave very differently depending on who we are with and what we are doing. Although this adaptability may be advantageous, I suspect it unsettles us. We want to believe we have a solid, stable identity, and we will look to mystics or scientists – to anyone who can give us that assurance: I know who I am and who I always will be.

The hard (but in its own way beautiful) truth is that we are each a complex and contradictory landscape of traits, behaviors, and passions. Be wary of those who try to describe you with a handful of paltry adjectives. Know thyself. Or keep trying, anyway. It should take at least a lifetime.

Why Bigger Isn’t Always Better

One of my entertainments this holiday season was following the online buzz over a recent article in Nature Neuroscience. The authors’ findings were covered by Wired, Time, Slate, U.S. News & World Report, and the BBC, to name a few. One headline read: “Scientists Discover Facebook Center of the Brain.” Another: “How to Win Friends: Have a Big Amygdala?”

The authors of the Nature Neuroscience article report a correlation between the size of a subcortical brain structure called the amygdala and the extent of a person’s social network. In effect, people with larger amygdalas tended to have more friends and close acquaintances than those with smaller amygdalas. The popular press and the public leapt on this idea: we are predestined by our anatomy to be popular or not. If we were alone on New Year’s Eve, if our Facebook friend count is low, it’s not our fault. Chalk it up to our brains, our genes, our parents.

All of this struck me as both amusing and sad because of a book I was reading at the time. The book, Postcards from the Brain Museum by Brian Burrell, chronicles the history of neuroscience in the context of our search for greatness (as well as criminality, idiocy, and inferiority). It tells how scientists spent most of the 19th century collecting human brains from geniuses, criminals, and the poor to try to understand why some people demonstrate remarkable abilities while others flounder and fail.

It is a sad and sordid history. At first, some believed that the sheer size or weight of one’s brain predicted greatness – that larger brains were capable of better thinking. Since women’s brains (like the rest of their bodies) were on average smaller than those of their male counterparts, this provided a perfect explanation for their supposed intellectual inferiority. Later, when the link between brain volume and intelligence was debunked, scientists suggested that the amount of folding on the brain’s surface was the marker of a brilliant brain: the more convolutions on the surface, the smarter the individual. Other scientists identified specific fissures that they deemed inferior, as they were supposedly found more often in apes and women. These lines of research would be used to justify racial and gender stereotypes and give rise to the practice of eugenics in the first half of the 20th century.

The peer review process and established statistical methods ensure that today’s science is more legitimate than it was in centuries past. But neuroimaging has allowed us to probe the living brain to a degree heretofore unimagined. With it, scientists amass enormous amounts of data that strain our standard statistical techniques and challenge our ability to distinguish between profound, universal discoveries and those that are idiosyncratic to a subject sample or functionally irrelevant. We still don’t know whether ‘bigger is better,’ nor do we understand the source or functional consequences of individual differences in the size and shape of brain regions. Certainly we don’t know enough to look at a person’s brain and guess with accuracy how smart they are, how good they are, or, yes, even how many friends they have.

Just look at this graph from the Nature Neuroscience paper plotting amygdala volume on the horizontal axis and social network size on the vertical axis:

The figure above shows each subject as a black dot (for younger participants) or a gray triangle (for older ones). The diagonal line shows a mathematical correlation between amygdala volume and social network size, but look at how many dots and triangles lie away from the line. For the same amygdala volume (say, 3 cubic centimeters), there are dots that lie far above the line and others that lie far below it. No one looking at this figure can say that amygdala size determines one’s sociability. Perhaps it plays some small role, sure. But we are not slaves to our amygdala volumes, just as we’re not slaves to our overall brain size, our fissural patterns, or our cerebral convolutions. Our abilities and thoughts do come from our brains, but we have to keep in mind that those brains are far more complex than we can fathom. Who you are can never be reduced to a list of measured volumes. It’s important that we remember that, and that we never return to those days of ‘mine’s bigger than yours.’
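
Since the figure itself can’t be reproduced here, a toy simulation makes the same point. This sketch is mine, in Python; the sample size, slope, and noise level are invented and bear no relation to the actual study’s numbers:

```python
# Toy version of a "significant but scattered" correlation.
# All numbers are invented; this is not the Nature Neuroscience data.
import numpy as np

rng = np.random.default_rng(1)
n = 58  # arbitrary sample size

amygdala_volume = rng.normal(3.0, 0.6, size=n)     # arbitrary units
noise = rng.normal(0, 40, size=n)                  # individual scatter
network_size = 30 + 25 * amygdala_volume + noise   # invented relationship

r = np.corrcoef(amygdala_volume, network_size)[0, 1]
print(f"r = {r:.2f}; variance explained = {r*r:.0%}")

# People with nearly identical volumes still differ wildly:
similar = (amygdala_volume > 2.9) & (amygdala_volume < 3.1)
print("network sizes near volume 3.0:",
      np.sort(network_size[similar]).round())
```

The correlation in this toy is perfectly real, yet people with nearly identical “amygdala volumes” end up with wildly different “network sizes.” That spread is all those far-flung dots and triangles are telling us.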

Me, You, and Lucifer

Are we all capable of doing truly evil things? This was a question posed by my latest nonfiction read, The Lucifer Effect. The author, Philip Zimbardo, is the psychologist who created the infamous Stanford Prison Experiment in 1971.

For his prison experiment, Zimbardo randomly assigned young college men to play the roles of prisoners or guards in a mock prison set up in the basement of the Stanford Psychology building. The study took place in an era of anti-war protests, when college students were being arrested and thrown into jail. Zimbardo conceived the experiment with the purpose of examining the prisoners’ mentalities and their attempts to organize and rebel. Ultimately, however, the most fascinating aspect of the experiment was the behavior of the subjects who played guards. Well-behaved, emotionally stable college boys became cruel guards. They inflicted shocking degradations on the prisoners, causing more than one of their detainees to suffer an emotional breakdown.

This unethical but fascinating experiment, and the other psychology studies and world events that Zimbardo chronicles in his book, demonstrate how we are all actors. We conform easily to the roles in which we are cast, even if these roles involve harming others or allowing others to harm us. And yet we cling to the concept of our unique and personal identity, particularly in Western cultures where individualism is highly prized.

This human failing, our malleability in the face of social norms, is a consequence of one of our greatest attributes: our resourcefulness. We have evolved to be successful in complex environments and to tailor our behavior to the people and circumstances that surround us. Think about how you behave when you attend a party where you know no one compared with a night out with your oldest friends. Think of how you conduct yourself at a job interview, at a sporting event, babysitting a child, or when you’re alone. We manage to be very different people from moment to moment. We have to be; our complex social world demands it. So why is it so hard to imagine that, when plunged into an extreme role under extreme circumstances, we might do something we’d never think ourselves capable of? Something truly inhumane?

In his book, Zimbardo argues that the suicide bombers of modern religious extremism, the torturers in the Abu Ghraib prison, and the executioners of the Holocaust were normal people subjected to extreme pressures and circumstances. There is a long history of psychology experiments, including Zimbardo’s study and the infamous Milgram experiments, that demonstrate how stable, well-meaning Americans will commit terrible acts when influenced by authority or anonymity. And past events have shown time and again that we are capable of standing by rather than intervening to help those in need. The case of Kitty Genovese is a dramatic example of this common occurrence, the basis for the so-called bystander effect. Zimbardo calls it the evil of inaction. But should it come as a surprise? Doesn’t society teach us to mind our own business?

These questions about complicity and inaction reminded me of a night from my childhood. I was seven years old when my parents first took me to the Chicago Symphony Orchestra. I remember how people with shiny shoes, lush coats, and clutch purses poured in from nearby parking structures to gather outside Orchestra Hall. Amid the symphony goers, I noticed a crumpled man sitting on the ground with his head resting on his knees. I remember that his jacket was far too thin for the biting cold and that the plastic cup by his feet was empty. All of the well-dressed concert goers swept past him, including us. I felt ill throughout the first half of the performance, knowing I could have asked my parents for money and given him a dollar or two. I worried that he’d starve to death because I hadn’t done anything to help him. At intermission, I dragged my parents out to the sidewalk, but by then the man was gone.

As an adult, I’ve lived in big cities and have become accustomed to sharing sidewalks and street corners with the homeless. People could not function in urban settings if they became as paralyzed by the sight of suffering as my seven-year-old self was. In the case of homelessness, it’s hard to know what to do or how to help; giving money to people on the street is not necessarily the best thing. But I do think our reaction to homelessness offers clues to how experience and social norms allow us to ignore or overlook suffering. To live and function in a city, we have to curb our empathy and compassion. Even if we aren’t cruel, even if we mean well, we manage to turn off our humanity on a daily basis. And the fact that we can turn it off a little should serve as a reminder that we all have the capability to turn it off a lot.

There are things we can do to help address the problem of homelessness, such as volunteering or donating money to a local homeless shelter. Charitable donations and philanthropic acts improve our world and reduce suffering, of course, and we should do both – as often as we can. But we should also remember that good deeds don’t change our capacity for evil. The best way to prevent ourselves from being cruel or complicit in cruelty is to believe that we are capable of it. If we remember that we are susceptible to the pressures of authority, group norms, and social roles, we can be vigilant. We can stop ourselves. We can speak up. We can act.