Guessing at Sex

Something’s happened. Something both miraculous and mundane. Over the past few months I’ve been transformed from a woman into an incubator. A walking, talking (and often eating and napping) incubator programmed to provide the perfect environment for a growing baby . . . something. We’ll find out the gender in a couple weeks. Still, it’s always the first question people ask when they hear that I’m pregnant: “Is it a boy or a girl?” And since we haven’t had an answer for them, my husband and I have been showered with an astonishing number of guesses. It seems that everyone we’ve ever met is secretly a gender-divining expert.

They all have their methods. One woman had me turn around so she could size up my back fat. “If you gain weight in the back, it means you’re having a boy,” she explained. Another examined my face as she outlined her theory that women who carry a girl look more beautiful (thanks to the added female hormones) while those carrying a boy start looking more, well, dude-like. Others have sworn by the shape of the belly – whether the stomach looks pointed or broad. One acquaintance asked for the baby’s fetal heart rate, saying that babies with faster heart rates always turn out to be girls. Another friend described her theory that the mother’s personality predicts the baby’s sex; apparently, soft-spoken mothers tend to have boys.

I like when people guess the gender. It’s interesting to hear their varied theories and sweet to think that they’re excited enough about our pregnancy to venture a guess. It makes a personal, biological experience more communal. But I can’t say much for their accuracy. So far, the guesses have been evenly split between boy and girl.

That’s the thing about guessing gender; with a 50-50 chance of either outcome, it’s unimpressive if you’re right and even more unimpressive if you’re wrong. And yet with such odds, it’s only natural that people start thinking they’ve hit on a good heuristic. No matter how wrong your method, you will, on average, be right 50% of the time. That already subjectively feels like a lot of rightness. If you try your method out on a small number of people to start, you could wind up with a lower success rate (by chance) and perhaps abandon your technique, but you might luck out and guess right 75% of the time or higher, at least for a little while. Someone who starts out on a lucky streak may well become a diehard believer who swears by his method, even after his batting average declines.
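That lucky streak is easy to put numbers on. Here's a toy simulation (made-up setup, not data from anywhere) in which thousands of guessers each apply a completely worthless method – a coin flip – to just four pregnancies:

```python
import random

def simulate_guessers(n_guessers=10_000, n_babies=4, seed=0):
    """Each guesser applies a worthless method: a coin flip per baby.
    Returns the fraction who nonetheless score 75% or better."""
    rng = random.Random(seed)
    lucky = 0
    for _ in range(n_guessers):
        # Count correct guesses when every guess is a 50-50 coin flip.
        correct = sum(rng.random() < 0.5 for _ in range(n_babies))
        if correct / n_babies >= 0.75:
            lucky += 1
    return lucky / n_guessers

frac = simulate_guessers()
# Exact binomial answer: P(3 or 4 right out of 4) = 5/16 = 0.3125,
# so roughly a third of pure coin-flippers start out looking like experts.
```

With only four trials, nearly a third of people guessing at random will hit 75% or better, which is more than enough early success to mint a believer.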

There’s simply no way that so many people can be so sure of their gender-guessing strategies unless they pick and choose their outcomes. Or unless, as I suspect, their memories do it for them. Consider the conundrum of the grocery store line. Many of us believe we are cursed (or mysteriously inept) at choosing a checkout lane at the grocery store. No matter which line we wind up in, it turns out to be the slowest. If we switch to another, that one mysteriously slows down. You rarely hear about the reverse – people who claim to have a special gift for picking the fastest lane. How can the majority of people be below average at the same task? They can’t be, unless their memories are skewing the results. We never notice and remember the times we breeze right through checkout or overtake our neighbors in the next line over. The salient events – and the ones we’ll remember – are the times we’ve been stuck behind someone arguing prices or heaping coupons on the counter, the times when six people go by in the next line over while our food wilts and thaws on the conveyor belt.
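The checkout-lane intuition can be checked the same way. In this toy simulation (invented numbers, nothing measured in an actual store), each of four lanes finishes in a random amount of time and you pick one blindly:

```python
import random

def lane_experience(n_trips=10_000, n_lanes=4, seed=1):
    """Simulate grocery trips: each lane gets a random finish time and
    you pick one uniformly at random. Returns the fraction of trips on
    which your lane turned out to be the fastest."""
    rng = random.Random(seed)
    fastest = 0
    for _ in range(n_trips):
        times = [rng.random() for _ in range(n_lanes)]
        mine = rng.randrange(n_lanes)
        if times[mine] == min(times):
            fastest += 1
    return fastest / n_trips

frac_fastest = lane_experience()
# By symmetry, you pick the fastest of 4 lanes only about 1/4 of the time.
```

Even with no curse at all, you win the lane lottery only about one trip in four, which leaves three trips out of four for your memory to file away as evidence.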

It must be the same with guessing gender. When people are right, they are ecstatic and vindicated. When they are wrong, they notice and remember it less. And those who do notice their error may wonder if they misjudged the belly shape or back fat. The problem wasn’t necessarily with the heuristic, but rather with its execution. If only the pregnant lady’s dress had been tighter or if the guesser hadn’t been distracted by hors d’oeuvres, the method would certainly have worked!

I am by no means immune to these twisted ways of thinking. I can’t help but believe that I’m cursed at picking grocery lines. And I also seem to have a guess about this baby’s gender. For no apparent reason, I have it in my head that the baby is a boy. No heuristic here, just a feeling I can’t seem to shake. It’s not that I’d prefer a boy – I’d be equally delighted to have a girl. And I know that there’s no scientific merit to the inkling. Even if a woman could tune into some subtle something in her body and know, she’d need prior experience to compare it to. This being my first pregnancy, I have no idea what it might feel like to carry a boy versus a girl, if such a thing were even possible. So I should put no stock in such a feeling.

And yet when the ultrasound rolls around, I know I’ll be surprised if we learn that the baby’s a girl. Equally happy and excited, to be sure. But most definitely (and illogically) surprised.

Good Morning, Sleepyhead

A few weeks ago, I passed out. One moment I was standing by the door to our apartment, wishing my departing husband a good day at work. The next, my eyes had rolled back in my head and I fell face-first into the wall. My forehead struck the lower hinges of the door; I bruised my cheek and arm and knee, nothing badly. My husband, who was halfway out the door when I fell, rushed to gather me up. He held me and said, “Are you all right? Are you okay?” And that was how I awoke, as if from a long dreamless sleep, on the floor beside our front door.

I was only out a few seconds, but it felt like it could have been hours. I remembered the minutes leading up to my dramatic tumble, but they felt like long ago. A bit ethereal, and separated from the present by a gap that didn’t feel odd to me in the slightest.

I’ve always tended toward low blood pressure and often felt dizzy when standing up. After the fall, doctors checked me out and said I was fine. (My prescriptions are to drink more water and maybe eat more salt.) Still, the experience got me thinking about memory and how it’s a strange and elusive creature. How we always think we’ve caught it but we never have.

Back in my grad school days, we studied the case of H.M., the famous amnesic patient who was unable to form new memories. We learned that his journal was filled with descriptions of waking up as if for the first time and having no recollection of writing any of the prior journal entries, nor of how he came to be where he was. I wonder if the feeling was something like my contradictory experience on the floor, when I lacked memory of the preceding moments and yet felt as if nothing were missing. Time felt continuous, despite the fact that my memory was not.

The experience also reminded me of a dramatic story I read in the nonfiction book Soul Made Flesh. In 1650, a young British servant named Anne Green was seduced by her master’s grandson and gave birth to a stillborn baby. Thanks to the social mores of the time, she was tried and convicted of infanticide and sentenced to death. She proclaimed her innocence to the crowd that gathered in the courtyard of Oxford Castle to watch her hanging. After her speech, the executioner kicked the ladder out from under her and she hanged for almost half an hour before they cut her down and sent her body down the street to be dissected for science. Her designated dissectors were Drs. William Petty and Thomas Willis (of the Circle of Willis). But when they opened the coffin, they heard a rattle in her throat and managed to revive her with water, heat, and herbs.

When Anne Green came to, she began reciting the speech she’d delivered at the gallows. She didn’t remember leaving the prison, climbing the ladder, or giving the speech, much less (thankfully) hanging. A pamphlet later circulated about the event described her memory as “a clock whose weights had been taken off a while and afterward hung on again.” The incident illustrated the machine-like quality of memory. Today we describe it as flipping a switch. Anne Green’s memory had been turned off and then turned on again.

As strange as the stories of H.M. and Anne Green sound, their wild memory lapses aren’t so different from what happens to us every day. We all experience time as continuous and ongoing, even though our memory is often shot through with holes. We spend a full third of our lives in unconscious slumber and remember little of our dreams. Even our waking lives are terribly preserved in the vault of our memory. How many of your breakfasts can you recall? How many birthday parties and drives to work? How many classroom lectures and airplane rides and showers can you individually call to mind?

Our recollections are mere fragments. They pepper the timeline of our past just enough to form a narrative – one’s life story. This story may feel solid and unbroken, but don’t kid yourself. Your memory is not. We are all amnesic, all a little untethered from the passing moments of our lives. We are continually rediscovering and resurrecting our past to move forward in the present. In one way or another, we have all roused from our coffin reciting a speech from the gallows or come to on the floor with a sore face and an astonished husband. We are all perpetually in the process of waking up for the very first time.

How the Giraffe Didn’t Get His Long Neck

It’s the early 19th century, before Darwin’s Origin of Species. Before Mendel’s peas and Watson and Crick’s double helix. Scientists are struggling with the big questions of inheritance and reproduction without the aid of modern scientific methods. In this vacuum of concrete information, odd theories gain traction – some based on racial or social agendas, others on intuition or supposition.

Lamarckism, or soft inheritance, was one of the more pervasive of these ideas. According to the theory, organisms can inherit acquired traits. In the days before Darwin’s evolutionary theory, Lamarckism helped explain why organisms were so well adapted to their environments. Take the example of the giraffe’s long neck. A giraffe of yore (when giraffes had shorter necks) had to stretch its neck to reach the luscious leaves further up on tree branches. All that stretching lengthened its neck a little, and this longer neck was passed on to its offspring, who in turn stretched their necks and sired offspring who could reach even higher and munch the choicest leaves. It went on like this until giraffes were tall enough that they didn’t have to strain to reach leaves anymore.

It was a neat explanation that appealed to many 19th century scientists; even Darwin occasionally made use of it. But the theory had a nasty side as well. People applied it to humans and used it to explain differences between races or socioeconomic classes, calling the phenomenon degeneration. The mental and physical effects of years spent boozing and behaving badly would be passed down from father to son to grandson, each successively worse than his predecessor as the collective sum of each reckless lifetime added up. There was a technical term for the poor souls who wound up literally inheriting the sins of their fathers: degenerates. Certain scientists (or pseudoscientists) of the era, such as Benedict Morel and Cesare Lombroso, used the ideas of soft inheritance and degeneration to explain how violence, poverty, and criminality were heritable and could be categorized and studied.

Lamarckism, in the hands of Morel and others, offered a credible explanation of why the son of an alcoholic was more likely to be an alcoholic himself. But it did so by implying that the poor, the miserable, the suffering were inherently inferior to those with better, healthier (and probably wealthier) lifestyles. The poor were genetically degenerate, and they had no one to blame but themselves.

Thank god, thank god, Lamarckism and its corollary, degeneration, were debunked. By the 20th century, scientists knew that inheritance didn’t work that way. Our genetic information isn’t changed by what we do during our lifetimes. Besides, our sex cells are segregated from the other cells in our bodies. We don’t descend from our mothers, subject to all the stresses, strains, and yes, even boozing that their brains and bodies may have experienced. Instead, we descend from their ovaries. And thankfully, those things are well protected.

Only there’s a catch. In the last few decades, we’ve learned that while Lamarckism isn’t correct, it isn’t entirely wrong either. We’ve learned this through the field of epigenetics (literally, above genetics). This burgeoning field has helped us understand why the causes of so many heritable diseases still elude us, nearly a decade after we sequenced the human genome. Epigenetics adds untold complexity to an already complex genome. Some of its mechanisms are transient, others last a lifetime, but they all regulate gene expression and are necessary for normal growth and development. Thanks to them, females inactivate one of their X chromosomes (so women don’t get a double dose of proteins from that set of genes). Epigenetic mechanisms also oversee cellular differentiation, the process by which embryonic cells containing identical genetic material become skin cells, hepatocytes, neurons, and every other diverse cell type in the human body.

It now appears that epigenetic factors play an enormous role in human health. And what we do in our lives, the choices we make, affect our epigenome. Exposure to chemicals, stressors, or dietary changes can cause long-lasting tags to sit on our DNA or chromatin, controlling which genes are read and transcribed into proteins. For example, chronic cocaine use causes lasting epigenetic changes in the nucleus accumbens, a brain area linked to addiction. These changes boost plasticity and drug-related gene expression, which in turn probably contribute to the reinforcing effects of the drug.

But that’s not all. Epigenetic effects can span generations. No, the hardships of your parents’ lifetimes aren’t literally passed on to you in a cumulative fashion, giving you that longer neck or boozier disposition that Lamarckism might predict. Nonetheless, what your parents (and even your grandmother) did before you were born can be affecting your epigenome today.

It’s pretty wild stuff. Even if you’ve never met your maternal grandmother, even if she died long before your birth, her experiences and behavior could be affecting your health. First of all, the prenatal environment your mother experienced can have epigenetic effects on her that then propagate on to the next generation (you). Moreover, all the eggs a female will ever make have already formed in her ovaries by the time she’s born. They may not be mature, but they are there, DNA and all. I think that’s a pretty amazing transgenerational link. It means that half the strands of DNA that wound up becoming you were initially made inside your grandmother’s body. As science reveals the power of the prenatal environment, evidence is mounting that even what your grandmother ate during your mother’s gestational period and whether she suffered hardships like famine can alter your own risk for heart disease or diabetes.

Luckily, epigenetic gene regulation is softer and less absolute than its cousin Lamarckism. It is reversible and it can’t accumulate, generation upon generation, to create a degenerate class. The science of today is more humane than the old guys predicted, but it doesn’t let us off the hook. Epigenetics should remind us that we must be thoughtful in how we live. Our choices matter, for ourselves and for our offspring. We don’t yet understand how epigenetic mechanisms control our health and longevity, but that isn’t stopping our bodies from making us pay for what we do now.

Who Am I Again?


Thanks to a recent kerfuffle over the Earth’s precession and its effect on our astrological signs, many people have spent this week questioning their personality traits. I went from being a life-long Gemini (changeable, duplicitous) to a possible Taurus (stubborn, steady), neither of which I think describes me. I’ve never believed in astrological signs, but many people do, and this week must have been a confusing one for them.

The whole thing got me thinking about how we look outward for explanations and definitions of our inner selves. No one has a better vantage point than we do to observe our own personal thoughts, feelings, attitudes, and behaviors. How funny that we once looked to the stars in order to understand ourselves! Those of us who consider ourselves scientific and modern are no better. Although we scoff at sun signs and palm readings, increasingly we are turning to our brains and our DNA for answers that they simply can’t give.

In the 1800s, the Phrenological Fowlers (later Fowlers and Wells) founded a nationwide industry on reading people’s personalities based on the bumps on their heads. They published extensively and sent emissaries to small towns throughout the U.S. so that, for a small fee, the masses might come to know themselves better. The company and its methods were an unrivaled success. America was obsessed with phrenology. Sometime in the 1860s, a curious Mark Twain visited Fowler’s office under an assumed name. Fowler read his head and said that his skull dipped in at a particular point where it should have bulged out – a sure sign that Twain, the preeminent American humorist, utterly lacked a sense of humor.

Nowadays, many still look to their brains for answers. When I used to scan participants in fMRI experiments, they would often ask what I could tell them about their brains. I couldn’t tell them anything; all the analysis took place later, back at the lab. But as a frequent subject in pilot experiments for my own and colleagues’ studies, I’ve had unfettered access to data from my own brain. I know that I have a large and robust fusiform face area (a region thought to be critical for face recognition) and a rather dinky visual word form area (implicated in identifying letters of the alphabet). What does that mean, when I am an avid reader and often embarrass myself with my poor ability to recognize faces?

While people still look to the stars and to brains (if not skulls) in order to understand themselves, the next big thing has arrived. The age of personal genomics is upon us and countless startups out there are eager to swap a check and a swab of our cells for a glimpse into our futures and ourselves. I have to admit, I fantasize sometimes about having my genome read. I would love the chance to pore over details about my ancestral line or learn which diseases I am predisposed to developing. But the biggest draw is to learn about myself. What forms of the anxiety genes do I have? What about genes linked to mental illness, intelligence, novelty-seeking? As a scientist, I know that complex traits are determined by a mixture of environment and numerous genes, many of which we haven’t yet discovered. Beyond that, epigenetic factors influence the expression of our genes in ways we don’t yet understand. Yet I still find myself wishing someone would hand me that printout with the secrets to myself.

The cognitive scientist Steven Pinker wrote a wonderful essay wading through his results when his own genome was sequenced. In it, he struggles with the discrepancies. His genome says he should be sensitive to bitter flavors, yet he enjoys beer, broccoli, and brussels sprouts. His genome says he has a high risk of baldness, yet he is known for his thick mane of overflowing, curly hair. Other results he believes or would like to believe. What is a person seeking direction and self-wisdom to do?

So at the end of this astrologically confusing week, I find myself at a loss. Why do we crave external guidance to help us understand our internal selves? It may be because we are less static and more changeable than we like to believe. As I alluded to in my post about our potential to do evil, psychology experiments (and history) have shown that human beings are heavily influenced by their circumstances. Because we are adaptable, we behave very differently depending on who we are with and what we are doing. Although the adaptability may be advantageous, I suspect it unsettles us. We want to believe we have a solid, stable identity, and we will look to mystics or scientists – anyone who can give us that assurance: I know who I am and who I always will be.

The hard (but in its own way beautiful) truth is that we are each a complex and contradictory landscape of traits, behaviors, and passions. Be wary of those who try to describe you with a handful of paltry adjectives. Know thyself. Or keep trying, anyway. It should take at least a lifetime.

Why Bigger Isn’t Always Better

One of my entertainments this holiday season was following the online buzz over a recent article in Nature Neuroscience. The authors’ findings were covered by Wired, Time, Slate, U.S. News & World Report, and the BBC, to name a few. One headline read: “Scientists Discover Facebook Center of the Brain.” Another: “How to Win Friends: Have a Big Amygdala?”

The authors of the Nature Neuroscience article report a correlation between the size of a subcortical brain structure called the amygdala and the extent of a person’s social network. In effect, people with larger amygdalas tended to have more friends and close acquaintances than those with smaller amygdalas. The popular press and the public leapt on this idea. We are predestined by our anatomy to be popular or not. If we were alone on New Year’s Eve, if our Facebook friend count is low, it’s not our fault. Chalk that one up to our brains, our genes, our parents.

All of this struck me as both amusing and sad because of a book I was reading at the time. The book, Postcards from the Brain Museum by Brian Burrell, chronicles the history of neuroscience in the context of our search for greatness (as well as criminality, idiocy, and inferiority). It tells how scientists spent most of the 19th century collecting human brains from geniuses, criminals, and the poor to try to understand why some people demonstrate remarkable abilities while others flounder and fail.

It is a sad and sordid history. At first, some believed that the sheer size or weight of one’s brain predicted greatness – that larger brains were capable of better thinking. Since women’s brains (like the rest of their bodies) were on average smaller than those of their male counterparts, this provided a perfect explanation for their intellectual inferiority. Later, when the link between brain volume and intelligence was debunked, scientists suggested that the amount of folding on the brain’s surface was the marker of a brilliant brain. The more convolutions on the surface, the smarter the individual. Other scientists identified specific fissures that they deemed inferior, as they were supposedly found more often in apes and women. These lines of research would be used to justify racial and gender stereotypes and give rise to the practice of eugenics in the first half of the 20th century.

The peer review process and established statistical methods ensure that today’s science is more legitimate than it was in centuries past. But neuroimaging has allowed us to probe the living brain to a degree heretofore unimagined. With it, scientists amass enormous amounts of data that strain our standard statistical techniques and challenge our ability to distinguish between profound, universal discoveries and those idiosyncratic to our subject sample or functionally irrelevant. We still don’t know whether ‘bigger is better,’ nor do we understand the source or functional consequences of individual differences in the size and shape of brain regions. Certainly we don’t know enough to look at a person’s brain and guess with accuracy how smart they are, how good they are, or, yes, even how many friends they have.

Just look at this graph from the Nature Neuroscience paper plotting amygdala volume on the horizontal axis and social network size on the vertical axis:

The figure above shows each subject as a black dot (for younger participants) or a gray triangle (for older ones). The diagonal line shows a mathematical correlation between amygdala volume and social network size, but look at how many dots and triangles lie away from the line. For the same amygdala volume (say, 3 cubic centimeters), there are dots that lie far above the line and others that lie far below it. No one looking at this figure can say that amygdala size determines one’s sociability. Perhaps it plays some small role, sure. But we are not slaves to our amygdala volumes, just as we’re not slaves to our overall brain size, our fissural patterns, or our cerebral convolutions. Our abilities and thoughts do come from our brains, but we have to keep in mind that those brains are far more complex than we can fathom. Who you are can never be reduced to a list of measured volumes. It’s important that we remember that, and that we never return to those days of ‘mine’s bigger than yours.’
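The shape of that scatter plot is easy to reproduce with invented numbers. The sketch below (hypothetical volumes and network sizes, not data from the paper) builds a population with a modest true relationship buried in individual noise, then computes the Pearson correlation:

```python
import math
import random

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

rng = random.Random(42)
# Hypothetical subjects: volumes around 2.5 (arbitrary units) with a
# modest true slope, swamped by individual noise in network size.
volumes = [rng.gauss(2.5, 0.5) for _ in range(60)]
networks = [20 + 10 * v + rng.gauss(0, 15) for v in volumes]

r = pearson_r(volumes, networks)
```

The correlation comes out real but modest, while subjects with nearly identical volumes land all over the vertical axis – exactly the pattern of dots scattered far from the line in the paper’s figure.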