Looking Schizophrenia in the Eye

More than a century ago, scientists discovered something unusual about how people with schizophrenia move their eyes. The men, psychologist and inventor Raymond Dodge and psychiatrist Allen Diefendorf, were trying out one of Dodge’s inventions: an early incarnation of the modern eye tracker. When they used it on psychiatric patients, they found that most of their subjects with schizophrenia had a funny way of following a moving object with their eyes.

When a healthy person watches a smoothly moving object (say, an airplane crossing the sky), she tracks the plane with a smooth, continuous eye movement to match its displacement. This action is called smooth pursuit. But smooth pursuit isn’t smooth for most patients with schizophrenia. Their eyes often fall behind, forcing a series of quick, tiny jerks to catch up; at other times they dart ahead of the target. For the better part of a century, this movement pattern would remain a mystery. But in recent decades, scientific discoveries have led to a better understanding of smooth pursuit eye movements – both in health and in disease.

Scientists now know that smooth pursuit involves a lot more than simply moving your eyes. To illustrate, let’s say a sexy jogger catches your eye on the street. When you first see the runner, your eyes are stationary and his or her image is moving across your retinas at some relatively constant rate. Your visual system (in particular, your visual motion-processing area MT) must first determine this rate. Then your eyes can move to catch up with the target and match its speed. If you do this well, the jogger’s image will no longer be moving relative to your retinas. From your visual system’s perspective, the jogger is running in place and his or her surroundings are moving instead. From both visual cues and signals about your eye movements, your brain can predict where the jogger is headed and keep moving your eyes at just the right speed to keep pace.

Although the smooth pursuit abnormalities in schizophrenia may sound like a movement problem, they appear to reflect a problem with perception. Sensitive visual tests show that motion perception is disrupted in many patients. They can’t tell the difference between the speeds of two objects or integrate complex motion information as well as healthy controls. A functional MRI study helped explain why. The study found that people with schizophrenia activated their motion-processing area MT less than controls while doing motion-processing tasks. The next logical question – why MT doesn’t work as well for patients – remains unanswered for now.

In my last two posts I wrote about how delusions can develop in healthy people who don’t suffer from psychosis. The same is true of not-so-smooth pursuit. In particular, healthy relatives of patients with schizophrenia tend to have jerkier pursuit movements than subjects without a family history of the illness. They are also impaired at some of the same motion-processing tests that stymie patients. This pattern, along with the results of twin studies, suggests that smooth pursuit dysfunction is inherited. Following up on this idea, two studies have compared subjects’ genotypes with the inheritance patterns of smooth pursuit problems within families. While they couldn’t identify exactly which gene was involved (a limitation of the technique), they both tracked the culprit gene to the same genetic neighborhood on the sixth chromosome.

Despite this progress, the tale of smooth pursuit in schizophrenia is more complex than it appears. For one, there’s evidence that smooth pursuit problems differ for patients with different forms of the disorder. Patients with negative symptoms (like social withdrawal or no outward signs of emotion) may have problems with the first step of smooth pursuit: judging the target’s speed and moving their eyes to catch up. Meanwhile, those with more positive symptoms (like delusions or hallucinations) may have more trouble with the second step: predicting the future movement of the target and keeping pace with their eyes.

It’s also unclear exactly how common these problems are among patients; depending on the study, as many as 95% or as few as 12% of patients may have disrupted smooth pursuit. The studies that found the highest rates of smooth pursuit dysfunction in patients also found rates as high as 19% for the problems among healthy controls. These differences may boil down to the details of how the eye movements were measured in the different experiments. Still, the studies all agreed that people with schizophrenia are far more likely to have smooth pursuit problems than healthy controls. What the studies don’t agree on is how specific these problems are to schizophrenia compared with other psychiatric illnesses. Some studies have found smooth pursuit abnormalities in patients with bipolar disorder and major depression as well as in their close relatives; other studies have not.

Despite these messy issues, a group of scientists at the University of Aberdeen in Scotland recently tried to determine whether subjects had schizophrenia based on their eye movements alone. In addition to smooth pursuit, they used two other measures: the subject’s ability to fix her gaze on a stable target and how she looked at pictures of complex scenes. Most patients have trouble holding their eyes still in the presence of distractors and, when shown a meaningful picture, they tend to look at fewer objects or features in the scene.

Taking the results from all three measures into account, the group could distinguish between a new set of patients with schizophrenia and new healthy controls with an accuracy of 87.8%. While this rate is high, keep in mind that the scientists removed real-world messiness by selecting controls without other psychiatric illnesses or close relatives with psychosis. This makes their demonstration a lot less impressive – and a lot less useful in the real world. I don’t think this method will ever become a viable alternative to diagnosing schizophrenia from clinical symptoms, but the approach may hold promise in a similar vein: identifying young people who are at risk for developing the illness. Finding these individuals and helping them sooner could truly mean the difference between life and death.
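A quick back-of-the-envelope calculation shows why an accuracy that looks impressive in a balanced case-control study can fall apart as a general screen. This sketch is mine, not the paper’s: it assumes (purely for illustration) that the test’s sensitivity and specificity both equal the reported 87.8% accuracy, and that schizophrenia affects roughly 1% of the general population.

```python
# Hypothetical sketch: positive predictive value of an eye-movement test
# applied outside a case-control study. Assumes sensitivity = specificity
# = 0.878 (the reported accuracy) -- my assumption, not the paper's.

def positive_predictive_value(sensitivity, specificity, prevalence):
    """Probability that a positive result reflects true illness (Bayes' rule)."""
    true_positives = sensitivity * prevalence
    false_positives = (1.0 - specificity) * (1.0 - prevalence)
    return true_positives / (true_positives + false_positives)

# Schizophrenia affects roughly 1% of the general population.
ppv = positive_predictive_value(0.878, 0.878, 0.01)
print(f"{ppv:.1%}")  # about 7%: most positive results would be false alarms
```

In an enriched group (say, young relatives of patients, where the prior probability of illness is far higher than 1%), the same test would yield many fewer false alarms, which is one reason screening at-risk individuals is a more plausible use than population-wide diagnosis.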

_____

Photo credit: Travis Nep Smith on Flickr, used via Creative Commons License

Benson PJ, Beedie SA, Shephard E, Giegling I, Rujescu D, & St Clair D (2012). Simple viewing tests can detect eye movement abnormalities that distinguish schizophrenia cases from controls with exceptional accuracy. Biological Psychiatry, 72(9), 716-724. PMID: 22621999

Genetics Post on DoubleXScience

I recently contributed a post to DoubleXScience, a site dedicated to all things women and science. The piece is called Armchair Genetics from Jamestown to Scott Brown and can be found here. It touches on children and race, assumptions, celebrities, a Cheerios ad, and the history of anti-miscegenation laws in the US (particularly relevant in light of the recent rulings on gay marriage). Please feel free to comment and share your own experiences or just let me know what you think!

Remains of the Plague

The history of science is littered with bones. Since antiquity, humans have studied the remains of the dead to understand the living. The practice is as common now as ever; only the methods have changed. In recent years, high-tech analyses of human remains have solved mysteries ranging from our ancestors’ prehistoric mating patterns to the cause of Beethoven’s death. The latest example of this morbid scientific tradition can be found in the e-pages of this month’s PLOS Pathogens. The colorful cast of characters includes European geneticists, a handful of teeth, a 6th century plague, and the US Department of Homeland Security.

Although the word plague is often used as a synonym for disease, plague actually refers to a particular type of illness caused by the bacterium Yersinia pestis. Rampant infection by Y. pestis was responsible for a recent pandemic in the 19th to 20th centuries. Before that it caused the 14th to 17th century pandemic that included the epidemic known as the Black Death.

Yet the pestilence of pestis may have swept across human populations long before the Black Death. According to historical records, a terrible pandemic killed people from Asia to Africa to Europe between the 6th and 8th centuries. It struck the Roman Empire under the watch of Emperor Justinian I, who contracted the disease himself but survived. The pandemic now bears his name: the Justinianic Plague. But was Justinian’s malady really a plague or has history pinned the blame on the wrong bacterium? A group of researchers in Munich decided to find out.

How?

By digging up ancient graves, of course. And helping themselves to some teeth.

The ancient graves were in an Early Medieval cemetery called Aschheim in the German state of Bavaria. The site was a strange choice; the authors reveal in their paper that the historical record shows no evidence that the Justinianic Plague reached Bavaria. However, the site was conveniently located within driving distance of most of the study’s authors. (It’s always easiest to do your gravedigging closer to home.) The authors did have solid evidence that the graves were from the 6th century and that each grave contained two or more bodies (a common burial practice during deadly epidemics). In total, the group dug up 12 graves and collected teeth from 19 bodies.

The scientists took the teeth back to their labs and tested them for a stretch of DNA unique to Y. pestis. Their logic: if the individuals died from infection by Y. pestis, their remains should contain ample DNA from the bacteria. Of course, some of this DNA would have deteriorated over the course of 1.5 millennia. The scientists would have to make do with what they found. They used three different methods to amplify and detect the bacterial DNA, but they found a reliably large amount of it in the teeth of only one individual, a body they affectionately nicknamed A120. They genotyped the Y. pestis DNA found in A120 to see how the bacterial strain compared with other versions of the bacterium (including those that caused the Black Death and the 19th-20th century plague pandemic). The analysis showed that the Justinianic strain was an evolutionary precursor to the strain that caused the Black Death. Like the strains that sparked the second and third pandemics, this strain bore the genetic hallmarks of Y. pestis from Asia, suggesting that all three plague pandemics spread from the East.

The authors write that they have solved their historical mystery.

“These findings confirm that Y. pestis was the causative agent of the Justinianic Plague and should end the controversy over the etiological agent of the first plague pandemic.”

Ordinarily, the discussion sections of scientific papers are littered with qualifiers like “might be” and “suggestive.” Not so here, even though the authors’ conclusion explains a phenomenon that killed many millions of people worldwide based on data from the decomposing remains of a single person who lived in a region that historians haven’t connected with the pandemic. In most branches of science, sweeping conclusions can only be made from large and meticulously selected samples. In genetics, such rules can be swept aside. It is its own kind of magic. If you know how to read the code of life, you can peer into the distant past and divine real answers based on a handful of ancient teeth.

As it turns out, the study’s result is more than a cool addition to our knowledge of the Early Middle Ages. Plague would make a terrible weapon in the hands of a modern bioterrorist. That’s why the US Department of Homeland Security is listed as one of the funding sources for this study. So the next time you hear about your tax dollars hard at work, think of Bavarian graves, ancient teeth, and poor old A120.

_____

Photo credit: Dallas Krentzel


Harbeck M, Seifert L, Hansch S, Wagner DM, Birdsell D, Parise KL, Wiechmann I, Grupe G, Thomas A, Keim P, Zoller L, Bramanti B, Riehm JM, & Scholz HC (2013). Yersinia pestis DNA from skeletal remains from the 6th century reveals insights into the Justinianic Plague. PLOS Pathogens. DOI: 10.1371/journal.ppat.1003349

Locked Away

The results are in. The ultrasound was conclusive. And despite my previously described hunch that our growing baby was a boy, she turned out to be a girl. We are, of course, ecstatic. A healthy baby and a girl to boot! As everyone tells us, girls are simply more fun.

As I was reading in my pregnancy book the other day, I came across an interesting bit of trivia about baby girls. At this point in my pregnancy (nearly 6 months in), our baby’s ovaries contain all the eggs she’ll have for her entire life. As I mentioned in a prior post, the fact that a female fetus develops her lifetime supply of eggs in utero represents a remarkable transgenerational link. In essence, half of the genetic material that makes up my growing baby already existed inside my mother when she was pregnant. And now, inside me, exists half of the genetic material that will become all of the grandchildren I will ever have. This is the kind of link that seems to mix science and spirituality, that reminds us that, though we are a mere cluster of cells, there’s a poetry to the language of biology and Life.

But after stumbling upon this factoid about our baby’s eggs, I was also struck by a sense that somewhere someone seemed to have his or her priorities mixed up. If our baby were born today, she would have a slim chance of surviving. Her intestines, cerebral blood vessels, and retinas are immature and not ready for life outside the womb. Worse still, the only shot her lungs would have at functioning is with the aid of extreme medical intervention. The order of it all seems crazy. My baby is equipped with everything she’ll need to reproduce decades in the future, yet she lacks the lung development to make it five minutes in the outside world. What was biology thinking?

Then I remembered two delightful popular science books I’d read recently, The Red Queen by Matt Ridley and Life Ascending by Nick Lane. Both described the Red Queen Hypothesis of the evolution of sex, which states that the reason so much of the animal kingdom reproduces sexually (rather than just making clones of itself) is to ‘outwit’ parasites. In short, if each generation of humans were the same as the next, parasites large and microbial could evolve to overtake us. By mixing up our genetic makeup through sexual reproduction, we make it harder for illnesses to wipe us out. Like the Red Queen from Lewis Carroll’s classic, we keep running in order to stay in the same place (which is one step ahead of parasites and disease).

Just as there are parasitic organisms and bacteria, one might say that there are parasitic genes. For example, mutations in the DNA of our own replicating cells can cause cancer, which is essentially a self-made, genetic parasite. Moreover, retroviruses like HIV are essentially bits of genetic material that invade our bodies and can insert themselves into the DNA of our cells. And the ultimate road to immortality for a parasitic gene would be to hitch a ride on the back of reproduction. Imagine what an easy life that would be! If a retrovirus could invade the eggs in the ovaries, it would be passed on from one generation to the next without doing one iota of work. It’s the holy grail of parasitic invasion – get thee to the ovaries! According to Matt Ridley in another of his books, The Origins of Virtue, the human germ line is segregated from the rest of the growing embryo by 56 days after fertilization. Within two months of conception, the cells that will give rise to all of the embryo’s eggs (or sperm, in males) are already cordoned off. They are kept safe until they are needed many years in the future.

So perhaps my little baby’s development isn’t as backwards as it seemed at first. Yes, lungs are important. But when you’ve got something of value to others, it makes practical sense to hurry up and lock it away.

How the Giraffe Didn’t Get His Long Neck

It’s the early 19th century, before Darwin’s Origin of Species. Before Mendel’s peas and Watson and Crick’s double helix. Scientists are struggling with the big questions of inheritance and reproduction without the aid of modern scientific methods. In this vacuum of concrete information, odd theories gain traction – some based on racial or social agendas, others on intuition or supposition.

Lamarckism, or soft inheritance, was one of the more pervasive of these ideas. According to the theory, organisms can inherit acquired traits. In the days before Darwin’s evolutionary theory, Lamarckism helped explain why organisms were so well adapted to their environments. Take the example of the giraffe’s long neck. A giraffe of yore (when giraffes had shorter necks) had to stretch its neck to reach the luscious leaves further up on tree branches. All that stretching lengthened its neck a little, and this longer neck was passed on to its offspring, who in turn stretched their necks and sired offspring who could reach even higher and munch the choicest leaves. It went on like this until giraffes were tall enough that they didn’t have to strain to reach leaves anymore.

It was a neat explanation that appealed to many 19th century scientists; even Darwin occasionally made use of it. But the theory had a nasty side as well. People applied it to humans and used it to explain differences between races or socioeconomic classes, calling the phenomenon degeneration. The mental and physical effects of years spent boozing and behaving badly would be passed down from father to son to grandson, each successively worse than his predecessor as the collective sum of each reckless lifetime added up. There was a technical term for the poor souls who wound up literally inheriting the sins of their fathers: degenerates. Certain scientists (or pseudoscientists) of the era, such as Benedict Morel and Cesare Lombroso, used the ideas of soft inheritance and degeneration to explain how violence, poverty, and criminality were heritable and could be categorized and studied.

Lamarckism, in the hands of Morel and others, offered a credible explanation of why the son of an alcoholic was more likely to be an alcoholic himself. But it did so by implying that the poor, the miserable, the suffering were inherently inferior to those with better, healthier (and probably wealthier) lifestyles. The poor were genetically degenerate, and they had no one to blame but themselves.

Thank god, thank god, Lamarckism and its corollary, degeneration, were debunked. By the 20th century, scientists knew that inheritance didn’t work that way. Our genetic information isn’t changed by what we do during our lifetimes. Besides, our sex cells are segregated from the other cells in our bodies. We don’t descend from our mothers, subject to all the stresses, strains, and yes, even boozing that their brains and bodies may have experienced. Instead, we descend from their ovaries. And thankfully, those things are well protected.

Only there’s a catch. In the last few decades, we’ve learned that while Lamarckism isn’t correct, it isn’t entirely wrong either. We’ve learned this through the field of epigenetics (literally, above genetics). This burgeoning field has helped us understand why the causes of so many heritable diseases still elude us, nearly a decade after we sequenced the human genome. Epigenetics adds untold complexity to an already complex genome. Some of its mechanisms are transient, others last a lifetime, but they all regulate gene expression and are necessary for normal growth and development. Thanks to them, females inactivate one of their X chromosomes (so women don’t get a double dose of proteins from that set of genes). Epigenetic mechanisms also oversee cellular differentiation, the process by which embryonic cells containing identical genetic material become skin cells, hepatocytes, neurons, and every other diverse cell type in the human body.

It now appears that epigenetic factors play an enormous role in human health. And what we do in our lives, the choices we make, affect our epigenome. Exposure to chemicals, stressors, or dietary changes can cause long-lasting tags to sit on our DNA or chromatin, controlling which genes are read and transcribed into proteins. For example, chronic cocaine use causes lasting epigenetic changes in the nucleus accumbens, a brain area linked to addiction. These changes boost plasticity and drug-related gene expression, which in turn probably contribute to the reinforcing effects of the drug.

But that’s not all. Epigenetic effects can span generations. No, the hardships of your parents’ lifetimes aren’t literally passed on to you in a cumulative fashion, giving you that longer neck or boozier disposition that Lamarckism might predict. Nonetheless, what your parents (and even your grandmother) did before you were born can be affecting your epigenome today.

It’s pretty wild stuff. Even if you’ve never met your maternal grandmother, even if she died long before your birth, her experiences and behavior could be affecting your health. First of all, the prenatal environment your mother experienced can have epigenetic effects on her that then propagate on to the next generation (you). Moreover, all the eggs a female will ever make have already formed in her ovaries by the time she’s born. They may not be mature, but they are there, DNA and all. I think that’s a pretty amazing transgenerational link. It means that half the strands of DNA that wound up becoming you were initially made inside your grandmother’s body. As science reveals the power of the prenatal environment, evidence is mounting that even what your grandmother ate during your mother’s gestational period and whether she suffered hardships like famine can alter your own risk for heart disease or diabetes.

Luckily, epigenetic gene regulation is softer and less absolute than its cousin Lamarckism. It is reversible and it can’t accumulate, generation upon generation, to create a degenerate class. The science of today is more humane than the old guys predicted, but it doesn’t let us off the hook. Epigenetics should remind us that we must be thoughtful in how we live. Our choices matter, for ourselves and for our offspring. We don’t yet understand how epigenetic mechanisms control our health and longevity, but that isn’t stopping our bodies from making us pay for what we do now.