Perfect Pitch Redux


I can just hear the advertisement now.

Do you have perfect pitch? Would you like to? Then Depakote might be right for you . . .

Perfect pitch is the ability to name or produce a musical note without a reference note. While most children presumably have the capacity to learn perfect pitch, only about one in ten thousand adults can actually do it. That’s because developing the skill requires extensive musical training in early childhood. Most adults with perfect pitch began studying music at six years of age or younger. By the time children turn nine, their window to learn perfect pitch has already closed. They may yet blossom into wonderful musicians, but they will never be able to count perfect pitch among their talents.

Or might they after all?

Well no, probably not. But a new study, published in Frontiers in Systems Neuroscience, has opened the door to such questions. Its authors tested how young men learned to name notes when they were on or off a drug called valproate (brand name: Depakote). Valproate is widely used to treat epilepsy and bipolar disorder. It’s part of a class of drugs called histone deacetylase (HDAC) inhibitors, which fiddle with how DNA is stored and alter how genes are read out and translated into proteins.

The intricacies of how HDAC inhibitors affect gene expression and how those changes reduce seizures and mania are still up in the air. But while some scientists have been working those details out, others have been noticing that HDAC inhibitors help old mice learn new tricks. These drugs allow adult mice to adapt to visual and auditory changes in ways that are only otherwise possible for juvenile mice. In other words, HDAC inhibitors allowed mice to learn things beyond the typical window, or critical period, in which the brain is capable of that specific type of learning.

Judit Gervain, Allan Young, and the other authors of the current study set out to test whether HDAC inhibitors can reopen a learning window in humans as well. They randomly assigned their young male subjects to take valproate for either the first or the second half of the study. (Although I usually get my hackles up about the exclusion of female participants from biomedical studies, I understand their reason for doing so in this case. Valproate can cause severe birth defects. By testing men, the authors could be one hundred percent certain that their participants weren’t pregnant.) The subjects took valproate for one half of the study and a placebo for the other half . . . and of course they weren’t told which was which.

During the first half of the study, the researchers trained twenty-four participants to identify six pitch classes. Instead of teaching them the formal names of these pitches in the twelve-tone musical system, they assigned a proper name to each one (e.g., Eric, Rachel, or Francine), telling the participants that each name belonged to a person who plays only one pitch class. The participants received this training online for up to ten minutes daily for seven days. During the second half of the study, eighteen of the same subjects underwent the same training with six new pitch classes and names. At the end of each seven-day training session, they heard the six pitch classes one at a time and, for each, answered the question: “Who played that note?”


Study results showing better performance at naming tones for participants on valproate in the first half of the experiment. From Gervain et al., 2013

The results? There was a whopping effect of treatment on performance in the first half of the study. The young men on valproate did significantly better than the men on placebo. That’s particularly impressive because the participants received so little training: the online sessions totaled at most seventy minutes, and some of the participants didn’t even complete all seven of them.

As cool as the main finding is, there are some odd aspects to the study. As you can see from the figure, the second half of the experiment (after the treatments were switched) doesn’t show the same result as the first. Here, participants on valproate perform no differently from those on placebo. The authors suggest that the training in the first half of the experiment interfered with learning in the second half – a plausible explanation (and one they might have predicted in advance). Still, at this point we can’t tell if we are looking at a case of proactive interference or a failure to replicate results. Only time and future experiments will tell.

There were two other odd aspects of the study that caught my eye. The authors used synthesized piano tones instead of pure tones because the former have additional cues, like timbre, that help people without perfect pitch complete the task. They also taught the participants to associate each note with the name of the person who supposedly plays it rather than the name of the actual note or some abstract stand-in identifier. Both choices make it easier for the participants to perform well on the task but call into question how similar the participants’ learning is to the specific phenomenon of perfect pitch. Perhaps the subjects on valproate in the first half of the experiment were relying on different cues (e.g., timbre instead of frequency). Likewise, associating the proper names of people with notes may help subjects learn precisely because it recruits social processes and networks that people with perfect pitch don’t use for the task. If these social processes don’t have a critical period like perfect pitch judgment does, well then valproate might be boosting a very different kind of learning.

As the authors themselves point out, this small study is merely a “proof-of-concept,” albeit a dramatic one. It is not meant to be the final word on the subject. Still, I am curious to see where this leads. Might valproate’s success with seizures and mania have something to do with its ability to trigger new learning? And if HDAC inhibitors do alter the brain’s ability to learn skills that are typically crystallized by adulthood, how has that affected the millions of adults who have been taking these drugs for years? Yet again, only time and science will tell.

I, for one, will be waiting to hear what they have to say.

_______

Photo credit: Brandon Giesbrecht on Flickr, used via Creative Commons license

Gervain J, Vines BW, Chen LM, Seo RJ, Hensch TK, Werker JF, & Young AH (2013). Valproate reopens critical-period learning of absolute pitch. Frontiers in Systems Neuroscience, 7. PMID: 24348349

Looking Schizophrenia in the Eye

More than a century ago, scientists discovered something unusual about how people with schizophrenia move their eyes. The men, psychologist and inventor Raymond Dodge and psychiatrist Allen Diefendorf, were trying out one of Dodge’s inventions: an early incarnation of the modern eye tracker. When they used it on psychiatric patients, they found that most of their subjects with schizophrenia had a funny way of following a moving object with their eyes.

When a healthy person watches a smoothly moving object (say, an airplane crossing the sky), she tracks the plane with a smooth, continuous eye movement to match its displacement. This action is called smooth pursuit. But smooth pursuit isn’t smooth for most patients with schizophrenia. Their eyes often fall behind and they make a series of quick, tiny jerks to catch up or even dart ahead of their target. For the better part of a century, this movement pattern would remain a mystery. But in recent decades, scientific discoveries have led to a better understanding of smooth pursuit eye movements – both in health and in disease.

Scientists now know that smooth pursuit involves a lot more than simply moving your eyes. To illustrate, let’s say a sexy jogger catches your eye on the street. When you first see the runner, your eyes are stationary and his or her image is moving across your retinas at some relatively constant rate. Your visual system (in particular, your visual motion-processing area MT) must first determine this rate. Then your eyes can move to catch up with the target and match its speed. If you do this well, the jogger’s image will no longer be moving relative to your retinas. From your visual system’s perspective, the jogger is running in place and his or her surroundings are moving instead. From both visual cues and signals about your eye movements, your brain can predict where the jogger is headed and keep moving your eyes at just the right speed to keep pace.
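The catch-up dynamic described above can be caricatured as a simple feedback loop. Here is a toy sketch of my own (an illustration only, not a model of the actual neural circuitry): the “brain” estimates retinal slip, the difference between the target’s velocity and the eye’s, and nudges eye velocity toward the target’s a little each time step.

```python
# Toy simulation of smooth pursuit as a feedback loop (illustrative only).
# Each step, the visual system measures retinal slip (how fast the target's
# image drifts across the retina) and adjusts eye velocity to reduce it.

def simulate_pursuit(target_velocity=10.0, gain=0.3, steps=30):
    """Return eye velocity over time as it converges on the target's velocity."""
    eye_velocity = 0.0
    history = []
    for _ in range(steps):
        retinal_slip = target_velocity - eye_velocity  # motion across the retina
        eye_velocity += gain * retinal_slip            # catch-up adjustment
        history.append(eye_velocity)
    return history

velocities = simulate_pursuit()
# Early on the eye lags the target; after enough steps the slip is near zero.
print(round(velocities[0], 2), round(velocities[-1], 2))
```

With a gain below one, the simulated eye starts out lagging the target, which is exactly the situation in which a real visual system triggers corrective catch-up saccades, and then gradually closes the gap until the target appears stationary on the retina.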

Although the smooth pursuit abnormalities in schizophrenia may sound like a movement problem, they appear to reflect a problem with perception. Sensitive visual tests show that motion perception is disrupted in many patients. They can’t tell the difference between the speeds of two objects or integrate complex motion information as well as healthy controls. A functional MRI study helped explain why. The study found that people with schizophrenia activated their motion-processing area MT less than controls while doing motion-processing tasks. The next logical question – why MT doesn’t work as well for patients – remains unanswered for now.

In my last two posts I wrote about how delusions can develop in healthy people who don’t suffer from psychosis. The same is true of not-so-smooth pursuit. In particular, healthy relatives of patients with schizophrenia tend to have jerkier pursuit movements than subjects without a family history of the illness. They are also impaired at some of the same motion-processing tests that stymie patients. This pattern, along with the results of twin studies, suggests that smooth pursuit dysfunction is inherited. Following up on this idea, two studies have compared subjects’ genotypes with the inheritance patterns of smooth pursuit problems within families. While they couldn’t identify exactly which gene was involved (a limitation of the technique), they both tracked the culprit gene to the same genetic neighborhood on the sixth chromosome.

Despite this progress, the tale of smooth pursuit in schizophrenia is more complex than it appears. For one, there’s evidence that smooth pursuit problems differ for patients with different forms of the disorder. Patients with negative symptoms (like social withdrawal or no outward signs of emotion) may have problems with the first step of smooth pursuit: judging the target’s speed and moving their eyes to catch up. Meanwhile, those with more positive symptoms (like delusions or hallucinations) may have more trouble with the second step: predicting the future movement of the target and keeping pace with their eyes.

It’s also unclear exactly how common these problems are among patients; depending on the study, as many as 95% or as few as 12% of patients may have disrupted smooth pursuit. The studies that found the highest rates of smooth pursuit dysfunction in patients also found the problems in as many as 19% of healthy controls. These differences may boil down to the details of how the eye movements were measured in the different experiments. Still, the studies all agreed that people with schizophrenia are far more likely to have smooth pursuit problems than healthy controls. What the studies don’t agree on is how specific these problems are to schizophrenia compared with other psychiatric illnesses. Some studies have found smooth pursuit abnormalities in patients with bipolar disorder and major depression as well as in their close relatives; other studies have not.

Despite these messy issues, a group of scientists at the University of Aberdeen in Scotland recently tried to tell whether subjects had schizophrenia based on their eye movements alone. In addition to smooth pursuit, they used two other measures: the subject’s ability to fix her gaze on a stable target and how she looked at pictures of complex scenes. Most patients have trouble holding their eyes still in the presence of distractors and, when shown a meaningful picture, they tend to look at fewer objects or features in the scene.

Taking the results from all three measures into account, the group could distinguish between a new set of patients with schizophrenia and new healthy controls with an accuracy of 87.8%. While this rate is high, keep in mind that the scientists removed real-world messiness by selecting controls without other psychiatric illnesses or close relatives with psychosis. This makes their demonstration a lot less impressive – and a lot less useful in the real world. I don’t think this method will ever become a viable alternative to diagnosing schizophrenia based on clinical symptoms, but the approach may hold promise in a similar vein: identifying young people who are at risk for developing the illness. Finding these individuals and helping them sooner could truly mean the difference between life and death.

_____

Photo credit: Travis Nep Smith on Flickr, used via Creative Commons License

Benson PJ, Beedie SA, Shephard E, Giegling I, Rujescu D, & St Clair D (2012). Simple viewing tests can detect eye movement abnormalities that distinguish schizophrenia cases from controls with exceptional accuracy. Biological Psychiatry, 72(9), 716-724. PMID: 22621999

The Demise of the Expert

These days, I find myself turning off the news while thinking the same question. When did we stop valuing knowledge and expertise? When did impressive academic credentials become a political liability? When did the medical advice of celebrities like Jenny McCarthy and Ricki Lake become more trusted than that of government safety panels, scientists, and physicians? When did running a small business or being a soccer mom qualify a person to hold the office of president and make economic and foreign policy decisions?

As Rick Perry, the Republican front-runner for president, recently told us, “You don’t have to have a PhD in economics from Harvard to really understand how to get America back working again.” Really? Why not? It certainly seems to me that some formal training would help. And yet many in Congress pooh-poohed economists’ warnings about the importance of raising the debt ceiling and have insisted on decreasing regulations despite the evidence that this won’t help to improve our economy (and will further harm our environment). Meanwhile, man-made climate change is already affecting our planet. Natural disasters such as droughts and hurricanes are on the rise, just as scientists predicted. But we were slow to accept their warnings and have been slow to enact any meaningful policies to stem the course of this calamity.

The devaluation of expertise is puzzling enough, but perhaps more puzzling still is the timing. Never before in human history have we witnessed the fruits of expertise as we do today. Thanks to scientists and engineers, we rely on cell phones that wirelessly connect us to the very person we want to talk to at the moment we want to talk. In turn, these cell phones operate through satellites that nameless experts have set spinning in precise orbits around Earth. We keep in touch with friends, do our banking and bill-paying, and make major purchases using software written in codes we don’t understand and transmitted over a network whose very essence we struggle to comprehend. (I mean, what exactly is the Internet?) Meanwhile, physicians use lasers to excise tumors and correct poor vision. They replace damaged livers and hearts. They fit amputees with high-tech artificial limbs, some with feet that flex and hands that grasp.

Obviously none of this would have been possible without experts. You need more than high school math and a casual grasp of physics or anatomy to develop these complex systems, tools, and techniques. So why on Earth would we discount experts now, when we have more proof than ever of their worth?

My only guess is education. Our national public education system is in shambles. American children rank 25th globally in math and 21st in science. At least two-thirds of American students cannot read at grade level. But there is something our students score high on. As the documentary Waiting for Superman highlighted, American students rank number one in confidence. This may stem from the can-do culture of the United States or from the success our nation has enjoyed over the last 65 years. But it makes for a dangerous combination. We are churning out students with inadequate knowledge and skills, but who believe they can intuit and accomplish anything. And if you believe that, then why not believe you know better than the experts?

I think the only remedy for this situation is better education, but not for the reasons you might think. In my opinion, the more a person learns about any given academic subject, the more realistic and targeted his or her self-confidence becomes.

The analogy that comes to mind is of a blind man trying to climb a tree. When he’s still at the base of the tree, all he can feel is the trunk. From there, he has little sense of the size or shape of the rest of the tree. But suppose he climbs up on a limb and then out to even smaller branches. He still won’t know the shape of the rest of the tree, but from his perch on one branch, he can feel the extensive foliage. He’ll know that the tree must be large and he can presume that the other branches are equally long and intricate. He can appreciate how very much there must be of the tree that’s beyond his reach.

I think the same principle applies to knowledge. The more we know, the more we can appreciate how much else there is out there to know – things about which we haven’t got a clue. As we climb out on our tiny branches, acquiring knowledge, we also gain an awareness of our profound ignorance. Unfortunately, many of America’s children (and by now, adults too) aren’t climbing the tree at all; they’re still lounging at the base, enjoying a picnic in the shade.

Should it surprise us, then, to learn that they don’t see the value in expertise? That they can support political candidates who disparage the advice of specialists and depict academic achievement as a form of elitism? Why shouldn’t they trust the advice of a neighbor, a talk show host, or an actor over the warnings of the ‘educated elite’?

No single person can know everything there is to know in today’s world, so the sum of human knowledge must be dispersed among millions of specialized experts. Human progress relies on these people, dangling from their obscure little branches, to help guide our technology, our public policy, our research and governance. Our world has no shortage of experts. Now if only people would start listening to them.

How the Giraffe Didn’t Get His Long Neck

It’s the early 19th century, before Darwin’s Origin of Species. Before Mendel’s peas and Watson and Crick’s double helix. Scientists are struggling with the big questions of inheritance and reproduction without the aid of modern scientific methods. In this vacuum of concrete information, odd theories gain traction – some based on racial or social agendas, others on intuition or supposition.

Lamarckism, or soft inheritance, was one of the more pervasive of these ideas. According to the theory, organisms can inherit acquired traits. In the days before Darwin’s evolutionary theory, Lamarckism helped explain why organisms were so well adapted to their environments. Take the example of the giraffe’s long neck. A giraffe of yore (when giraffes had shorter necks) had to stretch its neck to reach the luscious leaves further up on tree branches. All that stretching lengthened its neck a little, and this longer neck was passed on to its offspring, who in turn stretched their necks and sired offspring who could reach even higher and munch the choicest leaves. It went on like this until giraffes were tall enough that they didn’t have to strain to reach leaves anymore.

It was a neat explanation that appealed to many 19th-century scientists; even Darwin occasionally made use of it. But the theory had a nasty side as well. People applied it to humans and used it to explain differences between races or socioeconomic classes, calling the phenomenon degeneration. The mental and physical effects of years spent boozing and behaving badly would be passed down from father to son to grandson, each successively worse than his predecessor as the collective sum of each reckless lifetime added up. There was a technical term for the poor souls who wound up literally inheriting the sins of their fathers: degenerates. Certain scientists (or pseudoscientists) of the era, such as Benedict Morel and Cesare Lombroso, used the ideas of soft inheritance and degeneration to explain how violence, poverty, and criminality were heritable and could be categorized and studied.

Lamarckism, in the hands of Morel and others, offered a credible explanation of why the son of an alcoholic was more likely to be an alcoholic himself. But it did so by implying that the poor, the miserable, the suffering were inherently inferior to those with better, healthier (and probably wealthier) lifestyles. The poor were genetically degenerate, and they had no one to blame but themselves.

Thank god, thank god, Lamarckism and its corollary, degeneration, were debunked. By the 20th century, scientists knew that inheritance didn’t work that way. Our genetic information isn’t changed by what we do during our lifetimes. Besides, our sex cells are segregated from the other cells in our bodies. We don’t descend from our mothers, subject to all the stresses, strains, and yes, even boozing that their brains and bodies may have experienced. Instead, we descend from their ovaries. And thankfully, those things are well protected.

Only there’s a catch. In the last few decades, we’ve learned that while Lamarckism isn’t correct, it isn’t entirely wrong either. We’ve learned this through the field of epigenetics (literally, above genetics). This burgeoning field has helped us understand why the causes of so many heritable diseases still elude us, nearly a decade after we sequenced the human genome. Epigenetics adds untold complexity to an already complex genome. Some of its mechanisms are transient, others last a lifetime, but they all regulate gene expression and are necessary for normal growth and development. Thanks to them, females inactivate one of their X chromosomes (so women don’t get a double dose of proteins from that set of genes). Epigenetic mechanisms also oversee cellular differentiation, the process by which embryonic cells containing identical genetic material become skin cells, hepatocytes, neurons, and every other diverse cell type in the human body.

It now appears that epigenetic factors play an enormous role in human health. And what we do in our lives, the choices we make, affect our epigenome. Exposure to chemicals, stressors, or dietary changes can cause long-lasting tags to sit on our DNA or chromatin, controlling which genes are read and transcribed into proteins. For example, chronic cocaine use causes lasting epigenetic changes in the nucleus accumbens, a brain area linked to addiction. These changes boost plasticity and drug-related gene expression, which in turn probably contribute to the reinforcing effects of the drug.

But that’s not all. Epigenetic effects can span generations. No, the hardships of your parents’ lifetimes aren’t literally passed on to you in a cumulative fashion, giving you that longer neck or boozier disposition that Lamarckism might predict. Nonetheless, what your parents (and even your grandmother) did before you were born can be affecting your epigenome today.

It’s pretty wild stuff. Even if you’ve never met your maternal grandmother, even if she died long before your birth, her experiences and behavior could be affecting your health. First of all, the prenatal environment your mother experienced can have epigenetic effects on her that then propagate on to the next generation (you). Moreover, all the eggs a female will ever make have already formed in her ovaries by the time she’s born. They may not be mature, but they are there, DNA and all. I think that’s a pretty amazing transgenerational link. It means that half the strands of DNA that wound up becoming you were initially made inside your grandmother’s body. As science reveals the power of the prenatal environment, evidence is mounting that even what your grandmother ate during your mother’s gestational period and whether she suffered hardships like famine can alter your own risk for heart disease or diabetes.

Luckily, epigenetic gene regulation is softer and less absolute than its cousin Lamarckism. It is reversible and it can’t accumulate, generation upon generation, to create a degenerate class. The science of today is more humane than the old guys predicted, but it doesn’t let us off the hook. Epigenetics should remind us that we must be thoughtful in how we live. Our choices matter, for ourselves and for our offspring. We don’t yet understand how epigenetic mechanisms control our health and longevity, but that isn’t stopping our bodies from making us pay for what we do now.

West Side Story 2: Neurologists And Psychiatrists Rumble

If something’s wrong with your brain, should you see a psychiatrist or a neurologist? The answer to that question depends on whether modern medicine can tell you why you’re sick.

Here’s the simplified breakdown these days: people suffering from neurodegeneration, stroke, or traumatic brain damage see neurologists. People with anxiety disorders, depression, or psychosis see psychiatrists.

Fine, but how was this distinction made? Is it based on the symptoms of the illnesses?

We tend to think that neurologists see patients with motor (e.g., Huntington’s) or cognitive (e.g., Alzheimer’s) impairments, yet psychiatric illnesses can include motor and cognitive symptoms (e.g., schizophrenia).

And we tend to think that psychiatrists see people with emotional problems, yet neurological illnesses can have emotional symptoms. For example, depression often precedes motor problems in Parkinson’s disease.

No, the distinction is based on what we know and can see. Neurology covers observable or testable brain diseases (although sometimes the pathology can’t be observed until after death). Neurologists can locate the damaged tissue from a stroke, the aberrant neural activity characteristic of epilepsy, or the subcellular aggregate proteins that indicate certain types of degeneration. This is considered real medicine, based on hard science. And when the causes, or at least the pathology, of an illness are known, medicine can more systematically diagnose and treat it.

That’s what neurologists have to work with. The leftovers go to psychiatrists. These are the illnesses with no overt physical pathology. Even though something is clearly wrong with a psychotic individual, we don’t know exactly how or where in the brain to look for definitive evidence, and not knowing means we’re in the dark on treatment. We use the drugs that work without knowing why they do.

In the course of my interactions with both scientists and physicians, many have expressed the view that the field of psychiatry is essentially voodoo. They say that psychiatrists throw drugs at patients without rhyme or reason, and that researchers of mental illnesses search in vain for the causes of loosely classified, ill-defined disorders.

Clearly this assessment isn’t fair. Psychiatry is left with only those diseases we don’t understand and can’t see. Thus, treatment is a guessing game. Then, when psychiatrists step up and try to help the millions of suffering patients, they’re pooh-poohed for the messy, hand-waving black magic of it all.

Once scientists find reliable physiological markers for mental illnesses, psychiatrists can have their laugh. That is, until the neurologists take their patients.

A New Pair Of Eyes

Transplants, transplants, everywhere.

This is an amazing time in modern medicine. It seems like nearly anything a dead person can have and a living person can need is transplantable – even faces, as we’ve seen of late.

We hear a lot about heart, liver, and kidney transplants, but much less is said about the most common type of transplant in the U.S. (about 40,000 per year).

Corneal transplantation is relatively easy and safe. Since the cornea, unlike the liver or heart, has no blood supply, there’s no need to match donors with recipients; any cornea can go to any patient who needs one. And because the surgery is relatively noninvasive, it can be done as an outpatient procedure. Patients spend a few days with an eye patch, then they’re good to go.

It’s such a minor, common surgery that I hadn’t heard of it until my dad became a donor. A few months after his death, my family received a card in the mail. It said that a woman in Oak Park could now see because of my father’s donation.

I think about that often. Even though he’ll never see the world again, he’s given her a window so that she can see.

Weird science? Yes. But beautiful.
