Modernity, Madness, and the History of Neuroscience


I recently read a wonderful piece in Aeon Magazine about how technology shapes psychotic delusions. As the author, Mike Jay, explains:

Persecutory delusions, for example, can be found throughout history and across cultures; but within this category a desert nomad is more likely to believe that he is being buried alive in sand by a djinn, and an urban American that he has been implanted with a microchip and is being monitored by the CIA.

While delusional people of the past may have fretted over spirits, witches, demons and ghouls, today they often worry about wireless signals controlling their minds or hidden cameras recording their lives for a reality TV show. Indeed, reality TV is ubiquitous in our culture, and experiments in remote mind control (albeit on a limited scale) have recently been popping up in the news. As psychiatrist Joel Gold of NYU and philosopher Ian Gold of McGill University wrote in 2012: “For an illness that is often characterized as a break with reality, psychosis keeps remarkably up to date.”

Whatever the time or the place, new technologies are pervasive and salient. They are on the tips of our tongues and, eventually, at the tips of our fingers. Psychotic or not, we are all captivated by technological advances. They provide us with new analogies and new ways of explaining the all-but-unexplainable. And where else do we attempt to explain the mysteries of the world, if not through science?

As I read Jay’s piece on psychosis, it struck me that science has historically had the same habit of co-opting modern technologies for explanatory purposes. In the case of neuroscience, scientists and physicians across cultures and ages have invoked the innovations of their day to explain the mind’s mysteries. For instance, the science of antiquity was rooted in the physical properties of matter and the mechanical interactions between objects. Around the 7th century BC, empires began constructing great aqueducts to bring water to their growing cities. The engineering challenge of the day was to control and guide the flow of water across vast distances. It was in this scientific milieu that the ancient Greeks devised a model for the workings of the mind. They believed that a person’s thoughts, feelings, intellect and soul were physical stuff: specifically, an invisible, weightless fluid called psychic pneuma. Around 200 AD, Galen, a physician and scientist of the Roman Empire (itself known for its masterful aqueducts), revised and clarified the theory. He believed that pneuma fills cavities in the brain called ventricles and circulates through white matter pathways in the brain and nerves in the body, just as water flows through a tube. As psychic pneuma traveled throughout the body, it carried sensation and movement to the extremities. Although the idea may sound farfetched to us today, this model of the brain persisted for more than a millennium and influenced Renaissance thinkers including Descartes.

By the 18th century, however, the science world was abuzz with two strange new forces: electricity and magnetism. At the same time, physicians and anatomists began to think of the brain itself as the stuff that gives rise to thought and feeling, rather than a maze of vats and tunnels that move fluid around. In the 1790s, Luigi Galvani’s experiments zapping frog legs showed that nerves communicate with muscles using electricity. So in the 19th century, just as inventors were harnessing electricity to run motors and light up the darkness, scientists reconceived the brain as an organ of electricity. It was a wise innovation, and one supported by experiments, but it was also driven by the technical advances of the day.

Science was revolutionized once again with the advent of modern computers in the 1940s and ’50s. The new technology sparked a surge of research and theories that used the computer as an analogy for the brain. Psychologists began to treat mental events like computer processes, which can be broken up and analyzed as a set of discrete steps. They equated brain areas to processors and neural activity in those areas to the computations carried out by computers. Just as computers rule our modern technological world, this way of thinking about the brain still profoundly influences how neuroscience and psychology research is carried out and interpreted. Today, some labs cut out the middleman (the brain) entirely: results from computer models of the brain are regularly published in neuroscience journals, sometimes without any data from an actual physical brain.
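To make that concrete, here is a minimal sketch of the kind of model this paradigm produces: a leaky integrate-and-fire neuron, one of the simplest computational stand-ins for a nerve cell. Every parameter value below is an illustrative, textbook-style choice, not a measurement from any actual neuron.

```python
# A minimal leaky integrate-and-fire neuron, one of the simplest
# "computer models of the brain." All parameter values are
# illustrative choices, not measurements from real tissue.

dt = 0.1           # time step (ms)
t_max = 100.0      # total simulated time (ms)
tau = 10.0         # membrane time constant (ms)
v_rest = -65.0     # resting potential (mV)
v_thresh = -50.0   # spike threshold (mV)
v_reset = -70.0    # reset potential after a spike (mV)
r_m = 10.0         # membrane resistance (megaohms)
i_ext = 2.0        # constant injected current (nA)

v = v_rest
spike_times = []

for step in range(int(t_max / dt)):
    # The potential leaks back toward rest while the input current drives it up.
    dv = (-(v - v_rest) + r_m * i_ext) / tau
    v += dv * dt
    if v >= v_thresh:                        # threshold crossed: a "spike"...
        spike_times.append(round(step * dt, 1))
        v = v_reset                          # ...then reset, mimicking repolarization

print(f"{len(spike_times)} spikes in {t_max:.0f} ms")
print("first spikes at (ms):", spike_times[:5])
```

Nothing in the loop touches biological data; the “neuron” exists only as a difference equation stepped forward in time, which is precisely the sense in which the brain has been recast as a computation.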

I’m sure there are other examples from the history of neuroscience in general and certainly from the history of science as a whole. Please comment and share any other ways that technology has shaped the models, themes, and analogies of science!

Additional sources:

Crivellato E & Ribatti D (2007) Soul, mind, brain: Greek philosophy and the birth of neuroscience. Brain Research Bulletin 71:327-336.

Karenberg A (2009) Cerebral localization in the eighteenth century: an overview. Journal of the History of the Neurosciences 18:248-253.

_________

Photo Credit: dominiqueb on Flickr, available through Creative Commons

12 responses

  1. I’m a counselor and I work with kids who have ADHD, Autism, and other neurologically based issues. I use a computer analogy all the time to explain concepts like working memory and processing speed to parents I work with. Great post! Keep up the good work!

  2. Pingback: The Metaphors Of The Mind | The Penn Ave Post

  3. About thirty years ago, I wrote the following paragraph in a book about bioethical decision making:
    “Scientists have a long history of explaining human bodily functions in terms of state-of-the-art technology. In fact, one major reason William Harvey was able to succeed where all others failed was the recent invention of the pump. By drawing an analogy between mechanical pumping systems and the human circulatory system, Harvey was finally able to explain how the blood was propelled through the body. In a like manner, modern scientists are tempted to explain the workings of the human brain by analogy to the computer (often referred to as an electronic brain). Although the analogy is far from straightforward, the dominant view today is that the brain controls bodily functions in much the same way that a computer controls an automated factory. So it was only natural to look to the brain (the master controller) for a new definition of death.”

    Harvey’s analogy of the heart to the pump is almost perfect. Not so with the brain and the computer. In the past 30 years, neuroscience has learned so much about the brain that the analogy now seems to do as much to confound as to enhance any real understanding.

    • “In the past 30 years, neuroscience has learned so much about the brain that the analogy now seems to do as much to confound as to enhance any real understanding.”

      Oh? Care to provide any specific examples of this?

  4. Given that this blog’s convener is a neuroscientist, I don’t want to get too far into the weeds, but I’d suggest three broad areas of research which suggest that the digital computer may not be the best model of our brain/mind:

    1. The phenomenon of consciousness. One good place to start is Searle’s Chinese Room argument and the surrounding discussion. http://plato.stanford.edu/entries/chinese-room/ Is consciousness simply an emergent property once the computer gets complex enough?

    2. The digital computer is really a massive array of switches that can be on/off. That’s how electricity works. But we know that the brain is not just electrical, it also uses different biochemical neurotransmitters to transmit information/bridge synapses. As such, the brain is capable of degrees of on/off, acting more analog than digital. It’s difficult to understand how the current computer architecture can be very useful in explaining these aspects of brain operation.

    3. Recent research reveals the tremendous plasticity of the brain: information seems to be stored in multiple places, and the brain seems to have the capacity to repair itself if one area is damaged. Some writers have referred to this as the holographic brain. Compare this to a digital computer: if a sector of the hard drive is destroyed, it’s adios to the data. It’s not just that one is alive and the other dead; it seems that our brains use and store information in a vastly different way than the digital computer does.

    • 1. The Chinese room thing is sort of terrible, actually, and the questions you’re asking will be resolved empirically in good time by using the computer paradigm. Philosophers who make a big deal out of consciousness are out of their depth.

      2. This is actually completely wrong. Electricity doesn’t “work” in either 1 or 0. You can have degrees of electricity. Also, computers don’t necessarily work in a strictly binary fashion, either. And, of course, there is such a thing as a chemical computer. So, again, everything here is wrong.

      3. It’s not like we can’t build this sort of computer in theory, we just haven’t bothered to do so yet in reality because there’s been no need. Redundancy, plasticity – none of that stuff is impossible for computers. What you’re talking about would be a fun challenge for a data storage specialist, but it’s hardly fatal.

      Basically, from what you’re saying it looks like you’re working with a very naive and simplistic concept of what counts as “the digital computer.” If that’s your point – i.e., if you’re saying that computer language leads us to think naively and simplistically – then, well, if nothing else you’ve proven your point. But I think it would be far more useful to respond instead by taking a more nuanced, advanced, technically literate position on what a “computer” is.

      • Analogies are never a perfect fit; if they were they’d be duplicates instead. The analogy of a computer (at least as we know it in its current form) doesn’t align with every aspect of the human brain. Still, I think on the whole it has advanced (and will continue to advance) neuroscience a great deal. Most scientists are smart enough to know that all analogies have their limitations and apply them with care.

      • Right, precisely. The analogy between a heart and a mechanical pump isn’t a perfect analogy, either, for any number of reasons, but people establish the limits of the analogy by looking carefully at each side. I’m heartened that some people are already doing that with the brain/computer thing, and I rather hope that their numbers grow.


  5. No analogy is perfect – and that’s a truth. That said, I recently read a book called “The Information: A History, a Theory, a Flood” by James Gleick, which is what pushed me to comment on your excellent article.

    Information theory lies at the heart of modern neuroscience because, as Gleick points out in his must-read book, the *concept* of the modern computer preceded its invention. The conceptual computer, whether Babbage’s Analytical Engine or Turing’s Universal Machine, was first and foremost a processor of information. This fundamental rooting in information theory is what makes computers what they are – information processors – and it defines mind as another information processor, and therefore as analogous to the computer. The very idea of “information” as a measurable entity (measured in “bits” and bytes) was a revolutionary concept, one that moved science away from thinking about the universe as a collection of atoms moving about and toward a place where the corporeal and non-corporeal begin to merge in the realms of quantum theory.

    For psychiatry, and for the study of mind, this concept was no less revolutionary. Information theory led us to think of mind and consciousness as emergent structures built, first and foremost, out of information. Before information theory, the mind was either explained away by a soul/body duality or treated as a manifestation of chemical interactions and physical processes that could be explained entirely by the paradigms you mention – fluid dynamics for the Graeco-Romans, electricity for the ‘moderns’.

    I’d submit, therefore, that the analogy of mind to “computer” is less about actual computers than about the abstract computer Turing had imagined: one that is first and foremost an information-processing entity, itself emerging from ‘memorized’ information… A toy example of information as something countable is sketched below.
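    To make that measurability concrete, here is a minimal sketch in Python of Shannon’s entropy, the quantity that made information countable in bits. The probability distributions are invented purely for illustration.

    ```python
    # Shannon entropy: H = -sum(p * log2(p)) gives the average number
    # of bits per symbol a source produces. The distributions below
    # are invented purely for illustration.

    from math import log2

    def entropy(probs):
        """Average information content of a source, in bits per symbol."""
        return -sum(p * log2(p) for p in probs if p > 0)

    fair_coin = [0.5, 0.5]      # maximally uncertain: 1 bit per flip
    biased_coin = [0.9, 0.1]    # more predictable, so fewer bits per flip

    print(f"fair coin:   {entropy(fair_coin):.3f} bits/flip")    # 1.000
    print(f"biased coin: {entropy(biased_coin):.3f} bits/flip")  # ~0.469
    ```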

    • Thank you for this lovely and insightful comment! I think you are absolutely right that it’s the idea of quantized information, and of discrete transformations of that information, rather than the physical computer itself that lies at the heart of the cognitive revolution and of modern thinking in neuroscience. That said, in certain circles, especially among those studying the physical layout of areas in the brain, some have started using the analogy of the physical computer, in which placing components according to function is crucial to minimize wiring and keep the computer (or brain) from becoming unreasonably large.

      Thanks again for your insightful comment. Please keep reading and sharing your thoughts!
