Eyes Wide Shut


In the middle of the 20th century, experimental psychologists began to notice a strange interaction between human vision and time. If they showed people flashes of light close together in time, subjects experienced the flashes as if they all occurred simultaneously. When they asked people to detect faint images, the speed of their subjects’ responses waxed and waned according to a mysterious but predictable rhythm. Taken together, the results pointed to one conclusion: that human vision operates within a particular time window – about 100 milliseconds, or one-tenth of a second.

This discovery sparked a controversy about the nature of vision. Pretty much anyone with a pair of eyes will tell you that vision feels smooth and unbroken. But is it truly as continuous as it feels, or might it occur in discrete chunks of time? Could the cohesive experience of vision be nothing more than an illusion?

Enthusiasm for the idea of discrete visual processing faded over the years, although it was never disproven. Science is not immune to fads; ideas often fall in and out of favor. Besides, vision-in-chunks was a hard sell. It was counterintuitive and contrary to people’s subjective experience. Vision scientists set it aside and moved on to new questions and controversies instead.

The debate resurfaced over the last twenty years, sparked by the discovery of a new twist on an old optical illusion. Scientists have long known about the wagon wheel illusion, which makes it appear as if the wheels of moving cars (or wagons) in films are either turning in the wrong direction or not turning at all. The illusion is caused by a technical glitch: the interaction between the wheel's periodic rotation and the frame rate of the movie. The film doesn't capture enough snapshots of the spinning wheel for your brain to recover its true direction and speed. But in 1996, scientists discovered that the illusion also occurs in the real world. When hubcaps, tires, and modified LPs turned at certain rates, their direction appeared to reverse. Scientists dug the idea of discrete vision out of a trunk in the attic, dusted it off, and tried it out to explain the effect. In essence, the visual system might have a frame rate of its own. Cross this frame rate with an object rotating at a certain frequency and you're left seeing tires spin backwards. It seemed to make sense.
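To get a feel for how a frame rate can flip a wheel's apparent direction, here is a minimal sketch of the sampling arithmetic. The spoke count, rotation speed, and 24-frames-per-second film rate are made-up numbers for illustration, not values from any of the studies discussed here.

```python
import math

def apparent_rotation_per_frame(spokes, revs_per_sec, frame_rate):
    """Apparent rotation (degrees per frame) of a spoked wheel under
    temporal sampling. Negative means the wheel seems to turn backwards;
    near zero, it seems to stand still."""
    # True rotation between successive frames, in degrees.
    true_step = 360.0 * revs_per_sec / frame_rate
    # A wheel with identical spokes looks the same after every
    # (360 / spokes) degrees, so a viewer only sees the step modulo
    # that symmetry...
    symmetry = 360.0 / spokes
    step = math.fmod(true_step, symmetry)
    # ...and interprets it as the smallest motion consistent with the frames.
    if step > symmetry / 2:
        step -= symmetry
    return step

# A 5-spoke wheel turning 4.5 revolutions per second, filmed at 24 frames
# per second, appears to creep backwards by about 4.5 degrees per frame:
print(apparent_rotation_per_frame(5, 4.5, 24.0))  # -4.5
```

The same arithmetic would apply if the 'frames' came not from a projector but from a visual system that samples the world roughly ten times per second – which is exactly why the real-world illusion made the old idea look attractive again.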

In a clever set of experiments, the neuroscientist and author David Eagleman (of Incognito and Sum fame) shot this explanation down. He and his colleague, Keith Kline, chalked the illusion up to motion-processing cells that tire out instead. Still, the debate about the nature of vision was reignited. Several neuroscientists became intrigued by the notion of vision-in-chunks and began to think about it in relation to a particular type of brain rhythm that cycles at a rate of – you guessed it – about ten times per second.

In recent years, a slew of experiments have supported the idea that certain aspects of vision happen in discrete packets of time – and that these packets are roughly one-tenth of a second long. The brain rhythms that correspond to this timing – called alpha waves – have acted as the missing link. Brain rhythms essentially tamp down activity in a brain area at a regular interval, like a librarian who keeps shushing a crowd of noisy kids. Cells in a given part of the brain momentarily fall silent but, as kids will do, they start right up again once the shushing is done.

Work by Rufin VanRullen at the Université de Toulouse and, separately, by Kyle Mathewson at the University of Illinois shows how this periodic shushing can affect visual perception. For example, Mathewson and colleagues were able to predict whether a subject would detect a briefly flashed circle based on its timing relative to the alpha wave in that subject’s visual cortex. This and other studies like it demonstrate that alpha waves are not always helpful. If something appears at the wrong moment in your rhythm, you could be slower to see it or you might just miss it altogether. In other words, every tenth of a second you might be just a little bit blind.
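As a toy illustration of that logic (not Mathewson's actual analysis), imagine that sensitivity to a faint flash rises and falls with an assumed 10-cycle-per-second alpha wave. The hit rates below – 0.9 at the most sensitive phase, 0.5 at the least – are invented numbers, chosen only to show how a flash's timing could change whether you see it.

```python
import math

def detection_probability(flash_time_s, alpha_freq_hz=10.0,
                          best=0.9, worst=0.5):
    """Toy model: chance of detecting a faint flash, given where it lands
    on an ongoing alpha cycle. 'best' and 'worst' are made-up hit rates at
    the most and least sensitive moments of the cycle."""
    # Sensitivity swings between 1 and 0 once per alpha cycle (~every 100 ms).
    phase = 2 * math.pi * alpha_freq_hz * flash_time_s
    sensitivity = 0.5 * (1 + math.cos(phase))
    return worst + (best - worst) * sensitivity

# Two flashes 50 milliseconds apart land on opposite halves of the cycle:
print(detection_probability(0.000))  # peak of the cycle   -> 0.9
print(detection_probability(0.050))  # trough of the cycle -> ~0.5
```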

If you’re a healthy skeptic, you may be wondering how well such experiments reflect vision in the real world. Unless your computer’s on the fritz, you probably don’t spend much time staring at circles on a screen. Does the 10-per-second frame rate apply when you’re looking at the complex objects and people that populate your everyday world?

Enter Frédéric Gosselin and colleagues from the Université de Montréal. Last month they published a simple study in the journal Cognition that tested the idea of discrete vision using pictures of human faces. They made the faces hard to see by bathing them in different amounts of visual ‘noise’ (like the static on a misbehaving television). Subjects had to identify each face as one of six that they had learned in advance. But while subjects were trying to identify each face, the amount of static on it kept changing. In fact, Gosselin and colleagues were cycling the amount of static to see how its rate and phase (its timing relative to the appearance of each new face) affected their subjects’ performance. They figured that if visual processing is discrete and varies with time, then subjects should perform best when their moments of best vision coincide with the moments of least static obscuring the face.
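Here is a small simulation of the logic behind that prediction; it is a cartoon of the design, not a reproduction of the classification-image analysis in the paper. It assumes a sinusoidal 10-cycle-per-second sensitivity rhythm and sinusoidally cycling static, and the rates, phases, and 200-millisecond viewing window are placeholder values.

```python
import numpy as np

def information_through(noise_rate_hz, noise_phase, alpha_rate_hz=10.0,
                        duration_s=0.2, steps=2000):
    """Toy model: how much face information survives when both image clarity
    and visual sensitivity oscillate. Returns the time-averaged product of
    the two cycles (higher = easier identification)."""
    t = np.linspace(0.0, duration_s, steps)
    # Image clarity: 1 when the static is lowest, 0 when it fully masks the face.
    clarity = 0.5 * (1 + np.cos(2 * np.pi * noise_rate_hz * t + noise_phase))
    # Visual sensitivity: gated by an assumed ~10 Hz alpha cycle.
    sensitivity = 0.5 * (1 + np.cos(2 * np.pi * alpha_rate_hz * t))
    return float(np.mean(clarity * sensitivity))

# Static cycling at the alpha rate, with its clear moments lined up with
# the sensitive moments, lets through the most information:
print(information_through(10.0, noise_phase=0.0))    # aligned         -> ~0.38
print(information_through(10.0, noise_phase=np.pi))  # opposite phase  -> ~0.13
print(information_through(7.0, noise_phase=0.0))     # mismatched rate -> ~0.25
```

In a toy world like this one, performance peaks when the static cycles at the brain's own rate and in the right phase – the intuition behind testing many rates and phases and looking for the sweet spot.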

What did they find? People were best at identifying the faces when the static cycled 10 or 15 times per second. Gosselin and colleagues suggest that the ideal rate may be somewhere between the two (a possibility that they can’t test after the fact). Their results imply that the visual alpha wave affects face recognition – a task that people do every day. But it may only affect it a little. The difference between the subjects’ best accuracy (when the static cycling was set just right) and their worst accuracy was only 7%. In the end, the alpha wave is one of many factors that determine perception. And even when these rhythms are shushing visual cortex, it’s not enough to shut down the entire area. Some troublemakers keep yapping right through it.

When it comes to alpha waves and the nature of discrete visual processing, scientists have their work cut out for them. For example, while some studies found that perception was affected by an ongoing visual alpha wave, others found that visual events (like the appearance of a new image) triggered new alpha waves in visual cortex. In fact, brain rhythms are by no means exclusive; different rhythms can be layered one upon the other within a brain area, making it harder to pull out the role of any one of them. For now it’s at least safe to say that visual processing is nowhere near as smooth and continuous as it appears. Your vision flickers and occasionally fails. As if your brain were dimming the lights, you have moments when you see less and miss more – moments that may happen tens of thousands of times each hour.

This fact raises a troubling question. Why would the brain have rhythms that interfere with perception? Paradoxically enough, discrete visual processing and alpha waves may actually give your visual perception its smooth, cohesive feel. In the last post I mentioned how you move your eyes about 2 or 3 times per second. Your visual system must somehow stitch together the information from these separate glimpses that are offset from each other both in time and space. Alpha waves allow visual information to echo in the brain. They may stabilize visual representations over time, allowing them to linger long enough for the brain, that master seamstress, to do her work.

_____

Photo credit: Tom Conger on Flickr with Creative Commons license

Blais C, Arguin M, & Gosselin F (2013). Human visual processing oscillates: Evidence from a classification image technique. Cognition, 128(3), 353-362. PMID: 23764998

Sight Unseen


Eyelids. They come in handy for sandstorms, eye shadow, and poolside naps. You don’t see much when they’re closed, but when they’re open you have an all-access pass to the visible world around you. Right? Well, not exactly. Here at Garden of the Mind, the next two posts are dedicated to the ways that you are blind – every day – and with your eyes wide open.

One of the ways you experience everyday blindness has to do with the movements of your eyes. If you stuck a tiny camera in your eye and recorded the images that fall on your retina, the footage would be nauseating. Think The Blair Witch Project, only worse. That’s because you move your eyes about once every half a second – more often than your heart beats. You make these eye movements constantly, without intention or even awareness. Why? Because the eye and the visual areas of the brain devote most of their resources to the center of your gaze, so your peripheral vision is abysmal. It’s true even if you have 20/20 vision. You don’t sense that you are legally blind in your peripheral vision because you compensate by moving your eyes from place to place. Like snapping a series of overlapping photographs to create a panoramic picture, you move your eyes to catch different parts of a scene and your brain stitches these ‘shots’ together.

As it turns out, the brain is a wonderful seamstress. All this glancing and stitching leaves us with a visual experience that feels cohesive and smooth – nothing like the Frankenstein creation it actually is. One reason this beautiful self-deception works is that we turn off much of our visual system every time we move our eyes. You can test this out by facing a mirror and moving your eyes quickly back and forth (as if you are looking at your right and left ears). Try as you might, you won’t be able to catch your eyes moving. It’s not because they’re moving too little for you to see; a friend looking over your shoulder would clearly see them darting back and forth. You can feel them moving yourself if you gently rest your fingers below your lower lashes.

It would be an overstatement to say that you are completely blind every time you move your eyes. While some aspects of visual processing (like that of motion) are switched off, others (like that of image contrast) seem to stay on. Still, this means that twice per second, or 7,200 times each hour, your brain shuts you out of your own sense of sight. In these moments you are denied access to full visual awareness. You are left, so to speak, in the dark.

Photo credit: Pete Georgiev on Flickr under Creative Commons license

Feeling Invisible Light

In my last post, I wrote about whether we can imagine experiencing a sense that we don’t possess (such as a trout’s sense of magnetic fields). Since then a study has come out that adds a new twist to our little thought experiment. And for that we can thank six trailblazing rats in North Carolina.

Like us, rats see only a sliver of the full electromagnetic spectrum. They can perceive red light with wavelengths as long as about 650 nanometers, but radiation with longer wavelengths (known as infrared, or IR, radiation) is invisible to them. Or it was before a group of researchers at Duke began their experiment. They first trained the rats to indicate with a nose poke where they saw a visible light turned on. Then the researchers mounted an IR detector to each rat’s head and surgically implanted tiny electrodes into the part of its brain that processes tactile sensations from its whiskers.

After these sci-fi surgeries, each rat was trained to do the same light detection task again – only this time it had to detect infrared instead of visible light. Whenever the IR detectors on the animal’s head picked up IR radiation, the electrodes stimulated the tactile whisker-responsive area of its brain. So while the rat’s eyes could not detect the IR lights, a part of its brain was still receiving information about them.

Could they do the new task? Not very well at first. But within a month, these adult rats learned to do the IR detection task quite well. They even developed new strategies to accomplish it; as these videos show, they learned to sweep their heads back and forth to detect and localize the infrared sources.

Overall, this study shows us that the adult brain is capable of acquiring a new or expanded sense. But it doesn’t tell us how the rats experienced this new sense. Two details from the study suggest that the rats experienced IR radiation as a tactile sensation. First, the post-surgical rats scratched at their faces when first exposed to IR radiation, just as they might if they initially interpreted the IR-related brain activity as something brushing against their whiskers. Second, when the scientists studied the activity of the touch neurons receiving IR-linked stimulation after extensive IR training, they found that the majority responded to both touch and infrared light. At least to some degree, the senses of touch and of infrared vision were integrated within the individual neurons themselves.

In my last post, I found that I was only able to imagine magnetosensation by analogy to my sense of touch. Using some fancy technology, the scientists at Duke were able to turn this exercise in imagination into a reality. The rats were truly able to experience a new sense by piggybacking on an existing sense. The findings demonstrate the remarkable plasticity of the adult brain – a comforting thought as we all barrel toward our later years – but they also provide us with a glimpse of future possibilities. Someday we might be able to follow up on our thought experiment with an actual experiment. With a little brain surgery, we may someday be able to ‘see’ infrared or ultraviolet light. Or we might just hook ourselves up to a magnificent compass and have a taste (or feel or smell or sight or sound) of magnetosensation after all.

____

Photo credit: Novartis AG


Thomson EE, Carra R, & Nicolelis MA (2013). Perceiving invisible light through a somatosensory cortical prosthesis. Nature Communications, 4. PMID: 23403583
