SESSIONS: COURSES

Half a cup of Neuromancer: Are we tuned to a dead channel?

“The sky above the port was the color of television, tuned to a dead channel.” - William Gibson, Neuromancer

Imagine being forced to watch a 24-hour advertisement channel. Worse, imagine living in one. The day of a perpetual, enveloping, perceptually subliminal, advertisement-driven world is not far away. As a first step towards the cyberpunk world described in Gibson’s Neuromancer, BSkyB in the UK will start airing the Advert Channel come September, quoting research that says “There's a huge demand for people wanting to see adverts.” Technology had never developed to the point where it could turn our world into the Advert Channel… until now.

Consider this: Moore’s law states that computing power doubles every 24 months. Storage doubles every 12 months. The human brain, however, doubles in volume only every 3x10^7 months. We are at a defining moment in history where technology is overtaking human cognitive capabilities. As we learn more about the way we perceive the world around us, we could end up being the target of manipulation at the hands of giant corporate entities.
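
To put those doubling times side by side, here is a minimal back-of-the-envelope sketch in Python, using only the figures quoted above, of how much each quantity grows over a single decade:

```python
# Back-of-the-envelope comparison of the doubling times quoted above.
# A quantity that doubles every d months grows by a factor of 2**(t/d)
# over a span of t months.

def growth_factor(months: float, doubling_time_months: float) -> float:
    """Multiplicative growth over `months` for a given doubling time."""
    return 2.0 ** (months / doubling_time_months)

DECADE = 120  # months

for name, doubling in [("computing power", 24),
                       ("storage", 12),
                       ("human brain volume", 3e7)]:
    print(f"{name:>18}: x{growth_factor(DECADE, doubling):.8g} over a decade")

# computing power    : x32
# storage            : x1024
# human brain volume : x1.0000028  (effectively flat on a human timescale)
```

The hardware curves are exponential; we are not.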

The course “Seeing, Hearing and Touching: Putting It All Together,” offered by researchers from the University of British Columbia, introduced the state of the art in human perceptual theory and its application to the design and testing of interaction. Display technologies have matured to the point where there are more pixels on the display than a human can perceive. We have sound systems that produce high-fidelity reproductions of recorded music. We have conquered the physical domain. The battlefield of the future, and the source of many more SIGGRAPH papers, is the human mind.

Hearing Ron Rensink speak would leave you wondering about the sanity of your senses. Can you really believe what you see? Or do you see what you want to believe? Consider the flicker paradigm: two images are displayed alternately in short bursts, separated by brief blank fields. Changes made to the images under these conditions are extremely difficult to notice, even when the changes are large, anticipated and repeatedly made. This is called change blindness.
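
The flicker paradigm is easy to reproduce. The sketch below (Python with matplotlib; the image filenames and the 240 ms / 80 ms timings are assumptions, not values given in the course) cycles the original image, a brief blank field, the altered image, and another blank. The blanks swamp the motion transient that would otherwise point straight at the change, which is what makes even a large change so hard to find.

```python
# A minimal flicker-paradigm demo: original, blank, changed, blank, repeat.
# "scene_a.png" / "scene_b.png" are hypothetical files: the same scene with
# one large difference between them.

import matplotlib.pyplot as plt
import matplotlib.image as mpimg
import numpy as np

img_a = mpimg.imread("scene_a.png")        # original scene
img_b = mpimg.imread("scene_b.png")        # same scene, one thing changed
blank = np.full(img_a.shape, 0.5)          # mid-grey mask field

ON_TIME, BLANK_TIME = 0.24, 0.08           # seconds; assumed, not from the course

fig, ax = plt.subplots()
ax.axis("off")
frame = ax.imshow(img_a)

for _ in range(50):
    for image, duration in [(img_a, ON_TIME), (blank, BLANK_TIME),
                            (img_b, ON_TIME), (blank, BLANK_TIME)]:
        frame.set_data(image)
        plt.pause(duration)                # crude timing, good enough for a demo
```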

One could speculate that the human brain, in its quest to overcome its limited processing power, performs optimizations during visual perception. The gist of a scene can be extracted from a single exposure as brief as 120 ms. Perceiving changes in a scene is a whole different ball game: attention is needed to see (visually experience) a change in an object. Attention, as we all know, is easily diverted. We have all experienced the cocktail party effect. We are talking to a group of people, exchanging information, following a thread of conversation, and suddenly we hear our name spoken in an adjacent group. Our attention is diverted and we lose track of the original conversation.

We also suffer from what Ron calls “change simultagnosia”: we cannot “see” more than one change at a time. The Visual Cognition Lab at the University of Illinois at Urbana-Champaign has been researching this for quite some time. One of its videos shows a subject witnessing a “person change” (Simons and Levin, 1998). In this clip, an experimenter approaches a pedestrian to ask for directions. While the pedestrian is giving directions, two additional experimenters carrying a door rudely pass between the first experimenter and the pedestrian. During this brief interruption, the original experimenter is replaced by a different person. Even though the two experimenters looked quite different and had distinctly different voices, approximately 50 percent of the subjects failed to notice that they were talking to a different person after the door had passed. One cannot help but wonder if our lives are indeed one big social experiment. How can we tell that we are not living in version 2 of Alex Proyas’ Dark City?

Human perception is often subtle. Sometimes we “feel” change. Most of us believe in a sixth sense, or Mindsight: it is our mind’s way of dealing with the meta-cognitive gap. If a change is too gradual or too fast to be noticed, or if our senses relay conflicting information, the brain tries to fill in the blanks. Research into Mindsight could lead to more subtle forms of manipulation. A department store aisle could be arranged to pass subliminal hints to a shopper to pick up junk he doesn’t need, or to make 12-packs of soda irresistible to ten-year-olds.

Research into perception could also lead to devices that let you offload cognition. Just because objects appear to be present simultaneously does not mean they all need to be represented simultaneously. Imagine a NASCAR driver wearing goggles that filter out distractions, or that provide pit-lane traffic information in real time.

What is true for human vision turns out to be true for the other senses as well. Extending the CG world to haptics (the science of touch) could lead to immersive worlds where Newtonian laws no longer apply. The interplay between our senses is still being charted: we now know that sound interacts with touch at low frequencies. We know which parts of the audible spectrum are least perceived and use that information to compress music. We are beginning to scientifically understand how music affects mood, something the artist community has known for a long time.
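
The “least perceived part of the spectrum” remark refers to psychoacoustics as used in perceptual audio coding. As a rough illustration (this is a standard textbook formula, Terhardt’s approximation of the absolute threshold of hearing, not something specific to the course), the sketch below computes how loud a pure tone must be before we can hear it at all; spectral components below this curve are exactly the ones an MP3-style coder throws away first.

```python
# Terhardt's approximation of the absolute threshold of hearing (dB SPL):
# the quietest level at which a pure tone of a given frequency is audible.
# Perceptual audio coders compare spectral components against curves like
# this one (plus masking effects) to decide what can be discarded.

import math

def threshold_in_quiet_db_spl(freq_hz: float) -> float:
    """Approximate hearing threshold in dB SPL at a given frequency (Hz)."""
    f = freq_hz / 1000.0  # frequency in kHz
    return (3.64 * f ** -0.8
            - 6.5 * math.exp(-0.6 * (f - 3.3) ** 2)
            + 1e-3 * f ** 4)

for f in [100, 1000, 4000, 12000, 16000]:
    print(f"{f:>6} Hz: ~{threshold_in_quiet_db_spl(f):6.1f} dB SPL to be audible")

# The ear is most sensitive around 3-4 kHz and far less so at the extremes,
# which is why energy at the edges of the spectrum can be coded coarsely.
```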

The technology spawned by this research could be used for commercial advertising. With the growing penetration and sophistication of cell phones (which now want to be a camera, an MP3 player, your organizer and your instant messenger), your personal information is just a radio chip away. Times Square could be customized to each viewer. We could each live in a reality of our own. Funny how we lock up people who do that now.

However, not everybody shares this dark view of the future. There are people who would put this research to better use. One exciting area is education. Lectures continue to be boring in spite of the importance society places on higher education, and distance education has remained a distant dream due to obvious shortcomings. Research into how humans perceive information could lead to better delivery of information; public speaking and teaching would be prime beneficiaries. Nicola Martinez from Empire State College, Saratoga Springs, New York, plans to apply some of this research to designing distance learning curricula.

Once we understand (how do we know that we haven’t already?) how seeing, hearing and touching are put together, we could essentially fulfill Gibson’s vision of “jacking in.”

ADDITIONAL INFORMATION

related links

  • Neuromancer
  • BSkyB's Advert Channel
  • Course Description: Seeing, Hearing and Touching: Putting It All Together
  • Alex Proyas’ Dark City

