Retina Burn
The sun is the source of all vision, but looking directly at the sun burns the retina. The audio composition Retina Burn allows listeners to experience the interplay of the human eye with the powerful energy of the sun, examining detail through sound rather than light. In this work, propagating sound waves in the sun are altered and manipulated by live movements of the eye to create an ambient, otherworldly landscape of sound.
Performing Vision
In 1994, I began an exploration of a kind of ‘active vision’, using video eye tracking technology to perform sound and music. I created a custom-designed instrument for sound performance with eye movements and performed with various versions of this instrument for over five years in a project called Active Vision, producing several recordings, including Retina Burn, in which eye movements manipulate the sounds of helioseismology, the sun’s sound waves. The initial inspiration for this project came from a sculptural installation artist who had suffered a traumatic fall and become paralyzed. I wanted to explore inexpensive developing technologies that might help this artist communicate more easily, and discovered software-based video eye tracking. Although the artist recovered significantly through physical therapy and never needed to use this technology, I found myself absorbed in the possibilities and limitations of communication through the eyes.
Another motivation for working with eye tracking in the early 1990s came from experiences I had with distance learning and communication. I had been invited to teach one of the first courses at a local Chicago college using a videoconferencing system. In this case I had twenty in-person students in a home classroom and ten students at a remote site. Although the videoconferencing system was state of the art, and not much different from those in use today, I found effective communication with the remote students to be extremely challenging. This struggle made me wonder how important the subtleties of eye contact are in communication, and whether there was a way in which technology could extend the eye across long distances.
My experiences with eye tracking as a performance controller made me curious about the vision process, and I began to investigate historical understandings of that process. I found that early philosopher-scientists thought of vision as an active process. Democritus, for example, believed that to see an object, some part of that object must come into physical contact with the eye, an idea referred to as intromission theory. He believed that an object in space presses the air between the eye and the object, and that this air carries the color of the object to the eyes, resulting in vision. [1] This pressing of air seemed to me similar to the actual process of hearing, in which a sounding object creates oscillating air pressure that then touches and moves the eardrum.
Plato’s extramission theory took an even more active stance. In his view, vision occurs when light comes out of the eye and hits objects. Plato imagined this light as a ball of fire emanating from the eye and combining with sunlight. In response, the object of view releases ‘flame particles,’ in other words the colors that we see. In the 10th century the great thinker of optics Alhazen debunked extramission theory. He did this by reasoning that, since the eyes burn when looking at brighter objects such as the sun, the eyes themselves cannot emit a flame. [2]
The image of Plato’s flame particles combined with Alhazen’s discussion of the dangers of looking at bright objects served as inspiration for my Retina Burn experiments. I was interested in the relationship between vision and action that I was experiencing with others through my computer screen. In retrospect, I think I began working with eye tracking to explore metaphorical connections in my mind between the computer screen and the retinal image: the screen is made of a flat grid of pixels, and the retinal image is a flat projection of color and pattern. I found through my research that the retinal image itself lacks the depth and meaning of the real world; it is only through the process of interpretation that an understanding of the world is formed. In a similar way, information stored on the computer has no real meaning until it is interpreted, and it must be interpreted not only by the computer program but also by a human interacting with the machine. In my work with eye tracking, the image of the eye, usually the receiver rather than the transmitter of an image, is received by the computer via its video capture card. The computer then takes this very material information (the bits and bytes that make up the image) in real time and translates it into sound, which is then perceived by the ears of the viewer.
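The translation step can be sketched in a few lines. This is only an illustrative mapping, assuming hypothetical gaze coordinates and arbitrarily chosen parameter ranges; it is not the actual design of the Active Vision instrument:

```python
def gaze_to_sound(x, y, width=640, height=480):
    """Map a gaze position on the camera image to sound parameters.

    x, y: gaze coordinates in pixels (hypothetical tracker output).
    Returns (frequency_hz, amplitude) for a simple oscillator.
    """
    # Horizontal position selects pitch across four octaves (110-1760 Hz).
    frequency = 110.0 * (2.0 ** (4.0 * x / width))
    # Vertical position controls loudness: gazing upward is louder.
    amplitude = 1.0 - (y / height)
    return frequency, amplitude

# Gaze at the image centre: mid-range pitch, half amplitude.
freq, amp = gaze_to_sound(320, 240)  # → (440.0, 0.5)
```

In a live setting, a mapping like this would be re-evaluated for every captured video frame, so the sound follows the eye continuously.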
The Responsive Screen
Because of my interest in the structure and material of the screen image, as part of the Active Vision research I began to look into the origins of the mechanical representation of motion. I found that the science of vision played an important role in the development of imaging technology. Towards the end of the 19th century, photography was a technology that created an almost complete dissolution of the boundaries between art and science. This imaging technology created a representation of the ‘real’ never before seen. The chemical accuracy of a photograph proved to be scientifically measurable, and early in the development of photographic technology it became clear that photographic images could be used to reveal truths beyond the static retinal image. In particular, photographs could fix motion and reveal patterns of movement. In the early 1880s Etienne-Jules Marey experimented with the portrayal of motion using photographic technology. He pioneered a method of capturing successive phases of movement called "chronophotography" and used this technology to study the mechanics of motion.
Marey’s work helped to bring about motion picture technology, but some of his more interesting works are still images that attempt to isolate the 'purity' of motion by dressing models in all black with white stripes, dots, and electric lights placed lengthwise along the limbs and at axis points. This work may have inspired other artists like Marcel Duchamp and art movements like Futurism to explore new ways of portraying the complex movement of figures and objects in space. Marey's interest in human and animal motion, however, went far beyond his photography studio. He was a physician and inventor whose first invention, the 'Sphygmograph', recorded human pulse beats. He also invented the 'Kymograph', to measure the wing movements of bees; the 'Chronograph', to measure time intervals; and 'Marey's tambour', to measure subtle human and animal movement. [3] These investigations into biology and image creation provided inspiration for my work in several areas over the years.
The work of Marey, particularly that which uses photographic technology, was not clearly identified as either art or science, because the work so clearly integrated the imaging (or ‘art’) medium of photography with the scientific study of human and animal motion. Perhaps part of the reason for this comfortable fusion is that, at the time, photography had not yet been widely accepted as an art medium. Once photography became fixed as an art medium, focused on the reproduction of visual images, the sensory and disciplinary crossovers examined by Marey moved to the fringes of art/science research.
Communicating Celestial and Human Oscillation
The biological movements that Marey recorded both visually and sonically often took the form of oscillations, like the beating of hearts and insect wings. Oscillating movement also exists on the scale of celestial bodies. For the planet Earth, earthquakes, tsunamis, and moving ice shelves and glaciers are forces of agitation and vibration. There are also wave oscillations that propagate in our Sun, particularly acoustic pressure waves. Helioseismology is an attempt to study the interior of the Sun using observations of the vibrations of its surface. [4] This term comes from a combination of three words:
- helios : from the classical Greek which means Sun or light.
- seismos : also from Greek meaning tremor.
- logos : meaning reasoning or discourse.
The complicated patterns created by the oscillations of the Sun can be mapped onto acoustic waves. Unlike earthquakes, which have a shear component (s-waves) for which there is no sonic analog, the turbulence of solar waves "rings" the Sun like a bell. No single source generates solar waves: acting as a resonant cavity, the ringing Sun is like a bell struck continually with many tiny grains of sand.
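To make these oscillations audible at all, helioseismic data must be sped up enormously: the strongest solar p-modes have periods of about five minutes (frequencies near 3 mHz), far below human hearing. A minimal sketch of this scaling, with an illustrative speed-up factor (not the value used by any particular project):

```python
SOLAR_P_MODE_HZ = 0.003  # strongest p-modes cluster near 3 mHz ("five-minute" oscillations)
SPEEDUP = 100_000        # illustrative playback speed-up factor

def audible_frequency(solar_hz, speedup=SPEEDUP):
    """Shift a solar oscillation frequency into the audible range
    by playing the recorded data back faster than real time."""
    return solar_hz * speedup

# 3 mHz sped up 100,000x lands near 300 Hz, comfortably within hearing.
print(f"{audible_frequency(SOLAR_P_MODE_HZ):.0f} Hz")
```

The same factor compresses time: a day of solar data sped up this way plays back in under a second, which is why such recordings are usually built from long observation runs.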
The Retina Burn project took the live sounds of helioseismology available through the GONG project [5] and transformed them through the movements of the eye. This complex interaction was an attempt to create a new language for data interpretation. As individuals and groups are faced with the task of interpreting more and more information, a language or series of languages for communicating this mass of data needs to evolve. Teleconferencing and other networked interactions are inadequate means of communicating the complex subtleties of human interaction or our connection to the vast expanses of our universe. Through an effective sonification, data interpreted as sound has the potential to communicate emotional content or feeling, and I believe this emotional connection can increase the human understanding of the forces at work behind the data.
In a 1970 position statement to UNESCO on technology and composition, Herbert Brun presented one of the more interesting discussions of the importance of process in art as a means toward developing new languages of communication. Echoing the Situationists, Brun called the process of new language development 'anticommunication.' However, he used the prefix 'anti' as in 'antithesis', not to mean 'against' but 'juxtaposed' or 'from the other side.' He saw anticommunication as the offspring of communication: an attempt to say something through new modes, rather than a refusal to say something, which might be called non-communication. One uses anticommunication as an active way of redefining or re-creating the language; Brun calls this 'teaching the language.' [6] In our current global crisis of catastrophic climate change, species collapse, and natural resource extinction, our language must evolve in every way possible. Anticommunication can be one experimental way in which to spark this evolutionary process. Scholar Brett Stalbaum sees the database not as a static subject on which an artist projects meaning, or even as a malleable piece of 'clay' transformed by an artist, but as a "catalyzing factor in the conversation." He optimistically states that "data and control systems provide a channel through which ecosystems are able to express an influence in favor of their own protection." [7] But in order for the expression of the data to be heard, we have to be listening.
References
[1] Wade, Nicholas J. A Natural History of Vision. Cambridge, MA: MIT Press, 1999.
[2] Lindberg, David C. "Alhazen's Theory of Vision and Its Reception in the West." Isis, Vol. 58, No. 3 (Autumn 1967).
[3] Braun, Marta. Picturing Time: The Work of Etienne-Jules Marey (1830-1904). Chicago: University of Chicago Press, reprint edition 1995.
[4] Goldreich, P.; Keeley, D.A. "Solar Seismology. II. The Stochastic Excitation of the Solar p-Modes by Turbulent Convection." Astrophysical Journal 212 (February 1977): 243–251.
[5] Lindsey, C.; Braun, D.C.; Jefferies, S.M. "Local Helioseismology of Subsurface Structure." In T.M. Brown, ed., GONG 1992: Seismic Investigation of the Sun and Stars. Astronomical Society of the Pacific Conference Series 42 (January 1993): 81.
[6] Brun, Herbert. "Technology and the Composer." As read to the United Nations Educational, Scientific, and Cultural Organization (UNESCO), Stockholm, 10 June 1970. <http://www.herbertbrun.org/techcomp.html>
[7] Stalbaum, Brett. "Database Logic and Landscape Art." Netzspannung, January 24, 2004. <http://netzspannung.org/positions/lectures/mapping/>