Digital technology brings together specialized hardware, software programmes, and communication networks, and it can be applied in puppetry. Manipulating a puppet or an object has always been a technical undertaking, dependent upon the type of puppet and the context in which it is used. In this sense, puppeteers understand and use digital technology as a manipulation technique: through an electronic interface, such as a remote controller or a sensor, it connects them to a puppet and facilitates movement and animation.
In the case of Jeffrey Shaw’s installation Configuring the Cave (1997), the puppet itself constituted the interface. This virtual environment comprised high-resolution images, generated by a computer in real time and projected onto three walls and the floor of a “cave”. Spectators thus found themselves totally immersed in a three-dimensional virtual environment, accentuated by interactive sonic signals. These signals were generated through a wooden puppet placed in the middle of the space. Visitors were invited to play with it, freely moving its limbs and head. In doing so, the “interactor” could summon one or another of the seven virtual worlds accessible through the programme and explore how the images and sound reacted to further manipulation.
Electronic commands can also be used to control a puppet remotely. One early example was Theater of the Ears, directed by Zaven Paré and Allen S. Weiss in Los Angeles in 1999, which incorporated electronic puppets made by Mark Sussman and the voice of Gregory Whitehead.
Virtual Avatars and Capturing Movement
A puppeteer can use electronic sensors to capture his/her own movements, which can then manipulate a puppet through electronic controls. In these cases, the puppet is called a “virtual avatar”. For example, virtual puppets intended for the stage have been manipulated through electronic gloves with sensors inserted into the fabric or fingers; real-time capture of the fingers’ movements then animates the virtual puppet. This form of manipulation requires fairly intense collaboration between puppet companies and programmers. A notable historical example, begun in 1996, was the association between the group Animação and the Institut de Recherche en Informatique de Toulouse (IRIT, Toulouse Institute of Computer Science Research). Their early work involved avatars conceived as virtual puppets, “manipulated” from a distance with sensors such as Polhemus trackers or haptic gloves. A manipulator, equipped with sensors, could either be hidden or play in full view in front of a small emitter-receptor unit. This unit relayed signals to a computer, which interpreted the position and orientation of the sensors; three-dimensional characters were then generated and projected, reflecting the puppeteer’s movements.
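The signal chain just described (sensor, emitter-receptor unit, computer, projected character) can be sketched in a few lines of Python. This is purely illustrative: the names, the scaling factor, and the reduction of a tracker sample to a position plus Euler angles are assumptions for the sketch, not details of the Animação/IRIT system.

```python
import math
from dataclasses import dataclass


@dataclass
class SensorReading:
    """One tracker sample: position (metres) and orientation (radians)."""
    x: float
    y: float
    z: float
    yaw: float
    pitch: float
    roll: float


def to_joint_pose(reading: SensorReading, scale: float = 2.0) -> dict:
    """Map a hand-sensor reading onto one joint of the virtual puppet.

    The avatar mirrors the puppeteer: positions are scaled into the
    virtual scene's coordinate system, orientations pass through.
    """
    return {
        "position": (reading.x * scale, reading.y * scale, reading.z * scale),
        "rotation": (reading.yaw, reading.pitch, reading.roll),
    }


# One frame of the capture loop: a reading arrives from the
# emitter-receptor unit and is applied to the on-screen character.
frame = SensorReading(x=0.0, y=1.0, z=0.0, yaw=0.0, pitch=math.pi / 4, roll=0.0)
pose = to_joint_pose(frame)
print(pose["position"])  # scaled into scene coordinates: (0.0, 2.0, 0.0)
```

In a real system this mapping would run once per frame for each sensor, driving the corresponding joint of a skeleton before the character is rendered and projected.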
Another early example, La Funambule (The Tightrope Walker), was an interactive installation by Michel Bret and Marie-Hélène Tramus, researchers and teachers at Arts et Technologies de l’Image (ATI, Arts and Image Technologies) in Paris. The projected tightrope extended down toward the centre of the screen and continued as a line on the floor, inviting visitors to step onto it. Any movement by the participant was reflected on screen, moving the tightrope and in turn affecting the projected tightrope walker. The programming represented a simple form of artificial intelligence through which the projected character “learned” a range of responses based on interactions with participants. The artists proposed that this constituted a dance between two people, endowing the piece with a more symbolic dimension: the action and reaction of human relationships.
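The idea of a character “learning” responses from interaction can be illustrated with a deliberately tiny sketch: the character remembers (stimulus, reaction) pairs from past encounters and answers a new stimulus with the reaction of the nearest remembered one. Every name here is hypothetical, and the installation’s actual model was far more sophisticated than this nearest-neighbour toy.

```python
def nearest_response(memory, stimulus):
    """Return the stored reaction whose stimulus is closest (1-D poses)."""
    best = min(memory, key=lambda pair: abs(pair[0] - stimulus))
    return best[1]


memory = []  # grows as the piece "learns" from each interaction
memory.append((0.0, "sway left"))   # participant leans left -> character sways
memory.append((1.0, "sway right"))  # participant leans right

print(nearest_response(memory, 0.2))  # 0.2 is closer to 0.0, prints "sway left"
```

The point of the sketch is the feedback loop: each interaction enlarges the memory, so the character’s behaviour is shaped over time by its audience rather than fixed in advance.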
The Augmented “Stage”
Digital technology has also been used to expand the notions of “performance space” and audience. Such was the case in Pac Manhattan (2004), in which five actors in the streets of New York recreated the Pac-Man video game, using mobile phones and Wi-Fi connections, for an audience of Internet users. Can You See Me Now? (2001), by the English collective Blast Theory, had a similar premise. This hybrid game took place in Sheffield, simultaneously in the streets and on the screens of the participating audience, which displayed the avatars of the players around the city. The space in which such performances or games take place is said to be “augmented”, representing the intersection of physical space and cyberspace. Motion capture and image mapping can also permit the integration of a puppet into a virtual scene, as in Cyberpunch (2003), an experiment conducted by Thomas Vogel’s theatre group in Berlin. The puppets moved from the stage to the screen, then to a virtual space; through real-time 3D animation on stage, an interactive link was established between virtual puppets and real ones.
Stage and Screen
In essence, a virtual scenic space serves as a stage where avatars can be manipulated as virtual puppets. Examples on the Internet include metapet.net, created by Natalie Bookchin, where the player can control a virtual genetic creature. Visitors can also change the parameters of two-dimensional and three-dimensional zoomorphic creatures on the site <http://sodaplay.com>. Meanwhile, the “Puppet Tools” at Frédéric Durieu’s experimental zoo, <http://lecielestbleu.com>, can be used to manipulate a range of animals. In a similar vein, Antoine Schmitt has made generative, movable pixels that can be manipulated on <www.gratin.org>. These are just a few instances in which the Internet offers the puppet another stage, transforming the screen itself into a performance space.
The application of digital technology has encouraged any number of experiments into the relationship between manipulator, manipulated object, and old and new notions of audience. Phototropy (1994), an interactive installation by Laurent Mignonneau and Christa Sommerer, explored this interdependence through the life and development of a virtual plant that depended on the orientation of a lamp activated by the visitor. In Catherine Ikam’s Elle, the face of a synthetic character reacted, through a laser sensor, to the presence and position of visitors inside the installation. In 2002, Denis Marleau’s Les Aveugles (The Blind), based on the play by Maurice Maeterlinck, combined virtual masks and voice-overs to explore Maeterlinck’s notion of a theatre in which living actors would be obsolete.