Petting Zoo Prototype
Petting Zoo is a synthetic robotic environment of animalistic creatures. Designed as a real-time augmented behavioral environment, the installation exhibits lifelike attributes through forms of conversational interaction, establishing communication with users that is emotive and sensorial. Conceived as an immersive installation environment, social and synthetic forms of systemic interaction allow the pets to engage and evolve their behaviors over time. The pets stimulate participation through animate behaviors communicated through kinesis, sound, and illumination. These behaviors evolve through interaction, enabling each pet to develop a personality. Interactions are stimulated by human users or by other pets within the population. Each pet constructs a framework of pattern behaviors that acts as memory, enabling it to learn and evolve over time.
Early experiments that examine similar issues can be found in the seminal cybernetic work of the Senster, developed by the British cybernetic sculptor Edward Ihnatowicz; Gordon Pask's The Colloquy of Mobiles; and W. Grey Walter's first electronic autonomous robots, the tortoises Elmer and Elsie. Petting Zoo continues our ongoing research with real-time sensing and response, found in earlier works of ours such as Becoming Animal, exhibited in MoMA's Talk to Me show, and Memory Cloud: Detroit (2011) and ICA London (2008).
Petting Zoo maximizes and exploits both input and output parameters, creating dynamic feedback through direct and particular forms of communication. Intimacy and curiosity are explored as enabling agents that seek to externalize personal experience through direct visual, haptic, and aural communication. Through acknowledgement and externalization, opportunities for collective and shared experiences emerge. The pets are developed to learn from and respond to their environment: rather than being reactive sculptures, they adapt and sense patterns that form the basis of their respective behaviors. The proposal develops three pets to be installed at the FRAC Centre, each exhibiting features and a personality that are unique and evolving.
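The idea of pattern behaviors acting as memory, so that frequent interactions gradually bias each pet's character, could be sketched as follows. This is a hypothetical illustration only; the class name, decay scheme, and interaction labels are assumptions, not the installation's actual implementation.

```python
from collections import defaultdict

class PetMemory:
    """Hypothetical sketch: each pet accumulates a decaying, weighted
    history of interaction types; frequently repeated interactions
    gradually dominate, giving the pet a distinct 'personality'."""

    def __init__(self, decay=0.95):
        self.decay = decay
        self.weights = defaultdict(float)

    def observe(self, interaction):
        # Older experience fades a little with every new observation.
        for kind in self.weights:
            self.weights[kind] *= self.decay
        self.weights[interaction] += 1.0

    def dominant_trait(self):
        # The interaction type with the highest accumulated weight.
        if not self.weights:
            return None
        return max(self.weights, key=self.weights.get)

memory = PetMemory()
for _ in range(5):
    memory.observe("touch")
memory.observe("wave")
print(memory.dominant_trait())  # touch
```

A real system would likely map such weights onto richer behavioral parameters (movement style, sound palette, responsiveness) rather than a single label.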
Physical Sculpture – autonomous movement, user and multiuser interaction.
Inputs include location, touch, gesture-based movement, and multiuser interaction; outputs include sound, vibration, color, and luminosity.
Kinect – real-time data mapping.
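The input-to-output mapping listed above can be sketched as a simple response function. The thresholds, field names, and output values below are illustrative assumptions, not the installation's actual control logic.

```python
from dataclasses import dataclass

@dataclass
class SensorInput:
    location: tuple        # (x, y) position of the nearest participant
    touch: bool            # contact detected on the pet's skin
    gesture_energy: float  # 0..1, amount of gestural activity nearby

def respond(inp):
    """Map sensed inputs to the pet's output channels
    (sound, vibration, color, luminosity). Illustrative values only."""
    out = {"sound": "idle", "vibration": 0.0,
           "color": (0, 0, 255), "luminosity": 0.2}
    if inp.touch:
        # Direct touch gets the strongest, warmest response.
        out.update(sound="purr", vibration=0.6,
                   color=(255, 120, 0), luminosity=0.9)
    elif inp.gesture_energy > 0.5:
        # Energetic gestures at a distance draw a lighter reaction.
        out.update(sound="chirp", luminosity=0.6)
    return out

print(respond(SensorInput((1.0, 2.0), True, 0.1))["sound"])  # purr
```

In practice these mappings would themselves shift over time as each pet's learned behavior evolves.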
Awareness of participants is enabled through a camera-tracking and data-scanning setup that responds to contextual and environmental parameters. Live image streams are processed in real time and coupled with blob tracking and optical-flow analysis to locate the positions and gestural activity of participants (the crowd). Camera tracking allows for consistent operation in daylight and night-time scenarios, as well as detailed information on gestural responses.
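The tracking pipeline described above — background subtraction, blob localization, and gross motion estimation — can be sketched in simplified form. This uses NumPy only; a production setup would rely on a vision library such as OpenCV and a depth sensor like the Kinect, and the function names here are illustrative assumptions.

```python
import numpy as np

def blob_centroid(frame, background, thresh=30):
    """Locate the centroid of the motion region found by background
    subtraction and thresholding (a simplified form of blob tracking)."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    mask = diff > thresh
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return (xs.mean(), ys.mean())

def motion_vector(prev_centroid, curr_centroid):
    """Approximate gross optical flow as the displacement of the blob
    centroid between consecutive frames."""
    if prev_centroid is None or curr_centroid is None:
        return (0.0, 0.0)
    return (curr_centroid[0] - prev_centroid[0],
            curr_centroid[1] - prev_centroid[1])

# Demo with synthetic frames: a bright 10x10 patch moving to the right.
bg = np.zeros((120, 160), dtype=np.uint8)
f1 = bg.copy(); f1[50:60, 40:50] = 255
f2 = bg.copy(); f2[50:60, 60:70] = 255

c1 = blob_centroid(f1, bg)
c2 = blob_centroid(f2, bg)
dx, dy = motion_vector(c1, c2)  # patch moved 20 px to the right
```

Centroid displacement is a crude stand-in for dense optical flow, but it captures the essential signal the pets need: where participants are and how vigorously they are moving.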