The PROSTHETIC HEAD is an embodied conversational agent that speaks to the person who interrogates it. It has a database, real-time lip syncing and a library of facial expressions. Coupled to an interrogator, it makes interesting verbal exchanges possible. It was conceived as a conversational system rather than as an intelligent agent, but given its effective spoken responses and appropriate facial expressions, the agent generates an adequate aliveness that problematizes issues of identity, awareness and intelligence. It is a 3000-polygon mesh wrapped with a skin of the artist’s face. The Prosthetic Head has been exhibited as a 5m-high projection. Although its scale gives it an impressive presence, it has always been a 2D image with limited interaction. With the ARTICULATED HEAD, the agent is given a physical and sculptural embodiment. The Head is displayed on an LCD screen attached to the end of a 6-degree-of-freedom industrial robot arm, giving it an articulated neck. The virtual behaviour of the agent is now augmented by the physical motion of the robot system, generating a more seductive aesthetic experience. We can now more effectively test the sound-location and visual-tracking components of an attention model that allows the Head to interact with more than one person at a time.
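The attention model is not specified in detail here; purely as an illustration of how sound location and visual tracking might be fused to choose whom the Head attends to, a minimal sketch follows. All names, weights and the scoring scheme are hypothetical assumptions, not the project's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Target:
    """A person detected near the Head (hypothetical sensor output)."""
    bearing_deg: float   # direction relative to the screen's forward axis
    is_speaking: bool    # cue from the sound-localization module
    face_visible: bool   # cue from the visual-tracking module

def attention_score(t, audio_weight=0.7, visual_weight=0.3):
    """Blend audio and visual cues into one salience score.
    Active speakers dominate; visible faces near the centre of
    view add a smaller, centrality-weighted contribution."""
    centrality = 1.0 - min(abs(t.bearing_deg), 90.0) / 90.0
    score = 0.0
    if t.is_speaking:
        score += audio_weight
    if t.face_visible:
        score += visual_weight * centrality
    return score

def pick_focus(targets):
    """Return the target the robot neck should orient toward,
    or None when nothing salient is present."""
    scored = [(attention_score(t), t) for t in targets]
    best_score, best = max(scored, key=lambda p: p[0], default=(0.0, None))
    return best if best_score > 0 else None
```

With a scheme like this, a speaking person at 10 degrees outranks a silent onlooker at 40 degrees, so the Head can shift attention between multiple people as conversation moves around it.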
The Articulated Head is part of the THINKING HEAD project, a five-year research project led by the MARCS Auditory Labs at the University of Western Sydney. The Thinking Head project is one of three Thinking Systems initiatives, jointly funded by the Australian Research Council (ARC) and the Australian National Health and Medical Research Council (NH&MRC).