INDIGO - Interaction with Personality and Dialogue Enabled Robots - FP6

Homepage: http://www.ics.forth.gr/indigo/

Dates: from 1 Feb 2007 for 30 months

Coordinated by FORTH-ICS

WP4: Oberlander/Matheson/Isard

EU FP6 - Information Society Technologies - Cognitive Systems

The goal of INDIGO is to develop technology that advances human-robot interaction, both by enabling robots to perceive natural human behaviour and by making them act in ways that are familiar to humans. To achieve this, INDIGO will exploit and advance a range of enabling technologies and integrate them in a single robotic platform. Specifically, INDIGO will employ mechanical heads capable of mimicking human facial expressions and supporting naturalistic spoken conversation. The heads will be mounted on mobile robots with advanced navigation skills, and the overall systems will move according to motion patterns that are familiar to humans.

Advanced natural dialogue capabilities will further facilitate human-robot interaction. Natural dialogue will involve input and output across several modalities, such as spoken natural language, gestures, emotions and facial expressions. While the emphasis will be on technologies that allow robots to generate natural descriptions of their physical surroundings, INDIGO will also address the interpretation of a relatively broad range of input. Particular emphasis will be placed on the creation of appropriate user models, both for the humans interacting with a robot and for the robot itself. These models will drive the dialogue management system, allowing the robot's behaviour to adapt to the perceived user profile as well as to the robot's own knowledge, personality and accumulated experience.

INDIGO will be demonstrated by deploying a prototype system on the premises of FHW (the Foundation of the Hellenic World). The prototype will operate autonomously, interacting with visitors who have no prior experience with robots and offering them the chance to engage with advanced robotics technology. More broadly, the technologies implemented will pave the way towards robots that can interact naturally and cooperate with humans.
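To make the user-model idea concrete, here is a minimal sketch in Python of how a perceived user profile and the robot's own state might jointly parameterise a dialogue manager. All names and fields are hypothetical illustrations, not the INDIGO system's actual data structures or API.

<verbatim>
# Hypothetical sketch: user-model-driven dialogue adaptation.
# Field names and the mapping below are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class UserModel:
    """Perceived profile of the person the robot is talking to."""
    age_group: str = "adult"     # e.g. "child", "adult"
    expertise: str = "novice"    # prior familiarity with robots
    engagement: float = 0.5      # estimated interest, 0.0 to 1.0


@dataclass
class RobotModel:
    """The robot's own personality and accumulated experience."""
    personality: str = "friendly"
    topics_covered: int = 0      # how much has already been explained


def choose_dialogue_style(user: UserModel, robot: RobotModel) -> dict:
    """Map the two models to coarse dialogue-management parameters."""
    return {
        "verbosity": "short" if user.engagement < 0.3 else "normal",
        "register": "simple" if user.age_group == "child" else "standard",
        "tone": robot.personality,
        "offer_more_detail": user.expertise != "novice",
    }


if __name__ == "__main__":
    visitor = UserModel(age_group="child", expertise="novice", engagement=0.8)
    robot = RobotModel(personality="friendly", topics_covered=3)
    print(choose_dialogue_style(visitor, robot))
</verbatim>

The point of the sketch is only the architectural one made above: the dialogue manager consumes both models, so the same system can behave differently for different visitors and as the robot's own experience grows.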

JAST - Joint-Action Science and Technology - FP6

Homepage: http://www.euprojects-jast.net/

EU FP6 - Information Society Technologies - Cognitive Systems

Coordinated by TUM

Edinburgh people: Gurman-Bard/Carletta/Oberlander

Dates: 1 Oct 2004 - 30 Sep 2008

The success of the human species depends critically on our extraordinary ability to engage in joint action. Our perceptions, decisions and behaviour are tuned to those of others with whom we share beliefs, intentions and goals, and with whom we thus form a group. The main aim of the JAST project is to develop robots that can engage in joint action with each other or with a human, communicating and working intelligently on mutual tasks in dynamic, unstructured environments. JAST studies joint action in a multidisciplinary framework involving neuroscientists, cognitive psychologists, roboticists and psycholinguists. It exploits a prototypical research paradigm, a model construction task, to analyse human behaviour and brain function during joint action (e.g. planning, acting, communicating, error monitoring). The behavioural and neuroscientific results provide insights and guidelines that support the development of artificial, intelligently interacting agents. By extending the typical level of analysis of cognitive systems from single individuals to multiple individuals acting together, JAST's R&D will have an impact on artificial intelligence, cognitive science and brain science. The project will also help to develop a new class of embodied artefacts, based on the concept of cognitive skill growth, to ensure that the functionality of future technologies includes inherent concepts of cooperative behaviour.

-- MarkMcConville - 01 Sep 2008
