
Robotic hand-eye coordination without global reference: A biologically inspired learning scheme


If you have a question about this talk, please contact Per Kristian Lehre.

Understanding the mechanism that mediates the change from inaccurate pre-reaching to accurate reaching in infants may confer advantages from both robotic and biological research perspectives. In this work, we present a biologically meaningful learning scheme applied to the coordination between reach and gaze within a robotic structure. The system is model-free and does not use a global reference system. The integration of reach and gaze emerges from the cross-modal mapping between reach space and vision space that is learned as the robot interacts with its environment. The scheme showed high learning speed and plasticity compared with other approaches, owing to the small amount of training data required. We discuss our findings with respect to biological plausibility and from an engineering perspective, with emphasis on autonomous learning and re-learning.
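
To make the idea of a learned cross-modal reach-gaze mapping concrete, the sketch below is a minimal illustration and not the scheme presented in the talk: it learns pairs of (reach command, gaze observation) through random motor babbling and then reaches for a visual target by a soft nearest-neighbour lookup in the learned memory, without any world-centred reference frame. The planar two-joint arm, the function names, and all parameters are hypothetical stand-ins for the real robot-environment interaction.

```python
# Illustrative sketch only: model-free cross-modal mapping between reach
# (joint) space and vision (gaze) space, learned by motor babbling.
import numpy as np

rng = np.random.default_rng(0)


def observe_hand(joint_angles):
    """Simulated 'gaze' observation: where the hand appears in camera
    coordinates for a given arm posture (stand-in for real vision)."""
    q1, q2 = joint_angles
    x = np.cos(q1) + 0.8 * np.cos(q1 + q2)
    y = np.sin(q1) + 0.8 * np.sin(q1 + q2)
    return np.array([x, y])


# Learning phase: motor babbling builds the cross-modal memory.
# Each sample pairs a reach command (joint space) with the resulting gaze
# observation (vision space); no global coordinates are ever stored.
reach_samples, gaze_samples = [], []
for _ in range(500):
    q = rng.uniform([-np.pi / 2, 0.0], [np.pi / 2, np.pi], size=2)
    reach_samples.append(q)
    gaze_samples.append(observe_hand(q))
reach_samples = np.array(reach_samples)
gaze_samples = np.array(gaze_samples)


def reach_towards(target_in_gaze, k=5):
    """Model-free inverse mapping: average the reach commands of the k
    visually closest remembered samples (soft nearest-neighbour lookup)."""
    d = np.linalg.norm(gaze_samples - target_in_gaze, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + 1e-6)
    return (reach_samples[idx] * w[:, None]).sum(0) / w.sum()


# Test: fixate a reachable visual target and reach for it from memory.
target = observe_hand(np.array([0.3, 1.2]))
q_cmd = reach_towards(target)
print("gaze-space reaching error:", np.linalg.norm(observe_hand(q_cmd) - target))
```

Because the memory can simply be extended or overwritten with new babbling samples, re-learning after a change in the arm or the camera amounts to collecting a modest number of fresh pairs, which is one way to read the plasticity and low data requirement claimed in the abstract.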

This talk is part of the Artificial Intelligence and Natural Computation seminars series.
