Unsupervised sensorimotor integration for task learning in robot manipulators.


If you have a question about this talk, please contact Hector Basevi.

Host: Dr Claudio Zito

Abstract: Past research has shown that it is possible for robot systems to learn to integrate unlabelled sensory data into self-organised multisensory features that correspond to higher-level structure in the task space. In this talk we discuss the design of a similar system that learns high-level features directly from robot sensory data. Inspired by the denoising autoencoder architecture, we train a deep neural network to regenerate sensory or motor sequences from partial input in which some modalities have been dropped out. In addition to the benefits that multisensory integration affords in generalisation capacity and robustness to sensing error, these mappings can be applied to trajectory generation for desired sensory sequences, or to sensory anticipation / visual servoing with respect to desired trajectories.
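To make the training scheme concrete, the following is a minimal sketch (not the speaker's implementation) of a modality-dropout denoising autoencoder in PyTorch. The modality names, dimensions, and layer sizes are assumptions chosen purely for illustration: zeroing out one modality at training time forces the network to reconstruct it from the other, yielding a shared multisensory code.

# Hypothetical sketch of a modality-dropout denoising autoencoder.
# Modality names, dimensions, and layer sizes are illustrative assumptions.
import torch
import torch.nn as nn

class ModalityDropoutAutoencoder(nn.Module):
    def __init__(self, vision_dim=64, proprio_dim=16, hidden_dim=128, code_dim=32):
        super().__init__()
        input_dim = vision_dim + proprio_dim
        # Encoder maps the concatenated (possibly corrupted) modalities
        # to a shared multisensory code.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, code_dim),
        )
        # Decoder regenerates the full, uncorrupted sensorimotor vector.
        self.decoder = nn.Sequential(
            nn.Linear(code_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, input_dim),
        )

    def forward(self, vision, proprio, drop=None):
        # drop: None, "vision", or "proprio" -- zero out one modality
        # so the network must reconstruct it from the other.
        if drop == "vision":
            vision = torch.zeros_like(vision)
        elif drop == "proprio":
            proprio = torch.zeros_like(proprio)
        x = torch.cat([vision, proprio], dim=-1)
        return self.decoder(self.encoder(x))

# Training step: reconstruct the clean concatenation from corrupted input.
model = ModalityDropoutAutoencoder()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

vision = torch.randn(8, 64)    # placeholder visual features
proprio = torch.randn(8, 16)   # placeholder joint/motor readings
target = torch.cat([vision, proprio], dim=-1)

for drop in (None, "vision", "proprio"):
    recon = model(vision, proprio, drop=drop)
    loss = loss_fn(recon, target)
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()

Once trained, the same network supports the uses mentioned above: feeding a desired sensory sequence with the motor modality dropped yields a candidate trajectory, while feeding a planned trajectory with the sensory modality dropped yields the anticipated sensory outcome.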

This talk is part of the Artificial Intelligence and Natural Computation seminars series.
