You like to move it? Pose Estimation and Tracking for Collaborative Robotics and Medical Sensor Fusion

If you have a question about this talk, please contact Hector Basevi.

Host: Prof. Ales Leonardis (a.leonardis@cs.bham.ac.uk)

Abstract: An adequate understanding of the 3D surroundings is crucial for seamless human-robot interaction and for spatially fusing sensor data. Modern vision systems aim to enable robotic collaboration and to combine input from multiple modalities. Such systems require a robust, flexible, and reliable interpretation of the geometry in view, and real-time pose computation is essential.

We describe tracking approaches commonly used in the medical field and discuss their advantages and pitfalls with regard to sensor fusion and 3D data extraction. While most clinically used optical tracking systems rely on rigid-body markers, we analyse alternative solutions and show their practical benefits, ranging from flexible marker setups to fully markerless tracking that overcomes projection-induced object ambiguities.
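As background for the rigid-body-marker approach mentioned in the abstract (not a description of the speaker's own method): optical tracking systems of this kind typically recover a tool pose by aligning the measured 3D marker positions to the tool's known marker geometry via least-squares point-set registration (the Kabsch/Umeyama method). The sketch below illustrates this idea; the function and variable names are illustrative assumptions, not part of any system discussed in the talk.

```python
import numpy as np

def estimate_rigid_pose(template_pts, measured_pts):
    """Estimate rotation R and translation t mapping template_pts onto measured_pts.

    template_pts, measured_pts: (N, 3) arrays of corresponding 3D marker positions.
    Returns (R, t) such that measured ~ R @ template + t in the least-squares sense.
    """
    # Centre both point sets on their centroids.
    c_t = template_pts.mean(axis=0)
    c_m = measured_pts.mean(axis=0)
    A = template_pts - c_t
    B = measured_pts - c_m

    # Cross-covariance matrix and its SVD yield the optimal rotation (Kabsch).
    H = A.T @ B
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against an improper (reflected) solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_m - R @ c_t
    return R, t

# Minimal usage example with a synthetic 4-marker rigid body.
template = np.array([[0.0, 0.0, 0.0],
                     [50.0, 0.0, 0.0],
                     [0.0, 30.0, 0.0],
                     [0.0, 0.0, 20.0]])
true_R = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
true_t = np.array([100.0, -20.0, 300.0])
measured = template @ true_R.T + true_t
R_est, t_est = estimate_rigid_pose(template, measured)
```

Markerless tracking, by contrast, cannot rely on such known point correspondences and must resolve object pose directly from image or depth data, which is where the projection-induced ambiguities mentioned above become the central difficulty.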

Website: http://campar.in.tum.de/Main/BenjaminBusam

This talk is part of the Artificial Intelligence and Natural Computation seminars series.
