
Biased mobile robots? Characterizing and addressing issues of fairness in mobile robotics


If you have a question about this talk, please contact Hector Basevi.

Host: Dr Iran Mansouri

Abstract: Machine learning has recently been the object of various criticisms regarding issues of bias, fairness and discrimination. From criminal recidivism prediction to facial analysis and natural language processing, there have been many examples of algorithms that perform worse on ethnic minorities, or that otherwise reproduce social stereotypes. In this talk I will argue that similar issues of bias and fairness are present in the design of algorithms for robotics. I will focus on two case studies in mobile robotics: 1) pedestrian detection and 2) navigation/path planning. 1) Based on an analysis of state-of-the-art pedestrian detection algorithms, I will show that some pedestrians are currently more likely than others to be victims of collisions with robots, and that the sources of this bias are many and relate to a complex socio-technical reality. 2) Through the use case of a rescue robot, I will argue that robot navigation paths can indirectly discriminate, and I will formally define a few notions of fairness that could be used directly in motion planners. I will use these to characterize fairness, and to show the existence of counter-productive fairness definitions. I will conclude with some general insights on how we can correctly consider fairness when designing robots.
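The first case study concerns detectors whose miss rates differ across demographic groups. As a toy illustration of the kind of disparity metric involved (this is a hypothetical sketch, not the speaker's method; the `group_recall` helper and the data are invented for illustration), one can compare per-group detection recall:

```python
# Hypothetical sketch: quantifying per-group disparity in pedestrian detection.
# Each ground-truth pedestrian is tagged with a (synthetic) group label and
# whether the detector found them; we compare recall across groups.
from collections import defaultdict

def group_recall(detections):
    """detections: iterable of (group, detected) pairs, one per pedestrian."""
    hits, totals = defaultdict(int), defaultdict(int)
    for group, detected in detections:
        totals[group] += 1
        hits[group] += int(detected)
    return {g: hits[g] / totals[g] for g in totals}

# Invented toy data: group "A" is detected more reliably than group "B".
data = [("A", True)] * 9 + [("A", False)] + [("B", True)] * 6 + [("B", False)] * 4
recall = group_recall(data)                       # {"A": 0.9, "B": 0.6}
disparity = max(recall.values()) - min(recall.values())  # 0.3
```

A gap like this matters operationally: a robot that misses some pedestrians more often exposes them to a higher collision risk, which is the sense in which a detector can be "biased".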


This talk is part of the Artificial Intelligence and Natural Computation seminars series.



Talks@bham, University of Birmingham.
talks@bham is based on talks.cam from the University of Cambridge.