
Online learning for Tracking-by-Detection using P/N-Constraints


If you have a question about this talk, please contact Leandro Minku.

A new Tracking-by-Detection method for general objects visible in a single video stream has recently appeared in the Computer Vision community. Given an initial sample of a particular object, the method learns a detector of the object's appearance online while simultaneously tracking the object frame by frame. The detector shows promising performance in terms of computation time, precision and recall. This talk will focus on the semi-supervised principle used in this method to learn the object detector online from so-called Positive (P) and Negative (N) constraints. The P-constraint allows unlabelled samples of the object's appearance to be labelled, while the N-constraint enables reliable pruning of false positives. It has been shown empirically, and under certain assumptions to some extent analytically, that the error-cancelling property of the N-constraint prevents the learning process from drifting away over time from the underlying set of possible object appearances. At the end of the talk we would like to critically discuss with the audience the novelty of this learning principle from the viewpoint of Machine Learning experts, and to identify promising existing or new learning mechanisms for Tracking-by-Detection with multiple camera views.
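To make the P/N principle concrete, the following is a minimal toy sketch (not the authors' implementation): a nearest-neighbour detector over a hypothetical 1-D appearance feature, where the P-constraint adds tracker-validated samples the detector missed, and the N-constraint adds false positives (detections far from the tracked position) as negatives.

```python
class PNDetector:
    """Toy online detector updated with P/N-constraints (illustrative sketch).

    Classifies a 1-D appearance feature by comparing its distance to the
    nearest stored positive vs. the nearest stored negative example.
    """

    def __init__(self):
        self.positives = []  # stored object appearances
        self.negatives = []  # stored background appearances

    def predict(self, x):
        # Nearest-neighbour decision between positive and negative stores.
        if not self.positives:
            return False
        dist_pos = min(abs(x - p) for p in self.positives)
        dist_neg = min((abs(x - n) for n in self.negatives), default=float("inf"))
        return dist_pos <= dist_neg

    def p_constraint(self, x):
        # P-constraint: the tracker asserts this sample IS the object,
        # so a missed detection is added as a new positive example.
        if not self.predict(x):
            self.positives.append(x)

    def n_constraint(self, x):
        # N-constraint: the sample is known NOT to be the object
        # (e.g. a detection far from the tracked position), so a
        # false positive is pruned by storing it as a negative example.
        if self.predict(x):
            self.negatives.append(x)


detector = PNDetector()
detector.p_constraint(1.0)                  # initial sample of the object
for frame_feature in [1.1, 0.9, 1.05]:
    detector.p_constraint(frame_feature)    # samples validated by the tracker
for background in [5.0, 4.8]:
    detector.n_constraint(background)       # detections far from the tracker
print(detector.predict(1.02))  # True  - near the learned appearances
print(detector.predict(4.9))   # False - rejected as background
```

The error-cancelling interplay is visible even in this toy: a positive wrongly accepted by the detector would later be countered by the N-constraint storing it as a negative, keeping the positive store from drifting toward background appearances.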

This talk is part of the Artificial Intelligence and Natural Computation seminars series.




Talks@bham, University of Birmingham. Talks@bham is based on talks.cam from the University of Cambridge.