
Learning a high dimensional structured matrix and multi-task learning


If you have a question about this talk, please contact Per Kristian Lehre.

Note unusual time

This talk addresses the problem of learning a high-dimensional matrix from noisy linear measurements. A main motivating application is multi-task learning, in which the matrix columns correspond to different regression or binary classification tasks. Our learning method consists of solving an optimization problem that combines a data term and a penalty term. We will discuss three families of penalty terms: quadratic, structured-sparse and spectral. Each implements a different type of matrix structure: the quadratic penalty may encourage certain linear relationships across the tasks, the structured-sparse penalty may favor tasks that share similar sparsity patterns, and the spectral penalty may favor low-rank matrices. We will present an efficient algorithm for solving the optimization problem and report on numerical experiments comparing the different methods. Finally, we will discuss how these ideas can be extended to learn non-linear task functions by means of reproducing kernel Hilbert spaces.
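
As a rough sketch (the notation and the specific penalty choices below follow standard multi-task learning formulations and are assumptions on our part, not necessarily the exact objective used in the talk), the optimization problem has the form

\[
\min_{W \in \mathbb{R}^{d \times T}} \; \sum_{t=1}^{T} \lVert X_t w_t - y_t \rVert_2^2 \;+\; \lambda\, \Omega(W),
\]

where $w_t$ is the $t$-th column of $W$ (the weight vector of task $t$), $X_t$ and $y_t$ are the data for task $t$, and $\lambda > 0$ trades off data fit against the penalty $\Omega$. Typical instances of the three penalty families are a quadratic penalty $\Omega(W) = \operatorname{tr}(W E W^{\top})$ with a positive semidefinite $T \times T$ matrix $E$ encoding linear relations across tasks, a structured-sparse penalty such as the $\ell_{2,1}$ norm $\Omega(W) = \sum_{j=1}^{d} \lVert W_{j,:} \rVert_2$, which favors features shared across tasks, and a spectral penalty such as the trace norm $\Omega(W) = \lVert W \rVert_{*}$ (the sum of the singular values of $W$), which favors low-rank matrices.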

This talk is part of the Artificial Intelligence and Natural Computation seminars series.

