Solution of an old problem in vector optimisation: Support function of the generalised Jacobian
If you have a question about this talk, please contact Paul Taylor.
Many optimisation problems are non-smooth in the sense that the objective function is not differentiable. This is the case in many areas of application, where the objective function contains simple constructs such as the maximum or minimum of several differentiable functions, or the absolute value function. The notion of the Clarke sub-gradient of locally Lipschitz maps has been used successfully in past decades to solve non-smooth, non-convex optimisation problems by methods such as sub-gradient descent or the more powerful bundle method.
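As a concrete illustration of the method mentioned above, here is a minimal sketch (not taken from the talk) of sub-gradient descent on the non-smooth convex function f(x) = |x|; the function names and the 1/k step-size rule are illustrative choices, not part of the source.

```python
def subgrad_abs(x):
    """Return one Clarke sub-gradient of f(x) = |x|.

    Away from 0 the function is differentiable; at x = 0 the
    sub-gradient is the whole interval [-1, 1], and we may pick
    any element of it (here, 0).
    """
    return 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)


def subgradient_descent(x0, steps=1000):
    """Sub-gradient descent with diminishing step sizes 1/k.

    For convex non-smooth f, the *best* iterate converges to the
    minimum value even though the last iterate may oscillate.
    """
    x = x0
    best = abs(x)
    for k in range(1, steps + 1):
        x -= (1.0 / k) * subgrad_abs(x)
        best = min(best, abs(x))
    return best


print(subgradient_descent(5.0))  # close to the minimum value 0
```

Note that a fixed step size would not work here: near the minimum the sub-gradient has magnitude 1 rather than shrinking to 0, so the iterates would oscillate at a fixed amplitude, which is why diminishing steps (or a bundle method) are needed.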
The generalised Jacobian of a locally Lipschitz vector function, introduced by Clarke in the mid-1970s, plays the same role in non-smooth vector (multi-objective) optimisation that the (Clarke) sub-gradient, or sub-differential, plays in non-smooth scalar optimisation. The sub-gradient of a non-smooth objective function is a non-empty, compact and convex subset of the finite-dimensional Euclidean space of the same dimension as the domain of the objective function. Similarly, the generalised Jacobian is a non-empty, compact and convex subset of the space of real m×n matrices, where n is the dimension of the input and m the dimension of the output of the multi-objective function. However, until now, there has been a marked discrepancy in the way the two have been defined. The sub-gradient of a scalar function at a point was originally defined constructively by its support function, which provides a simple expression for its boundary. In contrast, the generalised Jacobian has been defined non-constructively, by using a theorem of Rademacher in analysis which states that every Lipschitz map between finite-dimensional Euclidean spaces is differentiable for almost all its arguments.
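For readers unfamiliar with the two definitions being contrasted, they can be stated as follows (standard material from Clarke's book, not specific to the talk). The Clarke sub-gradient of a locally Lipschitz $f:\mathbb{R}^n\to\mathbb{R}$ at $x$ is given constructively via its support function, the generalised directional derivative:
$$
f^{\circ}(x;v)=\limsup_{y\to x,\; t\downarrow 0}\frac{f(y+tv)-f(y)}{t},
\qquad
\partial f(x)=\{\xi\in\mathbb{R}^n : \langle \xi,v\rangle \le f^{\circ}(x;v)\ \text{for all } v\in\mathbb{R}^n\}.
$$
By contrast, the generalised Jacobian of a locally Lipschitz $F:\mathbb{R}^n\to\mathbb{R}^m$ is defined via Rademacher's theorem: if $\Omega_F$ denotes the null set on which $F$ fails to be differentiable, then
$$
\partial F(x)=\operatorname{co}\Bigl\{\lim_{i\to\infty} JF(x_i)\;:\; x_i\to x,\ x_i\notin\Omega_F\Bigr\},
$$
the convex hull of all limits of Jacobian matrices $JF(x_i)$ taken at points of differentiability. The second definition gives no direct way to compute the set, which is the gap the talk addresses.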
The non-constructive nature of this definition has created a major obstacle to the application of the generalised Jacobian in vector optimisation and several other fields of computation, such as the non-smooth Newton method or the solution of parametric ODEs with non-smooth right-hand side. There have been two attempts to tackle the problem of finding the support function of the generalised Jacobian. Hiriart-Urruty derived the support function of an approximation to the generalised Jacobian, namely its so-called plenary hull. Imbert used a Green-Stokes formula to obtain an intractable analytic expression for the support function of the generalised Jacobian that involves a limsup operation over surface integrals of the divergence of the derived function on a shrinking sequence of hypercubes.
In this talk, I will derive, some 40 years after the generalised Jacobian was introduced, a simple expression for computing its support function. The expression determines the boundary of the non-empty, compact, convex set that the generalised Jacobian represents in the space of m×n matrices. The derivation is elementary, does not invoke Rademacher's theorem, and is in the same spirit as that of the sub-gradient of a scalar objective function. Moreover, I will show that the same technique can be used to compute the generalised Jacobian of locally Lipschitz multi-objective functions defined on an infinite-dimensional Banach space, which allows optimisation over function spaces.
F. H. Clarke, Optimization and Nonsmooth Analysis, Society for Industrial and Applied Mathematics (SIAM), 1990.
J.-B. Hiriart-Urruty, Characterizations of the plenary hull of the generalized Jacobian matrix, Mathematical Programming Study 17 (1982) 1–12.
C. Imbert, Support functions of the Clarke generalized Jacobian and of its plenary hull, Nonlinear Analysis: Theory, Methods & Applications 49(8) (2002) 1111–1125.
This talk is part of the Theoretical computer science seminar series.