Thursday, June 12, 2014

Geodesic Convexity and Covariance Estimation, Medians and means in Riemannian geometry, MaxEnt'14

Following up on this morning's entry, Mike Davies sent me the following:

Hi Igor 
Just saw your recent post mentioning the use of the log det X function as a nonconvex alternative to the nuclear norm. This reminded me of a recent talk I saw by Ami Wiesel on geodesic convexity.
It turns out that if X is positive semidefinite then log det X is geodesically convex. My understanding is that this guarantees the uniqueness of the minimization process. So possibly not as nonconvex as first appears... 
All the best
Mike
Thank you, Mike!


The attendant paper is Geodesic Convexity and Covariance Estimation by Ami Wiesel:

Abstract—Geodesic convexity is a generalization of classical convexity which guarantees that all local minima of g-convex functions are globally optimal. We consider g-convex functions with positive definite matrix variables, and prove that Kronecker products and logarithms of determinants are g-convex. We apply these results to two modern covariance estimation problems: robust estimation in scaled Gaussian distributions, and Kronecker structured models. Maximum likelihood estimation in these settings involves non-convex minimizations. We show that these problems are in fact g-convex. This leads to straightforward analysis, allows the use of standard optimization methods and paves the road to various extensions via additional g-convex regularization.
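As a quick numerical sanity check (my own sketch, not from the paper): along the geodesic of the affine-invariant metric on positive definite matrices, γ(t) = X^{1/2}(X^{-1/2} Y X^{-1/2})^t X^{1/2}, the function log det is actually linear in t, hence both g-convex and g-concave:

```python
import numpy as np

def spd_power(M, p):
    """Matrix power of a symmetric positive definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(w ** p) @ V.T

def geodesic(X, Y, t):
    """Point at parameter t on the affine-invariant geodesic from X to Y."""
    Xh, Xih = spd_power(X, 0.5), spd_power(X, -0.5)
    return Xh @ spd_power(Xih @ Y @ Xih, t) @ Xh

rng = np.random.default_rng(0)
def random_spd(n):
    A = rng.standard_normal((n, n))
    return A @ A.T + n * np.eye(n)  # well-conditioned SPD matrix

X, Y = random_spd(4), random_spd(4)
logdet = lambda M: np.linalg.slogdet(M)[1]
for t in (0.25, 0.5, 0.75):
    # log det interpolates linearly along the geodesic: g-linear, hence g-convex
    assert np.isclose(logdet(geodesic(X, Y, t)),
                      (1 - t) * logdet(X) + t * logdet(Y))
```

This follows from det(γ(t)) = det(X) · (det(Y)/det(X))^t, so non-convexity of log det in the usual Euclidean sense is no obstruction along geodesics.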
This reminded me that I am long overdue for a blog entry on an exchange I had with Frédéric Barbaresco a while ago. It also leads me to wonder whether the median for PSD matrices is g-convex as well.

The abstract of the attendant paper on medians and means in Riemannian geometry reads:
This paper is a short summary of our recent work on the medians and means of probability measures in Riemannian manifolds. Firstly, the existence and uniqueness results of local medians are given. In order to compute medians in practical cases, we propose a subgradient algorithm and prove its convergence. After that, Fréchet medians are considered. We prove their statistical consistency and give some quantitative estimations of their robustness with the aid of upper curvature bounds. We also show that, in compact Riemannian manifolds, the Fréchet medians of generic data points are always unique. Stochastic and deterministic algorithms are proposed for computing Riemannian p-means. The rate of convergence and error estimates of these algorithms are also obtained. Finally, we apply the medians and the Riemannian geometry of Toeplitz covariance matrices to radar target detection.
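In flat Euclidean space, the kind of subgradient scheme mentioned in the abstract reduces to the classical Weiszfeld iteration for the geometric (Fréchet) median. Here is a minimal sketch of that Euclidean special case (my own illustration, not the paper's manifold algorithm):

```python
import numpy as np

def weiszfeld(points, iters=200, eps=1e-12):
    """Geometric (Fermat-Weber / Fréchet) median of points in R^d:
    minimizes the sum of Euclidean distances to the data points."""
    y = points.mean(axis=0)  # start from the mean
    for _ in range(iters):
        d = np.linalg.norm(points - y, axis=1)
        d = np.maximum(d, eps)  # guard against landing exactly on a data point
        w = 1.0 / d
        y = (w[:, None] * points).sum(axis=0) / w.sum()
    return y

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [10.0, 10.0]])
m = weiszfeld(pts)
# the median is robust: the outlier at (10, 10) barely moves it,
# unlike the mean, which is dragged toward the outlier
assert np.linalg.norm(m - pts.mean(axis=0)) > 1.0
```

On a manifold, the same idea goes through with distances and the update computed via the Riemannian exponential and logarithm maps, which is where the curvature bounds in the abstract come in.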

Frédéric recently mentioned to me that Gromov, Bennequin and Balian will be at MaxEnt'14 and that Geometric Theory of Information has just come out.


1 comment:

Anonymous said...

The foundation of all these geometries is given by the Hessian geometry of Jean-Louis Koszul. The important concept is the Koszul-Vinberg characteristic function. By selecting the inner product deduced from the Cartan-Killing form, we recover the natural geometry of covariance matrices.
This idea has been extended by Jean-Marie Souriau in statistical physics, where the metric is introduced through a symplectic cocycle, proving that the metric is invariant under dynamical groups in physics.
Entropy can be introduced as the Legendre transform of minus the logarithm of the Koszul-Vinberg characteristic function.
I have recently developed these Koszul and Souriau works in a long paper published in the Springer book "Geometric Theory of Information" edited by Frank Nielsen:
http://link.springer.com/chapter/10.1007/978-3-319-05317-2_7
This paper also deals with the quantum Fisher metric introduced by Roger Balian in 1986.
Another paper will be published soon in the MDPI journal "Entropy" with the title "Koszul Information Geometry and Geometric Temperature/Capacity of Souriau Lie Group Thermodynamics".
For MaxEnt'14, a special issue will be published in the MDPI journal "Entropy": Entropy, Information and their Geometric Structures
http://www.mdpi.com/journal/entropy/special_issues/entropy-Geome

Frédéric Barbaresco
