Information geometry

Historically, information geometry can be traced back to the work of C. R. Rao, who was the first to treat the Fisher matrix as a Riemannian metric.[2][3]

The modern theory is largely due to Shun'ichi Amari, whose work has been greatly influential on the development of the field.[5]

All the geometric structures presented above find application in information theory and machine learning.
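Rao's idea of treating the Fisher information as a metric can be illustrated numerically. The following is a minimal sketch (not from the article) for a one-parameter Bernoulli model, where the Fisher information is the expected squared score and has the closed form 1 / (p(1 − p)):

```python
# Fisher information of a Bernoulli(p) model, computed as the
# expected squared score E[(d/dp log f(x; p))^2] over x in {0, 1}.
# Illustrative sketch; the function name is ours, not a library API.
def fisher_bernoulli(p):
    info = 0.0
    for x in (0, 1):
        prob = p if x == 1 else 1.0 - p
        score = x / p - (1 - x) / (1.0 - p)  # d/dp log f(x; p)
        info += prob * score ** 2
    return info

# Closed form for comparison: I(p) = 1 / (p (1 - p)).
p = 0.3
print(fisher_bernoulli(p), 1.0 / (p * (1.0 - p)))
```

Viewed as a Riemannian metric, I(p) stretches distances near p = 0 and p = 1, where the model is most sensitive to parameter changes.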

In this case, the manifold naturally inherits two flat affine connections, as well as a canonical Bregman divergence.
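A Bregman divergence is generated by any strictly convex function F via D_F(p, q) = F(p) − F(q) − ⟨∇F(q), p − q⟩. The sketch below (a generic illustration, not tied to the article's specific construction) shows two standard choices of generator: the squared norm, which recovers half the squared Euclidean distance, and negative entropy, which recovers the Kullback–Leibler divergence on probability vectors:

```python
import numpy as np

# Generic Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>
# for a strictly convex generator F with gradient grad_F.
def bregman(F, grad_F, p, q):
    return F(p) - F(q) - np.dot(grad_F(q), p - q)

# Generator F(x) = 0.5 ||x||^2 yields half the squared Euclidean distance.
sq = lambda x: 0.5 * np.dot(x, x)
sq_grad = lambda x: x

# Generator F(x) = sum x log x (negative entropy) yields the KL
# divergence when p and q are probability vectors.
negent = lambda x: np.sum(x * np.log(x))
negent_grad = lambda x: np.log(x) + 1.0

p = np.array([0.2, 0.8])
q = np.array([0.5, 0.5])
print(bregman(sq, sq_grad, p, q))          # half squared distance
print(bregman(negent, negent_grad, p, q))  # equals KL(p || q)
```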

The results combine techniques from information theory, affine differential geometry, convex analysis and many other fields.

The set of all normal distributions forms a statistical manifold with hyperbolic geometry.
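The hyperbolic structure comes from the Fisher metric of the normal family, ds² = (dμ² + 2 dσ²) / σ², a constant multiple of the Poincaré half-plane metric. A small numerical check (our sketch, under the standard fact that KL divergence agrees with half this quadratic form to second order) uses the closed-form KL divergence between univariate normals:

```python
import math

# Closed-form KL divergence KL(N(mu0, s0^2) || N(mu1, s1^2)).
def kl_normal(mu0, s0, mu1, s1):
    return math.log(s1 / s0) + (s0 ** 2 + (mu0 - mu1) ** 2) / (2 * s1 ** 2) - 0.5

# The Fisher metric of the normal family is
#   ds^2 = (dmu^2 + 2 dsigma^2) / sigma^2,
# a multiple of the hyperbolic (Poincare half-plane) metric.
# KL to second order recovers half this quadratic form:
mu, s = 0.0, 2.0
dmu, ds = 1e-4, 1e-4
kl = kl_normal(mu, s, mu + dmu, s + ds)
quad = 0.5 * (dmu ** 2 + 2 * ds ** 2) / s ** 2
print(kl, quad)  # the two values agree to leading order
```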