Fisher's information measure (FIM), named after Ronald Fisher (1925), is a different kind of measure in two respects: 1) it reflects the amount of (positive) information available to the observer, and 2) it depends not only on the PD but also on its first derivatives, a property that makes it a local quantity (Shannon's is instead a global one).
The counterpart of MaxEnt is now FIM minimization (MFI), since Fisher's measure grows when Shannon's diminishes, and vice versa.
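For a one-dimensional PD f(x), and restricting ourselves for simplicity to the shift-invariant (translation-family) form of Fisher's measure, the two quantities read

\[
S[f] = -\int dx\, f(x)\,\ln f(x),
\qquad
I[f] = \int dx\, f(x)\left[\frac{d\ln f(x)}{dx}\right]^{2}
     = \int dx\, \frac{[f'(x)]^{2}}{f(x)} ,
\]

where the derivative f'(x) appearing in I is what makes the measure local, while S depends only on the pointwise values of f.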
Much effort has been devoted to Fisher's information measure, shedding considerable light on its manifold physical applications [1-15]. As a small sample, it can be shown that the whole field of thermodynamics (both equilibrium and non-equilibrium) can be derived from the MFI approach [17].
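To make this statement concrete, we sketch the bare bones of the one-dimensional MFI variational problem (sign and normalization conventions here are ours and need not coincide with those of [17]): Fisher's measure is extremized subject to normalization and to a set of constraints on the expectation values of observables A_k(x), and the substitution f = \psi^2 turns the Euler-Lagrange equation into a Schrödinger-like equation,

\[
\delta_{f}\Big\{ I[f] - \alpha\int dx\, f(x) - \sum_{k}\lambda_{k}\int dx\, A_{k}(x)\, f(x) \Big\} = 0 ,
\qquad f(x)=\psi^{2}(x),
\]
\[
I[f] = 4\int dx\, [\psi'(x)]^{2}
\;\Longrightarrow\;
4\,\psi''(x) + \Big[\alpha + \sum_{k}\lambda_{k}A_{k}(x)\Big]\psi(x) = 0 ,
\]

with the Lagrange multipliers \alpha and \lambda_k fixed by normalization and by the prescribed values of \langle A_k \rangle.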
More recently, Zipf's law has been shown to arise as the variational solution of the MFI problem when scale invariance is introduced in the measure, providing for the first time an explanation of this regularity from first principles.
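A quick heuristic check, which is not the full variational derivation of the cited work, shows why scale invariance singles out Zipf's law: the change of variable u = \ln x maps p(x) = C/x on [a,b] onto a uniform density in u,

\[
u = \ln x \;\Rightarrow\; q(u) = p(e^{u})\,e^{u} = C ,
\qquad
I[q] = \int du\, \frac{[q'(u)]^{2}}{q(u)} = 0 ,
\]

so that, in the scale-invariant variable, Zipf's law is the distribution of vanishing (hence minimal) Fisher information, boundary terms aside.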