[1][2][3][4][5] It is one of the central quantities used to quantify the utility of an input state, especially in Mach–Zehnder (or, equivalently, Ramsey) interferometer-based phase or parameter estimation.
When the observable $A$ generates a unitary transformation of the system with a parameter $\theta$ starting from the initial state $\varrho_0$,
$$\varrho(\theta) = \exp(-iA\theta)\,\varrho_0\,\exp(+iA\theta),$$
the quantum Fisher information constrains the achievable precision in statistical estimation of the parameter $\theta$ via the quantum Cramér–Rao bound
$$(\Delta\theta)^2 \ge \frac{1}{m\,F_{\rm Q}[\varrho,A]},$$
where $m$ is the number of independent repetitions.
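For orientation, the following worked lines (added here as an illustration, relying on the pure-state identity $F_{\rm Q}=4(\Delta A)^2$ discussed later in this article) show what the bound means for a pure probe state:

```latex
% Illustration: quantum Cramér–Rao bound for a pure probe state |Psi>.
% Uses the pure-state identity F_Q = 4 (Delta A)^2, stated later in the text.
F_{\rm Q}\big[|\Psi\rangle, A\big] = 4\,(\Delta A)^2_{\Psi}
\qquad\Longrightarrow\qquad
(\Delta\theta)^2 \;\ge\; \frac{1}{m\,F_{\rm Q}} = \frac{1}{4\,m\,(\Delta A)^2_{\Psi}} .
```

Thus, a larger spread of the generator $A$ in the probe state allows for a smaller estimation uncertainty.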
The quantum Fisher information can also be expressed with the symmetric logarithmic derivative $L$, defined implicitly by
$$\partial_\theta \varrho = \tfrac{1}{2}\left(L\varrho + \varrho L\right),$$
as $F_{\rm Q}[\varrho,A] = \mathrm{Tr}(\varrho L^{2})$. For a unitary encoding operation, $L$ has the matrix elements $L_{kl} = 2(\partial_\theta\varrho)_{kl}/(\lambda_k+\lambda_l)$ in the eigenbasis of the density matrix, where the $\lambda_k$ are the eigenvalues of $\varrho$. For non-invertible density matrices, the inverse of $\lambda_k+\lambda_l$ above is substituted by the Moore–Penrose pseudoinverse, i.e., terms with $\lambda_k+\lambda_l=0$ are omitted.
Alternatively, one can compute the quantum Fisher information for the invertible state $\varrho_{\nu} = (1-\nu)\,\varrho + \nu\,\tfrac{\mathbb{1}}{d}$, where $d$ is the dimension of the system, and then take the limit $\nu\to 0^{+}$.
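To illustrate how such a computation can look in practice, here is a minimal numerical sketch (not part of the original article) that evaluates the quantum Fisher information for a unitary encoding from the eigendecomposition of the density matrix, using the standard spectral formula $F_{\rm Q}[\varrho,A]=2\sum_{k,l}\tfrac{(\lambda_k-\lambda_l)^2}{\lambda_k+\lambda_l}|\langle k|A|l\rangle|^2$; skipping terms with $\lambda_k+\lambda_l\approx 0$ implements the pseudoinverse prescription mentioned above.

```python
import numpy as np

def quantum_fisher_information(rho, A, tol=1e-12):
    """Spectral formula for the QFI of rho with respect to the generator A
    (unitary encoding). Terms with lambda_k + lambda_l ~ 0 are dropped,
    which corresponds to using the Moore-Penrose pseudoinverse."""
    lam, vecs = np.linalg.eigh(rho)        # eigenvalues and eigenvectors of rho
    A_eig = vecs.conj().T @ A @ vecs       # A in the eigenbasis of rho
    fq = 0.0
    for k in range(len(lam)):
        for l in range(len(lam)):
            denom = lam[k] + lam[l]
            if denom > tol:
                fq += 2 * (lam[k] - lam[l])**2 / denom * abs(A_eig[k, l])**2
    return fq

# Example: pure state |0> with generator A = sigma_x / 2, for which F_Q = 4 * variance = 1.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
rho_pure = np.array([[1, 0], [0, 0]], dtype=complex)
print(quantum_fisher_information(rho_pure, sx / 2))   # ~1.0
```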
When the state depends on a vector of parameters $\boldsymbol{\theta}=(\theta_1,\theta_2,\dots)$, the quantum Fisher information matrix can be written in the eigenbasis of the density matrix as
$$F_{ij} = 2\sum_{k,l}\frac{\operatorname{Re}\!\left(\langle k|\partial_i\varrho|l\rangle\langle l|\partial_j\varrho|k\rangle\right)}{\lambda_k+\lambda_l},$$
where the sum runs over indices with $\lambda_k+\lambda_l>0$ and taking the real part is optional, because the imaginary part leads to an antisymmetric contribution that disappears under the sum. Here, the eigenvalues $\lambda_k$ and eigenvectors $|k\rangle$ of the density matrix potentially depend on the vector of parameters $\boldsymbol{\theta}$.
The quantum Fisher information matrix can also be obtained from the expansion of the fidelity between states at nearby parameter values; at points where the rank of the density matrix changes, this expansion acquires an extra term. The extra term (which is however zero in most applications) can be avoided by taking a symmetric expansion of the fidelity.[11] For a single parameter and unitary encoding, the quantum Fisher information matrix reduces to the original definition.
If $\varrho(\theta)=|\Psi_0(\theta)\rangle\langle\Psi_0(\theta)|$ is the ground state of a parameter-dependent Hamiltonian, the quantum Fisher information of this state coincides, up to a factor of four, with the fidelity susceptibility, denoted[13] $\chi_F$. The fidelity susceptibility measures the sensitivity of the ground state to the parameter, and its divergence indicates a quantum phase transition. This is because of the aforementioned connection with fidelity: a diverging quantum Fisher information means that the ground states at infinitesimally different parameter values become rapidly distinguishable, i.e., the fidelity between $\varrho(\theta)$ and $\varrho(\theta+\delta\theta)$ drops sharply even for small $\delta\theta$.
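The connection with fidelity also suggests a simple numerical recipe: estimate the quantum Fisher information of a ground state from the overlap of ground states at nearby parameter values. The sketch below is only an illustration; it assumes a toy single-qubit Hamiltonian $H(\theta)=\cos\theta\,\sigma_z+\sin\theta\,\sigma_x$ and the pure-state expansion $|\langle\Psi_0(\theta)|\Psi_0(\theta+\delta)\rangle|^2\approx 1-\tfrac14 F_{\rm Q}\,\delta^2$.

```python
import numpy as np

def ground_state(theta):
    """Ground state of the toy Hamiltonian H(theta) = cos(theta) Z + sin(theta) X."""
    Z = np.array([[1, 0], [0, -1]], dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    H = np.cos(theta) * Z + np.sin(theta) * X
    _, vecs = np.linalg.eigh(H)
    return vecs[:, 0]                      # eigenvector of the smallest eigenvalue

def qfi_from_fidelity(theta, delta=1e-4):
    """Finite-difference estimate: F_Q ~ 4 (1 - |<psi(theta)|psi(theta+delta)>|^2) / delta^2."""
    overlap = abs(np.vdot(ground_state(theta), ground_state(theta + delta)))**2
    return 4 * (1 - overlap) / delta**2

print(qfi_from_fidelity(0.3))   # ~1.0 (F_Q = 1 for this model)
```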
For density matrices $\varrho_1$, $\varrho_2$ and $0\le p\le 1$, the quantum Fisher information is convex:
$$F_{\rm Q}[p\varrho_1+(1-p)\varrho_2, A] \le p\,F_{\rm Q}[\varrho_1, A] + (1-p)\,F_{\rm Q}[\varrho_2, A].$$
The quantum Fisher information is the largest function that is convex and that equals four times the variance for pure states. That is, it equals four times the convex roof of the variance,[14][15]
$$F_{\rm Q}[\varrho, A] = 4\inf_{\{p_k,|\Psi_k\rangle\}} \sum_k p_k\, (\Delta A)^2_{\Psi_k},$$
where the infimum is over all decompositions of the density matrix
$$\varrho = \sum_k p_k |\Psi_k\rangle\langle\Psi_k|.$$
Note that the pure states $|\Psi_k\rangle$ do not need to be orthogonal to each other.
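As a standard illustration (added here, not taken from the cited references), consider a Greenberger–Horne–Zeilinger (GHZ) state of $N$ spin-$\tfrac12$ particles and the collective generator $J_z=\tfrac12\sum_{n}\sigma_z^{(n)}$; for a pure state the convex roof is trivial, so the quantum Fisher information is simply four times the variance:

```latex
% Illustrative example: GHZ state and the collective spin generator J_z.
|\mathrm{GHZ}\rangle = \tfrac{1}{\sqrt 2}\left(|0\rangle^{\otimes N}+|1\rangle^{\otimes N}\right),
\qquad
\langle J_z\rangle = 0,\quad \langle J_z^2\rangle = \tfrac{N^2}{4},
\qquad
F_{\rm Q}\big[|\mathrm{GHZ}\rangle, J_z\big] = 4\,(\Delta J_z)^2 = N^2 .
```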
Here, $\varrho_{\rm ss}$ denotes a symmetric separable state. Later, the above statement was proved even for the case of a minimization over general (not necessarily symmetric) separable states.
The quantum Fisher information also obeys the extended convexity relation
$$F_{\rm Q}\!\left[\sum_k p_k(\theta)\,\varrho_k(\theta)\right] \le \sum_k p_k(\theta)\,F_{\rm Q}[\varrho_k(\theta)] + F_{\rm C}\big[\{p_k(\theta)\}\big],$$
where $F_{\rm C}$ is the classical Fisher information associated to the probabilities contributing to the convex decomposition. The first term on the right-hand side of the above inequality can be considered as the average quantum Fisher information of the density matrices in the convex decomposition.
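A simple special case (added here for illustration, assuming the inequality in the form reconstructed above): if the components $\varrho_k$ do not depend on the parameter and only the mixing probabilities $p_k(\theta)$ do, then every $F_{\rm Q}[\varrho_k]$ vanishes and the bound becomes purely classical,

```latex
% Special case: theta enters only through the mixing probabilities p_k(theta).
F_{\rm Q}\!\left[\textstyle\sum_k p_k(\theta)\,\varrho_k\right] \;\le\; F_{\rm C}\big[\{p_k(\theta)\}\big] ,
```

that is, classical mixing alone cannot encode more information about $\theta$ than the classical Fisher information of the weights.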
We need to understand the behavior of the quantum Fisher information in composite systems in order to study quantum metrology of many-particle systems.
[21][22] It is possible to obtain a weaker but simpler bound,[23]
$$F_{\rm Q}[\varrho, J_z] \le N k,$$
which holds for $N$-particle states containing at most $k$-particle entanglement. Hence, a lower bound on the entanglement depth is obtained as
$$\frac{F_{\rm Q}[\varrho, J_z]}{N} \le k.$$
A related concept is the quantum metrological gain, which for a given Hamiltonian $\mathcal{H}$ is defined as the ratio of the quantum Fisher information of a state and the maximum of the quantum Fisher information for the same Hamiltonian over separable states,
$$g_{\mathcal{H}}(\varrho) = \frac{F_{\rm Q}[\varrho, \mathcal{H}]}{\max_{\varrho_{\rm sep}} F_{\rm Q}[\varrho_{\rm sep}, \mathcal{H}]}.$$
The metrological gain is then defined by an optimization over all local Hamiltonians as
$$g(\varrho) = \max_{\mathcal{H}\ \mathrm{local}} g_{\mathcal{H}}(\varrho).$$
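Continuing the GHZ illustration from above (again an added example, using the separable bound $F_{\rm Q}[\varrho_{\rm sep},J_z]\le N$ for $N$ spin-$\tfrac12$ particles), the metrological gain and the entanglement-depth bound take particularly simple values:

```latex
% GHZ state of N spin-1/2 particles, generator J_z:
g_{J_z}\big(|\mathrm{GHZ}\rangle\big)
 = \frac{F_{\rm Q}[\mathrm{GHZ},J_z]}{\max_{\varrho_{\rm sep}}F_{\rm Q}[\varrho_{\rm sep},J_z]}
 = \frac{N^2}{N} = N ,
\qquad
\frac{F_{\rm Q}[\mathrm{GHZ},J_z]}{N} = N \;\le\; k ,
```

so the state reaches the Heisenberg limit and its entanglement depth is (at least) $N$.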
The error propagation formula gives a lower bound on the quantum Fisher information,
$$F_{\rm Q}[\varrho, A] \ge \frac{\left|\langle i[M, A]\rangle_\varrho\right|^2}{(\Delta M)^2_\varrho},$$
where $\langle M\rangle_\varrho$ and $(\Delta M)^2_\varrho$ denote the expectation value and the variance of an operator $M$ in the state $\varrho$. This formula can be used to put a lower bound on the quantum Fisher information from experimental results. If $M$ equals the symmetric logarithmic derivative, then the inequality is saturated.
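The following short numerical sketch (an added illustration) evaluates the error-propagation lower bound for a particular measured operator $M$ and compares it with the quantum Fisher information of a pure probe state, for which $F_{\rm Q}=4(\Delta A)^2$; in this example $M$ happens to be proportional to the symmetric logarithmic derivative, so the bound is saturated.

```python
import numpy as np

# Pauli matrices
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def expval(op, psi):
    return np.vdot(psi, op @ psi).real

def variance(op, psi):
    return expval(op @ op, psi) - expval(op, psi)**2

# Probe: real superposition state; generator A = sz/2 (phase encoding exp(-i A theta)).
psi = np.array([np.cos(0.3), np.sin(0.3)], dtype=complex)
A = sz / 2

fq = 4 * variance(A, psi)                 # QFI of a pure state

# Error-propagation bound for the measured operator M = sigma_y.
M = sy
commutator = 1j * (M @ A - A @ M)         # i[M, A], a Hermitian operator
bound = expval(commutator, psi)**2 / variance(M, psi)

# Here the two values coincide: sigma_y is proportional to the symmetric
# logarithmic derivative for this probe, so the bound is saturated.
print(fq, bound)
```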
[25] For the case of unitary dynamics, the quantum Fisher information is four times the convex roof of the variance.
[26] There are also numerical methods that provide an optimal lower bound on the quantum Fisher information based on the expectation values of certain operators, using the theory of Legendre transforms rather than semidefinite programming.
So far, we have discussed bounding the quantum Fisher information for unitary dynamics. It is also possible to bound the quantum Fisher information for more general, non-unitary dynamics.
For systems in thermal equilibrium, the quantum Fisher information can be obtained from the dynamic susceptibility.
For any decomposition $\varrho=\sum_k p_k |\Psi_k\rangle\langle\Psi_k|$ of the density matrix, the chain of inequalities
$$(\Delta A)^2_\varrho \;\ge\; \sum_k p_k\, (\Delta A)^2_{\Psi_k} \;\ge\; \frac{1}{4}F_{\rm Q}[\varrho, A]$$
holds, and both inequalities are tight. That is, there is a decomposition for which the second inequality is saturated, which is the same as stating that the quantum Fisher information over four is the convex roof of the variance, as discussed above. There is also a decomposition for which the first inequality is saturated, which means that the variance is its own concave roof.[14] Knowing that the quantum Fisher information is four times the convex roof of the variance, one obtains a further relation between the variance and the quantum Fisher information.[31]
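A minimal added example in which the two roofs differ maximally is the completely mixed state of a single qubit with $A=\sigma_z/2$: the decomposition into $\sigma_z$ eigenstates has zero average variance (saturating the convex-roof side, consistent with $F_{\rm Q}=0$), while the decomposition into $\sigma_x$ eigenstates $|\pm\rangle$ reproduces the full variance (saturating the concave-roof side):

```latex
% Completely mixed qubit, A = sigma_z/2, two extremal decompositions:
\varrho = \tfrac12|0\rangle\langle 0| + \tfrac12|1\rangle\langle 1|
        = \tfrac12|{+}\rangle\langle{+}| + \tfrac12|{-}\rangle\langle{-}| ,
\qquad
\tfrac14 F_{\rm Q}[\varrho,A] = 0 \;\le\; (\Delta A)^2_\varrho = \tfrac14 .
```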