A g-factor (also called g value) is a dimensionless quantity that characterizes the magnetic moment and angular momentum of an atom, a particle, or a nucleus.
It is the ratio of the magnetic moment (or, equivalently, the gyromagnetic ratio) of a particle to that expected of a classical particle of the same charge and angular momentum.
In nuclear physics, the nuclear magneton replaces the classically expected magnetic moment (or gyromagnetic ratio) in the definition.
Protons, neutrons, nuclei, and other composite baryonic particles have magnetic moments arising from their spin (both the spin and magnetic moment may be zero, in which case the g-factor is undefined).
Conventionally, the associated g-factors are defined using the nuclear magneton, and thus implicitly using the proton's mass rather than the particle's mass as for a Dirac particle.
The nuclear g-factor is defined by μ = g μN I/ħ, with μN = eħ/2mp, where μ is the magnetic moment of the nucleon or nucleus resulting from its spin, g is the effective g-factor, I is its spin angular momentum, μN is the nuclear magneton, e is the elementary charge, and mp is the proton rest mass.
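As a quick numeric illustration of μ = g μN I/ħ (a sketch only; the CODATA proton g-factor is assumed, and the moment is expressed in units of μN so that the dimensional constants drop out):

```python
# Spin magnetic moment of a nucleon from its g-factor, expressed in units
# of the nuclear magneton mu_N, so that mu_N/hbar cancels out of mu = g*mu_N*I/hbar.

def spin_moment_in_muN(g: float, m_I: float) -> float:
    """z-component of mu = g * mu_N * I / hbar, for spin projection I_z = m_I * hbar."""
    return g * m_I

G_PROTON = 5.5856946893  # CODATA proton g-factor (dimensionless); assumed value

# For a spin-1/2 proton, the maximum projection m_I = +1/2 reproduces the
# tabulated proton moment, about 2.793 nuclear magnetons.
mu_p = spin_moment_in_muN(G_PROTON, 0.5)
```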
The electron spin g-factor is defined by μs = ge μB S/ħ, where μs is the magnetic moment resulting from the spin of an electron, S is its spin angular momentum, and μB = eħ/2me is the Bohr magneton.
In atomic physics, the electron spin g-factor is often defined as the absolute value of ge: gs = |ge| = −ge.
The possible projections of the spin along the z-axis, msħ, are the eigenvalues of the Sz operator, meaning that ms can take on the values ±1/2.
The value gs is roughly equal to 2.002319 and is known to extraordinary precision: one part in 10^13.[2]
The reason it is not precisely two is explained by the quantum electrodynamics calculation of the anomalous magnetic dipole moment.[3]
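The leading term of that QED calculation is Schwinger's one-loop correction a = α/2π, which already reproduces gs to within a few parts in a million. A minimal sketch, assuming the CODATA value of the fine-structure constant α:

```python
import math

ALPHA = 7.2973525693e-3  # fine-structure constant (CODATA); assumed value

# Schwinger's one-loop anomalous magnetic moment: a = alpha / (2*pi).
a_one_loop = ALPHA / (2 * math.pi)

# g = 2 * (1 + a); the one-loop result lands within a few parts in 10^6
# of the measured g_s ~ 2.002319 quoted above.
g_one_loop = 2 * (1 + a_one_loop)
```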
Secondly, the electron orbital g-factor gL is defined by μL = −gL μB L/ħ, where μL is the magnetic moment resulting from the orbital angular momentum of an electron, L is its orbital angular momentum, and μB is the Bohr magneton.[4]
For an infinite-mass nucleus, the value of gL is exactly equal to one, by a quantum-mechanical argument analogous to the derivation of the classical magnetogyric ratio.
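For a nucleus of finite mass M, the same reduced-mass argument gives, to first order, gL ≈ 1 − me/M (this leading-order form is the assumption of the sketch below; CODATA masses are used):

```python
M_E = 9.1093837015e-31   # electron mass in kg (CODATA); assumed value
M_P = 1.67262192369e-27  # proton mass in kg (CODATA); assumed value

# First-order finite-nuclear-mass correction to the orbital g-factor:
# g_L ~ 1 - m_e/M; it equals 1 exactly only in the infinite-mass limit.
g_L_hydrogen = 1 - M_E / M_P
```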
Thirdly, the Landé g-factor gJ is defined by |μJ| = gJ μB |J|/ħ, where μJ is the total magnetic moment resulting from both the spin and the orbital angular momentum of an electron, J = L + S is its total angular momentum, and μB is the Bohr magneton.
The value of gJ is related to gL and gs by a quantum-mechanical argument; see the article Landé g-factor.
The vectors μJ and J are not collinear, so only their magnitudes can be compared.
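That quantum-mechanical argument yields the familiar vector-model (projection) formula for gJ; a sketch using the idealized values gL = 1 and gs = 2:

```python
def lande_g(J: float, L: float, S: float, gL: float = 1.0, gS: float = 2.0) -> float:
    """Landé g-factor from the vector-model projection formula."""
    jj, ll, ss = J * (J + 1), L * (L + 1), S * (S + 1)
    return gL * (jj + ll - ss) / (2 * jj) + gS * (jj + ss - ll) / (2 * jj)

# Example: a 2P_{3/2} term (L = 1, S = 1/2, J = 3/2) gives g_J = 4/3.
g_p32 = lande_g(J=1.5, L=1.0, S=0.5)
```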
The muon, like the electron, has a g-factor associated with its spin, given by the equation μ = g (e/2mμ) S, where μ is the magnetic moment resulting from the muon's spin, S is the spin angular momentum, and mμ is the muon mass.
That the muon g-factor is not quite the same as the electron g-factor is mostly explained by quantum electrodynamics and its calculation of the anomalous magnetic dipole moment.
Almost all of the small difference between the two values (99.96% of it) is well understood within QED: loop diagrams involving heavier virtual particles contribute appreciably to the emission of the photon representing the magnetic dipole field in the muon's case, but are negligible for the electron.
However, not all of the difference between the g-factors for electrons and muons is exactly explained by the Standard Model.
The muon g-factor can, in theory, be affected by physics beyond the Standard Model, so it has been measured very precisely, in particular at the Brookhaven National Laboratory.
In the final report of the E821 collaboration in November 2006, the measured experimental value was 2.0023318416(13), compared to the theoretical prediction of 2.00233183620(86).[6]
This is a difference of 3.4 standard deviations, suggesting that beyond-the-Standard-Model physics may be a contributory factor.
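The 3.4-standard-deviation figure can be checked directly from the quoted numbers, adding the experimental and theoretical uncertainties in quadrature:

```python
import math

# E821 measurement and Standard Model prediction, with 1-sigma uncertainties
# read off the parenthesized digits quoted above.
g_exp, sigma_exp = 2.0023318416, 0.0000000013
g_theo, sigma_theo = 2.00233183620, 0.00000000086

# Discrepancy in units of the combined (quadrature) uncertainty.
n_sigma = (g_exp - g_theo) / math.hypot(sigma_exp, sigma_theo)
```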
When the Brookhaven and Fermilab measurements are combined, the new world average differs from the theory prediction by 4.2 standard deviations.[7]
The electron g-factor is one of the most precisely measured values in physics.