nat (unit)

The natural unit of information (symbol: nat),[1] sometimes also nit or nepit, is a unit of information or information entropy, based on natural logarithms and powers of e, rather than the powers of 2 and base 2 logarithms, which define the shannon.

One nat is equal to 1/ln 2 shannons ≈ 1.44 Sh or, equivalently, 1/ln 10 hartleys ≈ 0.434 Hart.
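
These conversion factors follow from the change-of-base rule for logarithms. As a minimal sketch (illustrative Python, not part of the article; the function names are hypothetical):

    import math

    # Convert a quantity of information from nats to shannons (bits) and to
    # hartleys, using the change-of-base rule: log_b(x) = ln(x) / ln(b).
    def nats_to_shannons(x_nats: float) -> float:
        return x_nats / math.log(2)    # 1 nat = 1/ln 2 Sh

    def nats_to_hartleys(x_nats: float) -> float:
        return x_nats / math.log(10)   # 1 nat = 1/ln 10 Hart

    print(nats_to_shannons(1.0))   # ≈ 1.442695
    print(nats_to_hartleys(1.0))   # ≈ 0.434294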

[1] Boulton and Wallace used the term nit in conjunction with minimum message length,[2] which was subsequently changed by the minimum description length community to nat to avoid confusion with the nit used as a unit of luminance.

The International System of Units, by assigning the same unit (joule per kelvin) both to heat capacity and to thermodynamic entropy, implicitly treats information entropy as a quantity of dimension one, with 1 nat = 1.
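
As an illustrative restatement (an assumption about how to phrase the relation, not a quotation from the article), the Boltzmann constant plays the role of the conversion factor between information entropy expressed in nats and thermodynamic entropy expressed in SI units:

    % Gibbs/Shannon correspondence: thermodynamic entropy S in J/K versus
    % the same entropy H expressed in nats (dimensionless).
    \[
      S \;=\; k_{\mathrm{B}}\, H_{\text{nat}},
      \qquad k_{\mathrm{B}} \approx 1.380649 \times 10^{-23}\ \mathrm{J/K},
    \]

so a system of natural units that sets k_B = 1 measures thermodynamic entropy directly in nats, consistent with treating 1 nat = 1.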

When the Shannon entropy is written using a natural logarithm,

    H = −∑_i p_i ln p_i,

it is implicitly giving a number measured in nats.
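
For instance, a short sketch in Python (illustrative code, not from the article) that evaluates this entropy in nats and then converts it to shannons:

    import math

    # Shannon entropy computed with the natural logarithm gives a value in
    # nats; dividing by ln 2 converts it to shannons (bits).
    def entropy_nats(probs):
        return -sum(p * math.log(p) for p in probs if p > 0)

    p = [0.5, 0.25, 0.25]
    h = entropy_nats(p)
    print(h)                  # ≈ 1.0397 nat
    print(h / math.log(2))    # ≈ 1.5 Sh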

Units of information measurement.