If a sample from a finite set A is picked uniformly at random, the information revealed after the outcome is known is given by the Hartley function

\[ \mathrm{H}_0(A) := \log_b |A| , \]

where |A| denotes the cardinality of A.
If the base of the logarithm is 2, then the unit of uncertainty is the shannon (more commonly known as bit).
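For example, for an illustrative set of \( |A| = 8 \) equally likely outcomes, observing the outcome reveals

\[ \mathrm{H}_0(A) = \log_2 8 = 3 \text{ shannons}. \]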
It is a special case of the Rényi entropy, since

\[ \mathrm{H}_0(A) = \frac{1}{1-0} \log \sum_{i=1}^{|A|} p_i^0 = \log |A| . \]

But it can also be viewed as a primitive construction, since, as emphasized by Kolmogorov and Rényi, the Hartley function can be defined without introducing any notions of probability (see Uncertainty and information by George J. Klir, p. 423).
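A minimal computational sketch of this relationship (the helper names hartley and renyi_entropy, and the set size of 8, are illustrative choices rather than anything fixed by the text):

```python
import math

def hartley(n: int, base: float = 2.0) -> float:
    """Hartley function: log of the set size, in units set by the log base."""
    return math.log(n, base)

def renyi_entropy(probs, alpha: float, base: float = 2.0) -> float:
    """Renyi entropy of order alpha (alpha != 1) for a finite distribution."""
    support_sum = sum(p ** alpha for p in probs if p > 0)
    return math.log(support_sum, base) / (1.0 - alpha)

n = 8                      # illustrative set size
uniform = [1.0 / n] * n    # uniform distribution on the set
print(hartley(n))                         # ~3.0 shannons
print(renyi_entropy(uniform, alpha=0.0))  # ~3.0 as well: order-0 Renyi entropy
```

For α = 0 the sum simply counts the support, so the order-0 Rényi entropy of a distribution with full support on A reduces to log |A|, the Hartley value.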
Rényi showed that the Hartley function in base 2 is the only function mapping natural numbers to real numbers that satisfies

1. \( \mathrm{H}(mn) = \mathrm{H}(m) + \mathrm{H}(n) \) (additivity)
2. \( \mathrm{H}(m) \leq \mathrm{H}(m+1) \) (monotonicity)
3. \( \mathrm{H}(2) = 1 \) (normalization)

Condition 1 says that the uncertainty of the Cartesian product of two finite sets A and B is the sum of the uncertainties of A and B.
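Conversely, the base-2 logarithm itself clearly satisfies all three conditions, since

\[ \log_2(mn) = \log_2 m + \log_2 n, \qquad \log_2 m \leq \log_2(m+1), \qquad \log_2 2 = 1 , \]

so the characterization picks out exactly the Hartley function in base 2.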
Let f be a function on positive integers that satisfies the above three properties. From the additivity property, it follows that for any positive integers n and k,

\[ f(n^k) = k \, f(n) . \quad (1) \]

Let a, b, and t be positive integers with a ≥ 2 and b ≥ 2. There is a unique integer s determined by

\[ a^s \leq b^t < a^{s+1} . \]

Therefore,

\[ s \log_2 a \leq t \log_2 b < (s+1) \log_2 a \]

and

\[ \frac{s}{t} \leq \frac{\log_2 b}{\log_2 a} < \frac{s+1}{t} . \]

On the other hand, by monotonicity,

\[ f(a^s) \leq f(b^t) \leq f(a^{s+1}) . \]

Using equation (1), one gets

\[ s \, f(a) \leq t \, f(b) \leq (s+1) \, f(a) , \]

and, since f(a) ≥ f(2) = 1 > 0 by monotonicity and normalization,

\[ \frac{s}{t} \leq \frac{f(b)}{f(a)} \leq \frac{s+1}{t} . \]

Hence,

\[ \left| \frac{f(b)}{f(a)} - \frac{\log_2 b}{\log_2 a} \right| \leq \frac{1}{t} . \]

Since t can be arbitrarily large, the difference on the left hand side of the above inequality must be zero,

\[ \frac{f(b)}{f(a)} = \frac{\log_2 b}{\log_2 a} . \]

So,

\[ f(a) = \mu \log_2 a \]

for some constant μ, which must be equal to 1 by the normalization property.
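As a concrete instance of the sandwiching step, take the illustrative values a = 2, b = 3, and t = 10. Then

\[ 2^{15} = 32768 \;\leq\; 3^{10} = 59049 \;<\; 65536 = 2^{16} , \]

so s = 15 and

\[ \frac{15}{10} \;\leq\; \log_2 3 \approx 1.585 \;<\; \frac{16}{10} ; \]

letting t grow traps \( f(3)/f(2) \) and \( \log_2 3 \) in the same shrinking interval, forcing them to coincide.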