Hartley entropy
Hartley entropy is a historically important measure of information content that arises when the outcomes of a process are equally likely. Named after Ralph Hartley, it captures the idea that the amount of information needed to specify one outcome grows with the number of equiprobable states. In its simplest form, if a system has N equally probable states, its Hartley entropy is log10(N), and the unit is the hartley. Because the logarithm is taken in base 10, the hartley is a natural unit where decimal counting is convenient; in practice, one hartley equals about 3.32193 bits. The concept sits at the intersection of early information theory and practical coding, and it remains a useful reference point for understanding how information scales in uniform-choice problems. Information theory and entropy are the broader fields that give Hartley entropy its place in the landscape of measurement.
Hartley entropy is best understood as the decimal counterpart of the more general information entropy developed later by Claude Shannon. It applies precisely to situations with uniform probabilities: if each of N outcomes is equally likely, H_hartley = log10(N) hartleys. For non-uniform distributions, the broader framework of Shannon entropy is required, since H_S = −Σ_i p_i log_b(p_i) captures how probabilities concentrate information. In that sense, Hartley entropy is the special case of the general theory in which all outcomes carry the same likelihood. The base of the logarithm determines the unit: base 10 yields hartleys, base 2 yields bits, and base e yields nats. This relationship is commonly summarized as H_bits = H_hartleys × log2(10) ≈ H_hartleys × 3.32193. The logarithm is the mathematical operation that underpins these conversions, and the decimal numeral system is what makes hartleys intuitive in everyday counting.
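As a minimal sketch of this relationship (the helper functions and the choice N = 1000 below are illustrative, not from any standard library), the following Python snippet shows that Shannon entropy computed with base-10 logarithms collapses to the Hartley value log10(N) when all N outcomes are equally likely:

```python
import math

def shannon_entropy(probs, base=10):
    """Shannon entropy H = -sum(p * log_base(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def hartley_entropy(n):
    """Hartley entropy of n equiprobable outcomes, in hartleys."""
    return math.log10(n)

# For a uniform distribution over N outcomes the two measures coincide.
N = 1000
uniform = [1 / N] * N
print(shannon_entropy(uniform))  # ≈ 3.0 hartleys
print(hartley_entropy(N))        # 3.0 hartleys
```

For any non-uniform list of probabilities, the same shannon_entropy function would return a value strictly below log10(N), which is the sense in which the Hartley measure is the uniform-probability ceiling.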
Definition and units
- For N equiprobable outcomes: H_hartley = log10(N) hartleys.
- Conversion to bits: H_bits = log2(N) = log10(N) × log2(10) ≈ H_hartleys × 3.32193 (see the sketch after this list).
- The hartley is the decimal analogue of the bit and is named to emphasize decimal-digit counting, in contrast to the binary framing that dominates much of computer science today. The unit and the choice of base are mathematical conveniences, but they carry different practical emphases depending on the context of measurement. See also Hartley (unit).
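As a quick numerical check of these conversions (a sketch only; the value N = 1000 is an arbitrary example), the snippet below computes the same quantity in hartleys, bits, and nats and verifies the 3.32193 factor:

```python
import math

N = 1000                    # number of equiprobable states (arbitrary example)

h_hartleys = math.log10(N)  # 3.0 hartleys
h_bits = math.log2(N)       # ≈ 9.9658 bits
h_nats = math.log(N)        # ≈ 6.9078 nats

# One hartley corresponds to log2(10) ≈ 3.32193 bits.
assert math.isclose(h_bits, h_hartleys * math.log2(10))
print(h_hartleys, h_bits, h_nats)
```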
Context, interpretation, and limitations
- Hartley entropy is additive for independent, equiprobable factors: log10(N1 × N2) = log10(N1) + log10(N2). This makes it particularly transparent for problems where the total state space factors into several independent decimal-like choices (for example, a code consisting of several decimal digits with ten possibilities per digit). In such cases, the total information content simply sums across digits, as shown in the sketch after this list.
- For real-world data where outcomes are not equally likely, Hartley entropy does not provide the full story. Shannon entropy generalizes the concept to capture the actual distribution of outcomes, giving a tighter and more robust measure of uncertainty. In practice, engineers and scientists rely on Shannon entropy when probabilities vary, while Hartley entropy remains a clean and intuitive reference point for uniform-state problems.
- In the broader tradition of information theory, the Hartley measure serves as a bridge between mathematics and common-sense counting. The decimal basis makes it easy to think about “how many decimal choices are involved” and to translate that into a readable sense of information content. By contrast, the binary basis aligns with digital storage and transmission, which is why bits dominate modern engineering discussions. See Shannon entropy and bit.
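The additivity property in the first point can be checked directly. The following sketch (the four-digit decimal code is a hypothetical example) confirms that the entropy of the whole code equals the sum over its digits:

```python
import math

# Hypothetical example: a code of k independent decimal digits,
# each digit taking one of ten equally likely values.
k = 4
total_states = 10 ** k

entropy_whole = math.log10(total_states)  # log10(10^4) = 4 hartleys
entropy_summed = k * math.log10(10)       # 4 × 1 hartley per digit

# Additivity: the two ways of counting agree.
assert math.isclose(entropy_whole, entropy_summed)
print(entropy_whole, entropy_summed)      # 4.0 4.0
```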
Historical development and influence
- Ralph Hartley introduced the idea of information content tied to the number of possible states in the late 1920s, laying groundwork that would influence subsequent developments in information theory and communications. His attention to a decimal-logarithmic measure helped popularize a way of thinking about information that was accessible to practitioners of telecommunication and data coding. For biographical and historical context, see Ralph Hartley.
- The later, more general framework developed by Claude Shannon expanded beyond uniform-state situations and established a universal theory of information that encompasses a wide range of probabilistic sources. Hartley entropy remains an instructive special case within that broader theory, illustrating how simple counting translates into a quantitative measure of information.
Controversies and debates
- The practical relevance of Hartley entropy has declined in the face of Shannon entropy, which handles non-uniform distributions and noisy channels with greater generality. Critics argue that focusing on base-10 logarithms and the hartley unit is mainly of historical or pedagogical interest rather than of day-to-day engineering relevance. Proponents counter that the decimal perspective remains valuable for problems that are naturally expressed in decimal-digit terms, and that it preserves intuition about how information scales with the size of the state space.
- In broader policy and public discourse, some critics on the left argue that quantifying information and data in precise units can be used to justify technocratic approaches to social problems. From a conservative-leaning standpoint that prizes clarity and efficiency, those critiques can be seen as overreach: information theory is a mathematical toolkit, not a social program. The central point for adherents of this view is that Hartley entropy, like any measurement, should be judged on predictive power and practical utility rather than on ideological uses. Critics of “woke” critiques argue that dismissing established mathematical tools as political overreach conflates technique with ideology and can hinder technical progress. In this view, Hartley entropy is appreciated for its historical importance and its clear, debate-free mathematics when applied to uniform-state situations.
See also
- Ralph Hartley
- Hartley (unit)
- Information theory
- Entropy
- Shannon entropy
- Bit