# Combinatorial Entropies and Statistics - Condensed Matter > Statistical Mechanics


Abstract: We examine the "combinatorial" or "probabilistic" definition (Boltzmann's principle) of the entropy or cross-entropy function $H \propto \ln \mathbb{W}$ or $D \propto -\ln \mathbb{P}$, where $\mathbb{W}$ is the statistical weight and $\mathbb{P}$ the probability of a given realization of a system. Extremisation of $H$ or $D$, subject to any constraints, thus selects the "most probable" (MaxProb) realization. If the system is multinomial, $D$ converges asymptotically, as the number of entities $N \to \infty$, to the Kullback-Leibler cross-entropy $D_{KL}$; for equiprobable categories in a system, $H$ converges to the Shannon entropy $H_{Sh}$. However, in many cases $\mathbb{W}$ or $\mathbb{P}$ is not multinomial and/or does not satisfy an asymptotic limit. Such systems cannot meaningfully be analysed with $D_{KL}$ or $H_{Sh}$, but can be analysed directly by MaxProb. This study reviews several examples, including (a) non-asymptotic systems; (b) systems with indistinguishable entities (quantum statistics); (c) systems with indistinguishable categories; (d) systems represented by urn models, such as "neither independent nor identically distributed" (ninid) sampling; and (e) systems representable in graphical form, such as decision trees and networks. Boltzmann's combinatorial definition of entropy is shown to be of greater importance for "probabilistic inference" than the axiomatic definition used in information theory.
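The asymptotic convergence claimed in the abstract can be checked numerically: for a multinomial system, $-\frac{1}{N}\ln \mathbb{P}$ of a realization with observed frequencies $f_i = n_i/N$ and source probabilities $p_i$ approaches $D_{KL}(f\,\|\,p) = \sum_i f_i \ln(f_i/p_i)$ as $N \to \infty$. The sketch below is illustrative only (the frequencies and probabilities are made up, not from the paper), using log-gamma functions to evaluate the multinomial factorials stably.

```python
import math

def neg_log_multinomial_prob(counts, probs):
    """Return -ln P for a multinomial realization: counts n_i, source probs p_i."""
    n = sum(counts)
    log_p = math.lgamma(n + 1)  # ln N!
    for c, p in zip(counts, probs):
        log_p -= math.lgamma(c + 1)   # - ln n_i!
        log_p += c * math.log(p)      # + n_i ln p_i
    return -log_p

def kl_divergence(freqs, probs):
    """Kullback-Leibler cross-entropy D_KL(f || p)."""
    return sum(f * math.log(f / p) for f, p in zip(freqs, probs) if f > 0)

if __name__ == "__main__":
    # Hypothetical example: observed frequencies f differ from source probs p.
    probs = [0.25, 0.25, 0.5]
    freqs = [0.5, 0.3, 0.2]
    target = kl_divergence(freqs, probs)
    for n in (10, 100, 10000):
        counts = [round(f * n) for f in freqs]
        d_per_entity = neg_log_multinomial_prob(counts, probs) / n
        # -ln(P)/N approaches D_KL as N grows; the gap shrinks like O(ln N / N).
        print(n, d_per_entity, target)
```

Running this shows the per-entity quantity $-\ln\mathbb{P}/N$ closing in on $D_{KL}$ as $N$ grows, consistent with the Stirling-approximation argument behind the multinomial limit; for finite or non-multinomial systems that gap need not vanish, which is the abstract's point about analysing such systems directly by MaxProb.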

Author: **Robert K. Niven**

Source: https://arxiv.org/