Mutual Information and Channel Capacity

The mutual information \(I(X;Y)\) (in German referred to as "Transinformation" or "wechselseitige Information") is a measure of the mutual dependence between two random variables. More specifically, it quantifies the amount of information obtained about one random variable by observing the other. For a given channel, the channel capacity, which is the maximum rate at which information can be transmitted reliably, is defined as the maximum of the mutual information over all input distributions \(p(x)\): $$ C = \underset{p(x)}{\max} \; I(X;Y) $$
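As a minimal numerical illustration of this definition, the following Python sketch evaluates \(I(X;Y)\) for a binary symmetric channel (a simple example channel chosen here only for illustration, not taken from the text above) and sweeps the input distribution to approximate the maximizing \(p(x)\):

```python
import numpy as np

def mutual_information(p_x, p_y_given_x):
    """I(X;Y) in bits for a discrete channel p(y|x) and input distribution p(x)."""
    p_xy = p_x[:, None] * p_y_given_x              # joint distribution p(x, y)
    p_y = p_xy.sum(axis=0)                         # output marginal p(y)
    mask = p_xy > 0
    return np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y[None, :])[mask]))

# Binary symmetric channel with crossover probability eps (illustrative value).
eps = 0.1
p_y_given_x = np.array([[1 - eps, eps],
                        [eps, 1 - eps]])

# Sweep input distributions; the maximum over p(x) approximates the capacity C.
mi_best, p_best = max((mutual_information(np.array([p, 1 - p]), p_y_given_x), p)
                      for p in np.linspace(0.01, 0.99, 99))
print(f"C ~ {mi_best:.4f} bit/use at p(x=0) = {p_best:.2f}")
```

For the binary symmetric channel the sweep recovers the known result \(C = 1 - H_2(\varepsilon)\), attained for the uniform input distribution.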

The Shannon-Hartley theorem gives the channel capacity when transmitting with an average received signal power \(S\) and a bandwidth \(B\) over a channel subject to additive white Gaussian noise of power \(N\), referred to as the Shannon limit or Shannon capacity: $$ C = B \cdot \log_2\left(1 + \frac{S}{N}\right) $$
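The formula translates directly into code. The short sketch below evaluates the Shannon-Hartley limit for a given bandwidth and SNR; the bandwidth and SNR values are arbitrary illustration choices:

```python
import numpy as np

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley limit C = B * log2(1 + S/N) in bit/s."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * np.log2(1 + snr_linear)

# Example: 1 MHz bandwidth at a few SNR points.
for snr_db in (0, 10, 20, 30):
    print(f"SNR = {snr_db:2d} dB -> C = {shannon_capacity(1e6, snr_db) / 1e6:.2f} Mbit/s")
```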

Modulation schemes like quadrature amplitude modulation (QAM) and phase shift keying (PSK) are commonly used in digital communication to transfer data. However, these schemes often fall short of achieving MI values close to the Shannon limit. Specifically, QAM exhibits an asymptotic shaping loss of approximately 1.53 dB with increasing signal-to-noise ratio (SNR) compared to the additive white Gaussian noise Shannon capacity. Paper [1] presents two distinct designs of a discrete modulation scheme, the golden angle modulation (GAM), that rely on geometric and probabilistic shaping techniques to achieve near-capacity MI values.
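To make this comparison concrete, the Python sketch below estimates the MI of equiprobable constellations over an AWGN channel by Monte Carlo simulation and contrasts a standard square QAM with a simplified disc-shaped constellation built from the golden angle, loosely following the geometric-shaping idea of [1]. The exact constellation designs, normalizations, and probabilistic-shaping variants in [1] differ in detail; the helper names and parameter values here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.special import logsumexp

def gam_constellation(num_points):
    """Sketch of a disc-shaped golden-angle constellation: point n sits at
    radius ~ sqrt(n) and angle n times the golden angle, normalized to
    unit average power."""
    n = np.arange(1, num_points + 1)
    golden_angle = np.pi * (3.0 - np.sqrt(5.0))        # ~ 2.39996 rad
    points = np.sqrt(n) * np.exp(1j * golden_angle * n)
    return points / np.sqrt(np.mean(np.abs(points) ** 2))

def square_qam(m):
    """Square M-QAM constellation (M a perfect square), unit average power."""
    k = int(np.sqrt(m))
    levels = np.arange(-(k - 1), k, 2, dtype=float)
    points = (levels[:, None] + 1j * levels[None, :]).ravel()
    return points / np.sqrt(np.mean(np.abs(points) ** 2))

def awgn_mi(constellation, snr_db, num_symbols=50_000, seed=0):
    """Monte Carlo estimate of I(X;Y) in bit/symbol for equiprobable symbols
    transmitted over a complex AWGN channel."""
    rng = np.random.default_rng(seed)
    m = len(constellation)
    n0 = np.mean(np.abs(constellation) ** 2) / 10 ** (snr_db / 10)   # noise power
    x = rng.choice(constellation, size=num_symbols)
    noise = np.sqrt(n0 / 2) * (rng.standard_normal(num_symbols)
                               + 1j * rng.standard_normal(num_symbols))
    y = x + noise
    log_num = -np.abs(y - x) ** 2 / n0                               # log p(y | x sent)
    log_den = logsumexp(-np.abs(y[:, None] - constellation[None, :]) ** 2 / n0, axis=1)
    return np.log2(m) + np.mean(log_num - log_den) / np.log(2)

snr_db = 15
print(f"64-QAM MI:         {awgn_mi(square_qam(64), snr_db):.3f} bit/symbol")
print(f"64-point GAM MI:   {awgn_mi(gam_constellation(64), snr_db):.3f} bit/symbol")
print(f"Gaussian capacity: {np.log2(1 + 10 ** (snr_db / 10)):.3f} bit/symbol")
```

The Monte Carlo estimator uses the standard decomposition \(I(X;Y) = \log_2 M + \mathbb{E}[\log_2 p(x|y)]\) for equiprobable symbols; comparing the two constellations against the Gaussian capacity at the same SNR makes the shaping gap of uniform QAM visible.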

[1] P. Larsson, "Golden angle modulation," IEEE Wireless Communications Letters, vol. 7, no. 1, Feb. 2018.