Mutual Information Calculation

The mutual information of two jointly distributed random variables, where \(X\) is discrete and \(Y\) is continuous, is calculated as: $$ I(Y;X) = I(X;Y) = \sum_{X} \int_{Y} P(X,Y) \cdot \log_2 \left(\frac{P(X,Y)}{P(X) \cdot P(Y)}\right) dy = \sum_{X} \int_{Y} P(Y \mid X) \cdot P(X) \cdot \log_2 \left(\frac{P(Y \mid X)}{P(Y)}\right) dy $$ Assuming a modulation scheme with \(N\) different symbols \(x_1, \dots, x_N\), the equation can be rewritten as an expectation over the channel output: $$ I(X;Y) = \sum_{i=1}^{N} P(x_i) \cdot \mathrm{E}_{Y \mid x_i} \left[\log_2 \left(\frac{P(Y \mid x_i)}{\sum_{j=1}^{N} P(x_j) \cdot P(Y \mid x_j)}\right)\right] $$
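For equiprobable symbols, \(P(x_i) = 1/N\); pulling the constant \(1/N\) out of the logarithm gives the form that is typically evaluated numerically: $$ I(X;Y) = \log_2 N - \frac{1}{N} \sum_{i=1}^{N} \mathrm{E}_{Y \mid x_i} \left[\log_2 \left(\frac{\sum_{j=1}^{N} P(Y \mid x_j)}{P(Y \mid x_i)}\right)\right] $$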

For an AWGN channel with noise variance \(\sigma^2\), the conditional probability \(P(Y \mid X)\) is given by: $$ P(Y \mid X) = \frac{1}{\sqrt{2\pi\sigma^2}} \cdot e^{-\frac{1}{2\sigma^2} \left\|Y-X\right\|^2} $$

The expectation in the MI formula can be estimated by Monte Carlo simulation: transmit many randomly chosen symbols, add Gaussian noise, and average the log-ratio over the received samples.
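As an illustration, the following is a minimal sketch of such a Monte Carlo estimate for a real-valued AWGN channel with equiprobable symbols, implementing the \(\log_2 N\) form above. The function name `mutual_information_mc` and its parameters are illustrative choices, not part of any particular library.

```python
import numpy as np

def mutual_information_mc(constellation, sigma, n_samples=100_000, seed=None):
    """Monte Carlo estimate of I(X;Y) for equiprobable symbols sent over a
    real AWGN channel Y = X + noise, with noise ~ N(0, sigma**2)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(constellation, dtype=float)
    n_sym = len(x)

    # Draw transmitted symbols uniformly at random and add Gaussian noise.
    tx = rng.choice(x, size=n_samples)
    y = tx + rng.normal(0.0, sigma, size=n_samples)

    # Log-likelihoods log P(y | x_j), up to a constant that cancels in the
    # likelihood ratio below; shape (n_samples, n_sym).
    log_p_all = -((y[:, None] - x[None, :]) ** 2) / (2 * sigma**2)
    log_p_tx = -((y - tx) ** 2) / (2 * sigma**2)

    # I(X;Y) = log2(N) - E[ log2( sum_j P(y|x_j) / P(y|x_tx) ) ],
    # evaluated stably via log-sum-exp and converted from nats to bits.
    log_sum = np.logaddexp.reduce(log_p_all, axis=1)
    return np.log2(n_sym) - np.mean(log_sum - log_p_tx) / np.log(2)

# Example: BPSK symbols {-1, +1}; the estimate approaches 1 bit/symbol
# as sigma decreases.
print(mutual_information_mc([-1.0, 1.0], sigma=0.5, seed=0))
```

Averaging over uniformly drawn transmit symbols approximates the outer sum \(\frac{1}{N}\sum_i \mathrm{E}_{Y \mid x_i}[\cdot]\) in a single pass, and the common Gaussian normalization factor cancels inside the likelihood ratio, so it can be omitted from the log-likelihoods.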