Mutual Information and Log-likelihood ratio

The relationship between the information content and the log-likelihood ratio can be summarized as follows:

- For a given a priori information content $I_A$, the log-likelihood ratio is modeled as Gaussian with mean
$\mu_A \cdot x = \dfrac{\sigma_A^2}{2} \cdot x$
and variance $\sigma_A^2 = \left(J^{-1}(I_A)\right)^2$, where $x \in \{-1,+1\}$ is the transmitted bit.
The conditional probability density function of the a priori L-value is:

$$p_A(\xi \mid X=x)=\dfrac{1}{\sqrt{2\pi}\,\sigma_A} \cdot e^{-\left(\xi -(\sigma_A^2/2)\cdot x\right)^2/(2\sigma_A^2)}$$

- The extrinsic information content ($I_E$) is computed from the conditional distributions of the extrinsic log-likelihood values obtained from the received signal:

$$I_E=\dfrac{1}{2}\cdot\underset{x=-1,1}{\sum}\int_{-\infty}^{\infty}p_E(\xi \mid X=x)\cdot \log_2\dfrac{2\cdot p_E(\xi \mid X=x)}{p_E(\xi \mid X=-1)+p_E(\xi \mid X=1)}\,d\xi$$
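As a minimal numeric sketch of the a priori model above: for L-values satisfying the consistency condition $\mu_A = \sigma_A^2/2$, the mutual-information integral simplifies to $I_A = \mathrm{E}\left[1-\log_2\!\left(1+e^{-x\xi}\right)\right]$, which can be estimated by Monte Carlo. The function name and sampling parameters here are illustrative, not from the text.

```python
import numpy as np

def mutual_information_from_sigma(sigma_a, num_samples=200_000, seed=0):
    # Draw equiprobable bits x in {-1, +1} and L-values from the Gaussian
    # model of the text: mean (sigma_a**2 / 2) * x, variance sigma_a**2.
    rng = np.random.default_rng(seed)
    x = rng.choice([-1.0, 1.0], size=num_samples)
    xi = (sigma_a**2 / 2) * x + sigma_a * rng.standard_normal(num_samples)
    # For consistent Gaussian L-values the mutual-information integral
    # reduces to E[1 - log2(1 + exp(-x * xi))]; logaddexp keeps the
    # log(1 + e^z) term numerically stable for large |z|.
    return np.mean(1.0 - np.logaddexp(0.0, -x * xi) / np.log(2.0))
```

Sweeping `sigma_a` traces out the monotone $J(\sigma)$ curve, which is what is inverted to obtain $\sigma_A = J^{-1}(I_A)$.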

The detailed derivation is given in [2].
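In practice, the $I_E$ integral is often evaluated from measured extrinsic L-values by replacing the conditional densities $p_E(\xi \mid X=x)$ with histogram estimates, without assuming any particular distribution. A sketch of this approach follows; the function name and binning choices are assumptions for illustration.

```python
import numpy as np

def extrinsic_information(l_values, bits, num_bins=100):
    # Histogram estimates of the conditional pdfs p_E(xi | X = x),
    # with bits the transmitted symbols in {-1, +1}.
    edges = np.histogram_bin_edges(l_values, bins=num_bins)
    width = edges[1] - edges[0]
    pdf = {}
    for x in (-1, 1):
        counts, _ = np.histogram(l_values[bits == x], bins=edges)
        pdf[x] = counts / (counts.sum() * width)
    # Evaluate the I_E integral as a sum over the bins; bins where the
    # conditional pdf is zero contribute nothing to the integrand.
    denom = pdf[-1] + pdf[1]
    i_e = 0.0
    for x in (-1, 1):
        mask = pdf[x] > 0
        i_e += 0.5 * width * np.sum(
            pdf[x][mask] * np.log2(2.0 * pdf[x][mask] / denom[mask])
        )
    return i_e
```

Because no Gaussian assumption is made, the same routine applies to the a priori and the extrinsic side of an EXIT chart measurement.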