BCJR Algorithm

The BCJR algorithm is the algorithm deployed in this MAP decoder for convolutional codes. It is named after its inventors Bahl, Cocke, Jelinek, and Raviv. It has many variants, one of which is implemented here, namely the Log-MAP decoder.

The Log-MAP BCJR algorithm can be summarized as follows:

  1. Calculate the branch metrics ($\gamma_{k}$), as sketched in the first code example after this list: $$\gamma_{k}=\dfrac{1}{2} \cdot \left ( \left ( \dfrac{2}{\sigma^2} \cdot y_{k}^s + L_a(k) \right) \cdot x_k^s + \dfrac{2}{\sigma^2} \cdot y_k^p \cdot x_k^p \right)$$ where:

    • $\sigma^2$ is the noise variance, which depends on the SNR of the channel.

    • $y_k^s$ and $y_k^p$ are the received systematic and parity bits at the $k^{th}$ stage, respectively.

    • $x_k^s$ and $x_k^p$ are the systematic and parity bits associated with the considered trellis transition at the $k^{th}$ stage, respectively.

    • $L_a(k)$ is the a priori log-likelihood ratio (LLR) of the systematic bit at the $k^{th}$ stage.

  2. Forward recursion to calculate the forward metrics ($\alpha_k$), as sketched in the second code example after this list: $$\alpha_{k+1}(s)=\operatorname{Jacobian}\left(\left\{ \alpha_k(s') + \gamma_k(s',s) : \forall s' \right\}\right),$$ where $$\operatorname{Jacobian}(x,y)=\max(x,y)+\ln\left(1+e^{-\left|x-y\right|}\right).$$ This is the log-domain counterpart of the linear formula $$\alpha_{k+1}(s)=\sum\limits_{s'}\gamma_k(s',s) \cdot \alpha_k(s'),$$ where $s'$ denotes any predecessor state of state $s$. For a terminated trellis, i.e. $s(0) = s_0$, the recursion is initialized with $\alpha_0(s_0) = 0$ (and $\alpha_0(s) = -\infty$ for all other states $s$).
  3. Backward recursion to calculate the backward metrics ($\beta_k$), as sketched in the third code example after this list: $$\beta_{k}(s')=\operatorname{Jacobian}\left(\left\{ \beta_{k+1}(s) + \gamma_k(s',s) : \forall s \right\}\right),$$ which is the log-domain counterpart of the linear formula $$\beta_{k}(s')=\sum\limits_{s}\gamma_k(s',s) \cdot \beta_{k+1}(s),$$ where $s$ denotes any successor state of state $s'$. For a terminated trellis, i.e. $s(n) = s_0$ with $n$ the length of the received sequence, the recursion is initialized with $\beta_n(s_0) = 0$ (and $\beta_n(s) = -\infty$ for all other states $s$).
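
The branch metric of step 1 can be written in a few lines of Python. This is a minimal sketch, not the demo's implementation; the function name `branch_metric`, the BPSK mapping of $x_k^s, x_k^p$ to $\pm 1$, and the argument layout are assumptions made here. The factor $2/\sigma^2$ is the channel reliability, with the noise variance following from the channel SNR as noted in the bullet list above.

```python
import numpy as np

def branch_metric(y_s, y_p, x_s, x_p, La, sigma2):
    """Log-domain branch metric gamma_k for a single trellis transition.

    y_s, y_p : received systematic / parity values at stage k
    x_s, x_p : systematic / parity symbols of the considered transition,
               assumed BPSK-mapped to +1/-1 (an assumption of this sketch)
    La       : a priori LLR L_a(k) of the systematic bit at stage k
    sigma2   : noise variance sigma^2 of the AWGN channel
    """
    Lc = 2.0 / sigma2  # channel reliability factor 2/sigma^2
    return 0.5 * ((Lc * y_s + La) * x_s + Lc * y_p * x_p)
```

In practice this metric is evaluated for every transition $(s', s)$ of the trellis at every stage $k$ before the two recursions run.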
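
The forward recursion of step 2 can be sketched as follows. This is again an illustrative sketch rather than the demo's code: the Jacobian logarithm is implemented as `max_star`, the branch metrics are assumed to be precomputed in `gamma[k][(s_prev, s_next)]`, the trellis is assumed to be given as a list of `(s_prev, s_next)` transition pairs, and the all-zero state $s_0$ is assumed to have index `0`.

```python
import numpy as np

NEG_INF = -1e30  # stands in for log(0), i.e. an unreachable state

def max_star(a, b):
    """Jacobian logarithm: max*(a, b) = max(a, b) + ln(1 + e^{-|a-b|})."""
    return max(a, b) + np.log1p(np.exp(-abs(a - b)))

def forward_recursion(gamma, transitions, num_states, n):
    """Forward metrics alpha[k][s] in the log domain.

    gamma       : gamma[k][(s_prev, s_next)] -> branch metric (assumed layout)
    transitions : list of (s_prev, s_next) pairs present in the trellis
    n           : length of the received sequence
    """
    alpha = np.full((n + 1, num_states), NEG_INF)
    alpha[0][0] = 0.0  # terminated trellis starts in state s_0 (index 0 here)
    for k in range(n):
        for (s_prev, s_next) in transitions:
            # log-domain accumulation of gamma_k(s', s) * alpha_k(s')
            cand = alpha[k][s_prev] + gamma[k][(s_prev, s_next)]
            alpha[k + 1][s_next] = max_star(alpha[k + 1][s_next], cand)
    return alpha
```

Replacing `max_star` by a plain `max` would turn this into the Max-Log-MAP approximation, one of the other variants mentioned above.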
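
The backward recursion of step 3 mirrors the forward one. The same assumptions as in the previous sketch apply (`max_star`, the `gamma[k][(s_prev, s_next)]` layout, the transition list, and index `0` for the terminating state $s_0$); the helper is repeated only so the snippet stays self-contained.

```python
import numpy as np

NEG_INF = -1e30  # stands in for log(0)

def max_star(a, b):
    """Jacobian logarithm: max*(a, b) = max(a, b) + ln(1 + e^{-|a-b|})."""
    return max(a, b) + np.log1p(np.exp(-abs(a - b)))

def backward_recursion(gamma, transitions, num_states, n):
    """Backward metrics beta[k][s] in the log domain (same assumed layout as above)."""
    beta = np.full((n + 1, num_states), NEG_INF)
    beta[n][0] = 0.0  # terminated trellis ends in state s_0 (index 0 here)
    for k in range(n - 1, -1, -1):
        for (s_prev, s_next) in transitions:
            # log-domain accumulation of gamma_k(s', s) * beta_{k+1}(s)
            cand = beta[k + 1][s_next] + gamma[k][(s_prev, s_next)]
            beta[k][s_prev] = max_star(beta[k][s_prev], cand)
    return beta
```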