Channel capacity is defined as the maximum rate at which information can be reliably transmitted through a channel. The fundamental theorem of information theory states that, at any rate below channel capacity, an error-control code can be designed whose probability of error is arbitrarily small.
Channel capacity is the basic information-theoretic performance measure for a communication channel. Shannon showed that the channel capacity is the maximum of the mutual information between the input and output of the channel, where the maximization is taken over the input distribution:
$$ C = \sup_{p_X(x)} I(X;Y) $$ where $X$ and $Y$ are the input and output symbols of the communication system, respectively. By the Shannon–Hartley theorem, the capacity of an analog communication channel subject to additive white Gaussian noise (AWGN) is $$ C = W \log_{2}\left(1+\frac{S}{N}\right) $$ where
$C$ is the channel capacity in bits per second;
$W$ is the bandwidth of the channel in hertz;
$S$ is the average received signal power in watts;
$N$ is the average power of the noise and interference in watts;
$\frac{S}{N}$ is the signal-to-noise ratio (SNR), i.e., the ratio of the average signal power to the average noise-and-interference power at the receiver.
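As a quick numerical illustration of the Shannon–Hartley formula, the short Python sketch below computes the capacity of a hypothetical AWGN channel; the bandwidth and SNR values are assumptions chosen only for the example.

```python
import math

def awgn_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity of an AWGN channel, in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Hypothetical example: 1 MHz bandwidth, 30 dB SNR.
W = 1e6                        # bandwidth in hertz
snr_db = 30.0                  # signal-to-noise ratio in decibels
snr = 10 ** (snr_db / 10.0)    # convert dB to a linear power ratio (1000)
C = awgn_capacity(W, snr)
print(f"C = {C / 1e6:.2f} Mbit/s")   # approximately 9.97 Mbit/s
```

Note that capacity scales linearly with bandwidth, whereas doubling the SNR adds only about one bit per second per hertz, since capacity grows logarithmically with signal power.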