In electrical engineering, computer science and information theory, channel capacity is the tight upper bound on the rate at which information can be reliably transmitted over a communications channel. By the noisy-channel coding theorem, the channel capacity of a given channel is the limiting information rate (in units of information per unit time) that can be achieved with arbitrarily small error probability.^{[1]}^{[2]}
Information theory, developed by Claude E. Shannon during World War II, defines the notion of channel capacity and provides a mathematical model by which one can compute it. The key result states that the capacity of the channel, as defined above, is given by the maximum of the mutual information between the input and output of the channel, where the maximization is with respect to the input distribution.^{[3]}
Contents

1 Formal definition
2 Shannon capacity of a graph
3 Noisy-channel coding theorem
4 Example application
5 Channel capacity in wireless communications
5.1 AWGN channel
5.2 Frequency-selective channel
5.3 Slow-fading channel
5.4 Fast-fading channel
6 See also
6.1 Advanced Communication Topics
7 External links
8 References
Formal definition
Let X and Y be the random variables representing the input and output of the channel, respectively. Let p_{Y|X}(y|x) be the conditional distribution function of Y given X, which is an inherent fixed property of the communications channel. Then the choice of the marginal distribution p_X(x) completely determines the joint distribution p_{X,Y}(x,y) due to the identity

p_{X,Y}(x,y) = p_{Y\mid X}(y\mid x)\, p_X(x)
which, in turn, induces a mutual information I(X;Y). The channel capacity is defined as

C = \sup_{p_X(x)} I(X;Y)
where the supremum is taken over all possible choices of p_X(x).
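For a channel with finite input and output alphabets, this supremum can be approximated numerically by searching over input distributions. The following sketch (an illustration, not part of the article's sources) does this for the binary symmetric channel with crossover probability p, whose capacity is known in closed form to be 1 − H(p):

```python
import math

def mutual_information(px0, p):
    """I(X;Y) in bits for a binary symmetric channel with crossover
    probability p, when P(X=0) = px0."""
    px = [px0, 1 - px0]
    pyx = [[1 - p, p], [p, 1 - p]]          # pyx[x][y] = p(y|x)
    py = [sum(px[x] * pyx[x][y] for x in range(2)) for y in range(2)]
    info = 0.0
    for x in range(2):
        for y in range(2):
            pxy = px[x] * pyx[x][y]         # joint p(x,y) = p(y|x) p(x)
            if pxy > 0:
                info += pxy * math.log2(pxy / (px[x] * py[y]))
    return info

def bsc_capacity(p, grid=1001):
    """Approximate C = sup over p_X of I(X;Y) by a grid search on P(X=0)."""
    return max(mutual_information(i / (grid - 1), p) for i in range(grid))

# For the BSC the supremum is attained by the uniform input, so the
# search recovers C = 1 - H(p):
print(bsc_capacity(0.1))   # ≈ 0.531 bits per channel use
```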
Shannon capacity of a graph
If G is an undirected graph, it can be used to define a communications channel in which the symbols are the graph vertices, and two codewords may be confused with each other if their symbols in each position are equal or adjacent. The computational complexity of finding the Shannon capacity of such a channel remains open, but it can be upper bounded by another important graph invariant, the Lovász number.^{[4]}
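As an illustration (not drawn from the article's sources), the five-cycle C5 is the smallest graph whose Shannon capacity exceeds its independence number: five length-2 codewords over its vertices are pairwise non-confusable, so the capacity is at least √5 symbols per use (and by the Lovász bound it is exactly √5). The check below verifies that classic code:

```python
from itertools import product

def confusable(u, v, n=5):
    """Symbols of the n-cycle are confusable if equal or adjacent."""
    return u == v or (u - v) % n in (1, n - 1)

def is_independent(codewords, n=5):
    """True if no two distinct codewords are confusable in every
    position, i.e. the set is independent in the strong product."""
    for a, b in product(codewords, repeat=2):
        if a != b and all(confusable(x, y, n) for x, y in zip(a, b)):
            return False
    return True

# Five length-2 codewords over C5 that can never be confused:
code = [(i, (2 * i) % 5) for i in range(5)]
print(is_independent(code))      # True
print(len(code) ** (1 / 2))      # rate 5^(1/2) ≈ 2.236 per symbol
```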
Noisy-channel coding theorem
The noisychannel coding theorem states that for any ε > 0 and for any transmission rate R less than the channel capacity C, there is an encoding and decoding scheme transmitting data at rate R whose error probability is less than ε, for a sufficiently large block length. Also, for any rate greater than the channel capacity, the probability of error at the receiver goes to one as the block length goes to infinity.
Example application
An application of the channel capacity concept to an additive white Gaussian noise (AWGN) channel with bandwidth B Hz and signal-to-noise ratio S/N is the Shannon–Hartley theorem:

C = B \log_2 \left( 1+\frac{S}{N} \right)
C is measured in bits per second if the logarithm is taken in base 2, or in nats per second if the natural logarithm is used, assuming B is in hertz. The signal and noise powers S and N are measured in watts (or volts squared), so the signal-to-noise ratio here is expressed as a power ratio, not in decibels (dB); since figures are often cited in dB, a conversion may be needed. For example, a signal-to-noise ratio of 30 dB corresponds to a power ratio of 10^{30/10} = 10^3 = 1000.
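The theorem and the dB conversion combine in a few lines; as a small sketch (the 3 kHz / 30 dB figures are the classic telephone-line example):

```python
import math

def shannon_hartley(bandwidth_hz, snr_db):
    """Capacity in bits/s of an AWGN channel, with SNR given in dB."""
    snr_linear = 10 ** (snr_db / 10)      # dB -> power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3 kHz channel at 30 dB SNR (power ratio 1000):
c = shannon_hartley(3000, 30)
print(round(c))   # → 29902, i.e. about 29.9 kbit/s
```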
Channel capacity in wireless communications
This section^{[5]} focuses on the single-antenna, point-to-point scenario. For channel capacity in systems with multiple antennas, see the article on MIMO.
AWGN channel
If the average received power is \bar{P} [W] and the noise power spectral density is N_0 [W/Hz], the AWGN channel capacity is

C_{\text{AWGN}}=W\log_2\left(1+\frac{\bar{P}}{N_0 W}\right) [bits/s],
where \frac{\bar{P}}{N_0 W} is the received signal-to-noise ratio (SNR). This result is known as the Shannon–Hartley theorem.^{[6]}
When the SNR is large (SNR >> 0 dB), the capacity C\approx W\log_2 \frac{\bar{P}}{N_0 W} is logarithmic in power and approximately linear in bandwidth. This is called the bandwidth-limited regime.
When the SNR is small (SNR << 0 dB), the capacity C\approx \frac{\bar{P}}{N_0} \log_2 e is linear in power but insensitive to bandwidth. This is called the power-limited regime.
The bandwidth-limited and power-limited regimes are illustrated in the figure.
[Figure: AWGN channel capacity with the power-limited and bandwidth-limited regimes indicated. Here, \frac{\bar{P}}{N_0}=10^6.]
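The power-limited regime can be seen numerically by holding \bar{P}/N_0 fixed and letting the bandwidth grow: the capacity saturates at (\bar{P}/N_0)\log_2 e. A brief illustrative sketch, using the same \bar{P}/N_0 = 10^6 as the figure:

```python
import math

def awgn_capacity(p_over_n0, w):
    """C = W log2(1 + P/(N0 W)) in bits/s for an AWGN channel."""
    return w * math.log2(1 + p_over_n0 / w)

p_over_n0 = 1e6                        # P/N0 in Hz, as in the figure
limit = p_over_n0 * math.log2(math.e)  # wideband (power-limited) limit

for w in (1e4, 1e6, 1e8, 1e10):
    print(w, awgn_capacity(p_over_n0, w) / limit)
# The ratio climbs toward 1: past some point, extra bandwidth barely
# helps because the channel is power-limited.
```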
Frequency-selective channel
The capacity of the frequency-selective channel is given by the so-called water-filling power allocation,

C_{N_c}=\sum_{n=0}^{N_c-1} \log_2 \left(1+\frac{P_n^* |\bar{h}_n|^2}{N_0} \right),
where P_n^*=\max \left(\frac{1}{\lambda}-\frac{N_0}{|\bar{h}_n|^2},\,0 \right) and |\bar{h}_n|^2 is the gain of subchannel n, with \lambda chosen so that the total power constraint is met.
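The water level 1/\lambda can be found numerically, for example by bisection. A sketch of such an allocator (illustrative only; the subchannel gains and power budget below are made-up values):

```python
import math

def water_filling(gains, total_power, n0=1.0):
    """Water-filling allocation: P_n = max(1/lambda - N0/|h_n|^2, 0),
    with the water level 1/lambda found by bisection so that the
    allocated powers sum to total_power."""
    lo, hi = 0.0, total_power + max(n0 / g for g in gains)
    for _ in range(100):                  # bisect on the water level
        mu = (lo + hi) / 2
        used = sum(max(mu - n0 / g, 0.0) for g in gains)
        lo, hi = (mu, hi) if used <= total_power else (lo, mu)
    return [max(mu - n0 / g, 0.0) for g in gains]

gains = [2.0, 1.0, 0.1]                   # example |h_n|^2 values
powers = water_filling(gains, total_power=10.0)
capacity = sum(math.log2(1 + p * g) for p, g in zip(powers, gains))
print(powers)      # the weakest subchannel gets no power
print(capacity)
```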
Slow-fading channel
In a slow-fading channel, where the coherence time is greater than the latency requirement, there is no definite capacity, as the maximum rate of reliable communication supported by the channel, \log_2 (1+|h|^2 SNR), depends on the random channel gain |h|^2, which is unknown to the transmitter. If the transmitter encodes data at rate R [bits/s/Hz], there is a nonzero probability that the decoding error probability cannot be made arbitrarily small,

p_{out}=\mathbb{P}\left(\log_2(1+|h|^2 SNR) < R\right),
in which case the system is said to be in outage. Since there is a nonzero probability that the channel is in deep fade, the capacity of the slow-fading channel in the strict sense is zero. However, it is possible to determine the largest value of R such that the outage probability p_{out} is less than \epsilon. This value is known as the \epsilon-outage capacity.
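As a concrete, illustrative case: under Rayleigh fading the gain |h|^2 is exponentially distributed, so both the outage probability and the \epsilon-outage capacity have closed forms, which a Monte Carlo simulation can confirm:

```python
import math, random

def outage_probability(rate, snr, trials=200_000, seed=1):
    """Monte Carlo estimate of P(log2(1 + |h|^2 SNR) < R) under
    Rayleigh fading, where |h|^2 ~ Exponential(1)."""
    rng = random.Random(seed)
    fails = sum(math.log2(1 + rng.expovariate(1.0) * snr) < rate
                for _ in range(trials))
    return fails / trials

def epsilon_outage_capacity(eps, snr):
    """Largest R with outage probability eps; closed form for
    Rayleigh fading: R = log2(1 - SNR * ln(1 - eps))."""
    return math.log2(1 - snr * math.log(1 - eps))

snr = 100.0                              # 20 dB average SNR
r = epsilon_outage_capacity(0.01, snr)   # 1%-outage capacity
print(r)                                 # ≈ 1.0 bits/s/Hz
print(outage_probability(r, snr))        # ≈ 0.01
```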
Fast-fading channel
In a fast-fading channel, where the latency requirement is greater than the coherence time and the codeword length spans many coherence periods, one can average over many independent channel fades by coding over a large number of coherence time intervals. Thus, it is possible to achieve a reliable rate of communication of \mathbb{E}(\log_2 (1+|h|^2 SNR)) [bits/s/Hz], and it is meaningful to speak of this value as the capacity of the fast-fading channel.
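As an illustration, for Rayleigh fading (|h|^2 exponentially distributed with unit mean) this ergodic capacity can be estimated by Monte Carlo averaging; by Jensen's inequality it falls below the AWGN capacity at the same average SNR:

```python
import math, random

def ergodic_capacity(snr, trials=200_000, seed=1):
    """Monte Carlo estimate of E[log2(1 + |h|^2 SNR)] under Rayleigh
    fading, where |h|^2 ~ Exponential(1) with unit mean."""
    rng = random.Random(seed)
    total = sum(math.log2(1 + rng.expovariate(1.0) * snr)
                for _ in range(trials))
    return total / trials

snr = 10.0                         # 10 dB average SNR
print(ergodic_capacity(snr))       # ≈ 2.9 bits/s/Hz
print(math.log2(1 + snr))          # AWGN bound ≈ 3.46 (Jensen)
```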
See also
Advanced Communication Topics
External links

Hazewinkel, Michiel, ed. (2001), "Transmission rate of a channel", Encyclopedia of Mathematics, Springer.

AWGN Channel Capacity with various constraints on the channel input (interactive demonstration)
References

^ Saleem Bhatti. "Channel capacity". Lecture notes for M.Sc. Data Communication Networks and Distributed Systems D51 – Basic Communications and Networks.

^ Jim Lesurf. "Signals look like noise!". Information and Measurement, 2nd ed.

^ Thomas M. Cover, Joy A. Thomas (2006). Elements of Information Theory. John Wiley & Sons, New York.

^ Lovász, László (1979), "On the Shannon Capacity of a Graph", IEEE Transactions on Information Theory, 25 (1): 1–7.

^ David Tse, Pramod Viswanath (2005), Fundamentals of Wireless Communication, Cambridge University Press, UK

^ The Handbook of Electrical Engineering. Research & Education Association. 1996. p. D149.

