Channel capacity

Channel capacity is the amount of discrete information that can be reliably transmitted over a channel. By the noisy-channel coding theorem, the channel capacity of a given channel is the limiting information transport rate (in units of information per unit time) that can be achieved with vanishingly small error probability.

Information theory, developed by Claude E. Shannon in 1948, defines the notion of channel capacity and provides a mathematical model by which one can compute the maximal amount of information that can be carried by a channel. The key result states that the capacity of the channel, as defined above, is given by the maximum of the mutual information between the input and output of the channel, where the maximization is with respect to the input distribution.
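
For example (a standard textbook illustration, not part of the original article): for a binary symmetric channel that flips each transmitted bit with crossover probability p, the maximizing input distribution is uniform and the capacity works out to C = 1 - H(p), where H is the binary entropy function. A minimal sketch in Python:

 from math import log2

 def bsc_capacity(p):
     """Capacity (bits per channel use) of a binary symmetric channel
     with crossover probability p: C = 1 - H(p)."""
     if p in (0.0, 1.0):
         return 1.0  # noiseless (or deterministically flipped) channel
     h = -p * log2(p) - (1 - p) * log2(1 - p)  # binary entropy H(p)
     return 1.0 - h

 print(bsc_capacity(0.1))  # ~0.531: nearly half the raw bit rate is lost to noise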

Mathematical Definition

                               o---------o
                               |  Noise  |
                               o---------o
                                    |
                                    V
o--------o  M  o---------o  X  o---------o  Y  o---------o  M' o----------o
| Source |---->| Encoder |---->| Channel |---->| Decoder |---->| Receiver |  
o--------o     o---------o     o---------o     o---------o     o----------o

Here X represents the space of messages transmitted, and Y the space of messages received during a unit time over our channel. Let p(y|x) be the conditional probability distribution function of Y given X. We will consider p(y|x) to be an inherent fixed property of our communications channel (representing the nature of the noise of our channel). Then the joint distribution of X and Y is completely determined by our channel and by our choice of f(x), the marginal distribution of messages we choose to send over the channel. Under these constraints, we would like to maximize the amount of information, or the signal, we can communicate over the channel. The appropriate measure for this is the mutual information; its maximum over all input distributions is the channel capacity, given by:

 C = \max_f I(X;Y).
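
In general this maximization has no closed form, but it can be computed numerically. One standard method (not discussed in the original article) is the Blahut-Arimoto algorithm, which alternates two closed-form updates until the input distribution converges. A sketch, assuming the channel is supplied as a transition matrix W[x, y] = p(y|x) whose rows sum to one and whose output symbols are all reachable:

 import numpy as np

 def channel_capacity(W, iters=200):
     """Blahut-Arimoto sketch: approximate capacity (bits) of a discrete
     memoryless channel with transition matrix W[x, y] = p(y|x)."""
     nx, _ = W.shape
     f = np.full(nx, 1.0 / nx)              # input distribution, start uniform
     for _ in range(iters):
         joint = f[:, None] * W             # joint p(x, y)
         q = joint / joint.sum(axis=0, keepdims=True)  # posterior p(x|y)
         # update: f(x) proportional to exp(sum_y p(y|x) * ln q(x|y))
         log_f = (W * np.log(q + 1e-300)).sum(axis=1)
         f = np.exp(log_f - log_f.max())
         f /= f.sum()
     joint = f[:, None] * W
     p_y = joint.sum(axis=0)
     mask = joint > 0                       # zero-probability terms contribute 0
     ratio = joint[mask] / (f[:, None] * p_y[None, :])[mask]
     return (joint[mask] * np.log2(ratio)).sum(), f

 # Binary symmetric channel with crossover probability 0.1:
 W = np.array([[0.9, 0.1],
               [0.1, 0.9]])
 C, f_opt = channel_capacity(W)
 print(C, f_opt)   # ~0.531 bits per use, attained at the uniform input [0.5, 0.5]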

Noisy channel coding theorem

The channel coding theorem states that for any  \epsilon > 0 and for any rate less than the channel capacity, there is an encoding and decoding scheme that can be used to ensure that the probability of block error is less than  \epsilon for a sufficiently long message block. Conversely, for any rate greater than the channel capacity, the probability of block error at the receiver goes to one as the block length goes to infinity.
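
To make the tradeoff concrete (an illustrative sketch, not part of the original article): a length-n repetition code over a binary symmetric channel with crossover probability 0.1 drives the block error probability toward zero as n grows, but its rate is only 1/n. The theorem guarantees that far better codes exist, with rates up to C ≈ 0.531 achievable while the error probability still vanishes:

 from math import comb

 def repetition_block_error(n, p):
     """Probability that majority decoding of an n-bit repetition code
     fails over a BSC with crossover probability p (n odd)."""
     return sum(comb(n, k) * p**k * (1 - p)**(n - k)
                for k in range((n + 1) // 2, n + 1))

 for n in (1, 3, 5, 11, 21):
     print(f"rate {1/n:.3f}  block error {repetition_block_error(n, 0.1):.2e}")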


This page uses Creative Commons Licensed content from Wikipedia.
