Psychology Wiki



[Figure: The Shannon–Weaver model as portrayed in a report from the United States Office of Technology Assessment]

The Shannon–Weaver model of communication, one of the first models of communication, has been called the "mother of all models."[1] It embodies the concepts of information source, message, transmitter, signal, channel, noise, receiver, information destination, probability of error, coding, decoding, information rate, and channel capacity.
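The chain of components in the model (source → transmitter → channel with noise → receiver → destination) can be sketched as a minimal simulation. This is an illustrative sketch only; the bit-flip noise model and the function names are assumptions, not part of the model's original description:

```python
import random

def transmitter(message):
    """Encode the source's message into a binary signal (8 bits per character)."""
    return [int(b) for ch in message for b in format(ord(ch), "08b")]

def channel(signal, error_prob):
    """Carry the signal; noise flips each bit with probability error_prob."""
    return [bit ^ 1 if random.random() < error_prob else bit for bit in signal]

def receiver(signal):
    """Decode the binary signal back into a message for the destination."""
    chunks = [signal[i:i + 8] for i in range(0, len(signal), 8)]
    return "".join(chr(int("".join(map(str, byte)), 2)) for byte in chunks)

sent = "hello"
# Over a noiseless channel (error_prob=0) the destination recovers the message exactly;
# raising error_prob introduces a nonzero probability of error.
received = receiver(channel(transmitter(sent), error_prob=0.0))
print(received)
```

With a nonzero `error_prob`, the received message may differ from the one sent, which is precisely the situation the model's notions of noise, coding, and probability of error address.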

In 1948 Claude Elwood Shannon published the article "A Mathematical Theory of Communication" in two parts, in the July and October issues of the Bell System Technical Journal.[2] In this fundamental work he used tools from probability theory, developed by Norbert Wiener, which were then in the nascent stages of being applied to communication theory. Shannon developed information entropy as a measure of the uncertainty in a message, essentially inventing what became known as information theory.
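Shannon's entropy of a source whose symbols occur with probabilities p₁, …, pₙ is H = −Σ pᵢ log₂ pᵢ, measured in bits per symbol. A minimal computation (the example probabilities are illustrative):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(entropy([0.5, 0.5]))
# A biased source is more predictable, so its entropy is lower (about 0.469 bits).
print(entropy([0.9, 0.1]))
```

The more predictable the source, the lower its entropy, and hence the fewer bits per symbol are needed, on average, to encode its messages.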

The book co-authored with Warren Weaver, The Mathematical Theory of Communication, reprints Shannon's 1948 article and Weaver's popularization of it, which is accessible to the non-specialist.[3] Shannon's concepts were also popularized, subject to his own proofreading, in John Robinson Pierce's Symbols, Signals, and Noise.[4]

The term Shannon–Weaver model was widely adopted in social science fields such as education, organizational analysis, and psychology. In engineering and mathematics, Shannon's theory is used more literally and is referred to as Shannon theory or information theory.[5]

Shannon's formula is C = W log₂(1 + S/N),

where C is the channel capacity measured in bits per second, W is the bandwidth in hertz, S is the signal power in watts across the bandwidth W, and N is the noise power in watts within the bandwidth W.
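With these definitions, the capacity can be computed directly. For example, a 3 kHz channel with a signal-to-noise ratio of 1000 (30 dB) — figures chosen here purely for illustration — has a capacity of roughly 30 kbit/s:

```python
import math

def channel_capacity(bandwidth_hz, signal_watts, noise_watts):
    """Shannon capacity C = W * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + signal_watts / noise_watts)

# 3 kHz bandwidth, S/N = 1000 (30 dB): about 29.9 kbit/s.
print(channel_capacity(3000, 1000, 1))
```

No coding scheme can reliably transmit information through such a channel faster than C; conversely, Shannon showed that rates arbitrarily close to C are achievable with a vanishing probability of error.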

References

  1. David D. Woods and Erik Hollnagel (2005). Joint Cognitive Systems: Foundations of Cognitive Systems Engineering, Boca Raton, FL: Taylor & Francis.
  2. Claude Shannon (1948). A Mathematical Theory of Communication. Bell System Technical Journal 27 (July and October): pp. 379–423, 623–656.
  3. Warren Weaver and Claude Elwood Shannon (1963). The Mathematical Theory of Communication, Univ. of Illinois Press.
  4. John Robinson Pierce (1980). An Introduction to Information Theory: Symbols, Signals & Noise, Courier Dover Publications.
  5. Sergio Verdú (2000). "Fifty years of Shannon theory". In Sergio Verdú and Steven W. McLaughlin (eds.), Information Theory: 50 Years of Discovery, 13–34, IEEE Press.



This page uses Creative Commons Licensed content from Wikipedia (view authors).