The measurement and quantification of information. These ideas are applied to the probabilistic analysis of transmitting information over a channel on which random distortion of the message occurs.
At the level of The Theory of Information and Coding (McEliece) or Information Theory (Ash)
- Definitions: measure of uncertainty (entropy), measure of information, examples; terminology for input, output, and channel
- Fundamental Theorem of Information Theory (the noisy-channel coding theorem) for the discrete memoryless channel
- Information sources, ergodicity, and the Shannon-McMillan Theorem. Examples and consequences
- Other topics, e.g. error-detecting and error-correcting codes
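As a concrete illustration of the entropy and channel notions in the topics above, here is a minimal sketch computing Shannon entropy and the capacity of a binary symmetric channel. The function names (`entropy`, `bsc_capacity`) are our own for illustration, not notation from the course.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p), in bits.
    Terms with p == 0 contribute 0 by the usual convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover
    probability eps: C = 1 - H(eps), in bits per channel use."""
    return 1.0 - entropy([eps, 1.0 - eps])

# A fair coin has maximal uncertainty: 1 bit.
print(entropy([0.5, 0.5]))   # 1.0
# A noiseless binary channel (eps = 0) has capacity 1 bit.
print(bsc_capacity(0.0))     # 1.0
# At eps = 0.5 the output is independent of the input: capacity 0.
print(bsc_capacity(0.5))     # 0.0
```

The capacity formula C = 1 - H(eps) is the simplest instance of the channel capacities whose achievability is the content of the Fundamental Theorem for the discrete memoryless channel.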