
Information Theory (MATH 34600)

Academic Year:

Administrative Information

  1. Unit number and title: MATH 34600 Information Theory
  2. Level: H/6
  3. Credit point value: 10 credit points
  4. Year: 12/13
  5. First Given in this form: 2005-06
  6. Unit Organiser: Milan Mosonyi
  7. Lecturer: Milan Mosonyi
  8. Teaching block: 2
  9. Prerequisites: MATH 11300 Probability 1 OR Level 2 Physics (MATH 11400 Statistics 1 is helpful, but not necessary)

Unit aims

To give a rigorous and modern introduction to Shannon's theory of information, with emphasis on fundamental concepts and mathematical techniques.

General Description of the Unit

Shannon's information theory is one of the great intellectual achievements of the 20th century, which half a century after its conception continues to inspire communications engineering and to generate challenging mathematical problems. Recently it has extended dramatically into physics as quantum information theory. The course is about the fundamental ideas of this theory: data compression, communication via channels, error correcting codes, and simulations.

It is a statistical theory, so notions of probability play a central role; in particular, laws of large numbers and the concept of entropy are fundamental (and will be discussed in the course). The course covers two models of data compression, leading to entropy as the crucial quantity; an introduction to noisy channels and error correcting codes, culminating in Shannon's channel coding theorem; the "reverse Shannon theorem" for noisy channels and rate-distortion coding; and information theoretical hypothesis testing (the theory of identification).
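The role of entropy as the crucial quantity in data compression can be made concrete: for a memoryless source with symbol probabilities p_i, the optimal compression rate is the Shannon entropy H = -Σ p_i log₂ p_i bits per symbol. A minimal sketch (the source distribution is an invented example, not from the course):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A hypothetical memoryless source over four symbols:
source = [0.5, 0.25, 0.125, 0.125]
print(entropy(source))  # 1.75 bits per symbol
```

No lossless code can compress this source below 1.75 bits per symbol on average, and block codes can approach that rate.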

The course aims to demonstrate information theoretical modelling, and the mathematical techniques required will be rigorously developed.

It is a natural companion to the Quantum Information course offered in Mathematics (MATH M5610), and to a certain degree to Cryptography B (COMSM 0007), offered in Computer Science, and Communications (EENG 22000), in Electrical Engineering. It may also interest physicists who have attended Statistical Physics (PHYS 30300).

Relation to Other Units

The probabilistic nature of the problems considered and of the mathematical modelling in information theory relates this unit to the probability and statistics units at Levels 4, 5 and 6. It is very well suited as a companion to the Quantum Information unit.

Related courses in Computer Science: Cryptography B, in Electrical Engineering: Communications, and in Physics: Statistical Physics.

Teaching Methods

Lectures, supported by exercises to be done by students, and problem classes.

Learning Objectives

This unit should enable students to:

  • understand how information problems are modelled and solved;
  • model and solve problems of information theory: data compression and channel coding;
  • discuss basic concepts such as entropy, mutual information, relative entropy, capacity;
  • use information theoretical methods to tackle information theoretical problems, in particular the probabilistic method and information calculus.
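The basic quantities listed above are all computable directly from a joint distribution: for example, the mutual information satisfies I(X; Y) = H(X) + H(Y) − H(X, Y). A short sketch (the joint distribution is an invented example):

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) of two binary variables:
joint = [[0.4, 0.1],
         [0.1, 0.4]]
px = [sum(row) for row in joint]            # marginal of X
py = [sum(col) for col in zip(*joint)]      # marginal of Y
hxy = H([p for row in joint for p in row])  # joint entropy H(X, Y)
mutual_info = H(px) + H(py) - hxy           # I(X; Y), about 0.278 bits
```

Mutual information is zero exactly when X and Y are independent; here the variables agree with probability 0.8, so it is strictly positive.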

Assessment Methods

The final assessment mark for Information Theory is calculated from a 1½-hour written examination consisting of THREE questions. A candidate's best TWO answers will be used for assessment. Calculators of an approved type (non-programmable, no text facility) are allowed. From 2012-13 ONLY calculators carrying a 'Faculty of Science approved' sticker will be allowed in the examination room.

Award of Credit Points

Credit points for the unit are gained by:

  • either: a final assessment mark of 40 or more,
  • or: a final assessment mark of 30 or more AND a reasonable performance at specified homework during the course.

Transferable Skills

Mathematical - Knowledge of basic information theory; probabilistic reasoning.

General skills - Modelling, problem solving and logical analysis. Assimilation and use of complex and novel ideas.

Texts

There are many textbooks on the elements of information theory. The course will follow such texts only in its first half; the second half presents more modern material. Nevertheless, as a rigorous and affordable companion for the student, I can recommend

  • R B Ash. Information Theory, Dover Publications, 1990
  • T M Cover & J A Thomas. Elements of Information Theory, Wiley Interscience, 1991.

Other useful references are:

  • C E Shannon & W Weaver. The Mathematical Theory of Communication, University of Illinois Press, 1963. 
  • I Csiszár & J Körner. Information Theory: Coding Theorems for Discrete Memoryless Systems (2nd ed.), Akadémiai Kiadó, Budapest, 1997.

The course only requires elementary probability theory, but students who have taken further probability will find some of the course content easier. A very good reference is

  • G R Grimmett & D Welsh. Probability: An Introduction, Oxford University Press, 1986.

Additional reading for the probabilistic method:

  • N Alon & J H Spencer. The Probabilistic Method (2nd ed.), Wiley Interscience, 2000.

Syllabus

  • Information sources and review of probability
  • Variable length and block data compression; entropy
  • Noisy channels
  • Error correcting codes
  • Shannon's channel coding theorem; mutual information and relative entropy
  • Further topics (time permitting), e.g. Gaussian channels and signal-to-noise ratio, cryptography, ...
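As one concrete instance of the channel coding material in the syllabus, the capacity of the binary symmetric channel with crossover probability p is C = 1 − h(p), where h is the binary entropy function. A minimal sketch (the value p = 0.11 is an arbitrary example):

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))   # noiseless channel: 1 bit per use
print(bsc_capacity(0.11))  # roughly 0.5 bits per use
```

Shannon's channel coding theorem says that reliable communication is possible at any rate below C, and impossible above it.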
