Information Theory

Objective

The aim of this course is to support students in learning the principles, concepts and applications of information theory. Information theory is a branch of applied mathematics concerned with quantifying data so that as much data as possible can be reliably stored on a medium or communicated over a channel. The measure of information, known as information entropy, is usually expressed as the average number of bits needed for storage or communication.
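
As an illustration of the entropy measure described above, here is a minimal Python sketch that computes the Shannon entropy of a discrete source in bits; the four-symbol source distribution is a made-up example, not part of the course material.

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H(X) = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical four-symbol source: a uniform choice would need 2 bits per
# symbol, but this skewed distribution averages only 1.75 bits per symbol.
source = [0.5, 0.25, 0.125, 0.125]
print(f"H(X) = {entropy(source):.3f} bits per symbol")
```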

Course Contents

  • Concepts of entropy and information.
  • Basic definitions of probability.
  • Source coding.
  • Channel capacity (illustrated in the sketch following this list).
  • Channel coding.
  • Shannon's theorem.
  • Error correction codes and decoding methods.
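
As a concrete companion to the channel-capacity and Shannon's-theorem topics listed above, the following Python sketch computes the capacity of a binary symmetric channel, C = 1 - H_b(p); the crossover probability 0.11 is only an illustrative value.

```python
import math

def binary_entropy(p):
    """Binary entropy function H_b(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H_b(p) of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

# Example: with an 11% bit-flip probability the channel still supports
# reliable communication at any rate below about 0.5 bits per channel use.
print(f"C = {bsc_capacity(0.11):.3f} bits per channel use")
```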

Recommended Reading

  • Cover, T.M. & Thomas, J.A. (2006): Elements of Information Theory, 2nd Edition, Wiley.
  • MacKay, D.J.C. (2003): Information Theory, Inference, and Learning Algorithms, Cambridge University Press.