Course Details

Subject {L-T-P / C} : EE6139 : Information Theory and Coding {3-0-0 / 3}
Subject Nature : Theory
Coordinator : Prof. Prasanna Kumar Sahu


1. Information Theory: Entropy, Relative Entropy, and Mutual Information; Entropy Rates of a Stochastic Process; Data Compression; Differential Entropy; The Gaussian Channel; Maximum Entropy and Spectral Estimation; Information Theory and Statistics
2. Coding: Block Codes, Cyclic Codes, BCH Codes, Reed-Solomon Codes, Convolutional Codes, Turbo Codes, Low-Density Parity-Check Codes
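As an illustration of the first syllabus topic, the entropy of a discrete distribution measures the average information content per symbol. The sketch below is not part of the course material, just a minimal worked example in Python:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so each toss is less informative.
print(entropy([0.9, 0.1]))
```

The `p > 0` guard reflects the convention 0 log 0 = 0 used throughout information theory.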

Course Objectives

  1. To equip students with a basic understanding of the fundamental concepts of entropy and information as they are used in communications
  2. To enhance students' knowledge of probability, entropy, and measures of information
  3. To guide students through the implications and consequences of the fundamental theorems and laws of information theory and coding theory, with reference to their applications in modern communication and computer systems

Course Outcomes

Calculate the information content of a random variable from its probability distribution.
Express the joint, conditional, and marginal entropies of random variables in terms of their joint probabilities.
Define channel capacities and their properties using Shannon's theorems.
Construct efficient codes for data on imperfect communication channels.
Generalize the discrete concepts to continuous signals on continuous channels.
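One concrete instance of the capacity outcome above is the binary symmetric channel, whose capacity C = 1 − H(p) follows from Shannon's noisy-channel coding theorem. The snippet below is an illustrative sketch, not course-supplied code:

```python
import math

def binary_entropy(p):
    """Binary entropy function H(p) in bits, with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p.

    C = 1 - H(p) bits per channel use: by Shannon's noisy-channel coding
    theorem, this is the supremum of reliably achievable code rates.
    """
    return 1 - binary_entropy(p)

print(bsc_capacity(0.0))   # noiseless channel: 1 bit per use
print(bsc_capacity(0.5))   # output independent of input: 0 bits per use
```

Note that capacity is symmetric in p: a channel that flips every bit (p = 1) is as useful as a noiseless one, since the receiver can simply invert its output.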

Essential Reading

  1. Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, John Wiley, 2014 edition
  2. Jorge C. Moreira and Patrick G. Farrell, Essentials of Error-Control Coding, John Wiley, 2015 edition

Supplementary Reading

  1. John C. Hancock, An Introduction to the Principles of Communication Theory, McGraw-Hill, 2015 edition
  2. Monica Borda, Fundamentals in Information Theory and Coding, Springer, 2011 edition