SE 376. Information Theory and Ergodic Phenomena
(3 - 0 - 0 - 0 - 4)
Shannon's information measure: history and axiomatic development
Relative entropy and mutual information
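For orientation, the central quantities behind the first two topics can be stated compactly; the display below is an illustrative summary in standard notation, not part of the official syllabus text.

```latex
% Entropy, relative entropy, and mutual information (discrete case):
\[
H(X) = -\sum_{x} p(x)\log p(x), \qquad
D(p\|q) = \sum_{x} p(x)\log\frac{p(x)}{q(x)},
\]
\[
I(X;Y) = \sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)}
       = D\bigl(p(x,y)\,\big\|\,p(x)\,p(y)\bigr).
\]
```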
Law of large numbers and ergodic theorem
Shannon-McMillan-Breiman (Fundamental) theorem for countable and uncountable alphabets
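An illustrative statement of the theorem in the discrete stationary ergodic case (added for orientation; the course also treats more general alphabets):

```latex
% Shannon-McMillan-Breiman: for a stationary ergodic source with
% entropy rate H,
\[
-\frac{1}{n}\,\log p(X_1,\dots,X_n) \;\xrightarrow{\text{a.s.}}\; H
\quad\text{as } n \to \infty.
\]
```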
Noiseless coding theorem and data compression
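As an illustration of what the noiseless coding theorem asserts, in the standard textbook form (e.g. Cover and Thomas):

```latex
% Kraft inequality for a uniquely decodable D-ary code with lengths l(x):
\[
\sum_{x} D^{-l(x)} \;\le\; 1,
\]
% and the minimum expected codeword length L^* of a prefix code satisfies
\[
H_D(X) \;\le\; L^{*} \;<\; H_D(X) + 1.
\]
```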
Kolmogorov complexity
A universal (Lempel-Ziv) compression algorithm and its optimality for ergodic processes
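A minimal Python sketch of LZ78-style incremental parsing, the dictionary-building idea behind this topic (all names and the sample string are illustrative assumptions, not a prescribed implementation):

```python
# LZ78-style incremental parsing: split the input into phrases, each
# encoded as (index of longest previously seen phrase, next symbol).
# A sketch for illustration only; names here are hypothetical.

def lz78_parse(seq):
    dictionary = {"": 0}          # phrase -> index; 0 is the empty phrase
    phrases = []
    current = ""
    for symbol in seq:
        candidate = current + symbol
        if candidate in dictionary:
            current = candidate   # keep extending the match
        else:
            phrases.append((dictionary[current], symbol))
            dictionary[candidate] = len(dictionary)  # new phrase
            current = ""
    if current:                   # flush a final, incomplete phrase
        phrases.append((dictionary[current[:-1]], current[-1]))
    return phrases

# Example: phrase indices grow as the dictionary learns the source.
print(lz78_parse("aababbbababaa"))
# [(0, 'a'), (1, 'b'), (2, 'b'), (0, 'b'), (2, 'a'), (4, 'a'), (0, 'a')]
```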
Differential entropy, rate-distortion theorem (and quantization)
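Two standard displays for this line, included for orientation (the Gaussian example assumes squared-error distortion):

```latex
% Differential entropy of a density f and the rate-distortion function:
\[
h(X) = -\int f(x)\,\log f(x)\,dx, \qquad
R(D) = \min_{p(\hat{x}\mid x)\,:\,\mathbb{E}\,d(X,\hat{X}) \le D} I(X;\hat{X}).
\]
% For a Gaussian source with variance \sigma^2 and squared-error distortion:
\[
R(D) = \tfrac{1}{2}\log\frac{\sigma^2}{D}, \qquad 0 < D \le \sigma^2.
\]
```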
Channel capacity
Coding theorems for discrete and continuous (in time and in alphabet) channels, including Gaussian channels
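For orientation, the capacity definition with two textbook instances (binary symmetric and discrete-time Gaussian channels):

```latex
% Capacity of a memoryless channel:
\[
C = \max_{p(x)} I(X;Y).
\]
% Binary symmetric channel with crossover probability p:
\[
C = 1 - H_2(p), \qquad H_2(p) = -p\log_2 p - (1-p)\log_2(1-p).
\]
% Discrete-time Gaussian channel with power P and noise variance N:
\[
C = \tfrac{1}{2}\log_2\Bigl(1 + \frac{P}{N}\Bigr) \ \text{bits per channel use}.
\]
```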
Large deviations and relative entropy, with applications to statistics and statistical mechanics
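One way this link is usually exhibited is Sanov's theorem; a loose finite-alphabet statement follows (regularity conditions on the set E are suppressed):

```latex
% Sanov: for the empirical distribution \hat{P}_n of n i.i.d. samples
% from Q and a suitably regular set E of distributions,
\[
\frac{1}{n}\,\log \Pr\{\hat{P}_n \in E\} \;\longrightarrow\; -\inf_{P \in E} D(P\|Q).
\]
```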
Extensions of Shannon's information measure (e.g., Rényi's information)
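An illustrative instance of such an extension:

```latex
% R\'enyi entropy of order \alpha (\alpha > 0, \alpha \neq 1), which
% recovers Shannon entropy as \alpha \to 1:
\[
H_\alpha(X) = \frac{1}{1-\alpha}\,\log\sum_{x} p(x)^\alpha,
\qquad \lim_{\alpha\to 1} H_\alpha(X) = H(X).
\]
```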
Distance measures and applications
References
- Robert Ash (1965), Information Theory, Wiley-Interscience; reprinted in paperback by Dover Publications (1990).
- T.M. Cover and J.A. Thomas (1991), Elements of Information Theory, Wiley.
- Toby Berger (1971), Rate Distortion Theory: A Mathematical Basis for Data Compression, Prentice Hall.
- Robert Ash (1975), Topics in Stochastic Processes, Academic Press.
- C. Arndt (2001), Information Measures: Information and its Description in Science and Engineering, Springer.
- Amir Dembo and Ofer Zeitouni (1998), Large Deviations Techniques and Applications, Springer.
- Selected issues of the Annals of Probability and the IEEE Transactions on Information Theory.
- Prerequisites:
- Exposure to probability through ESO 209 or an equivalent or higher-level course, and mathematical orientation.
- Desirable background:
- Exposure to communication theory / mathematical analysis / advanced probability theory.
- Proposed by:
- Dr. R.K. Bansal, Department of Electrical Engineering.
- Semester:
- Odd
- Eligibility:
- Fourth-year students.