Information Theory

3 credits, 3 hours
This class begins with a review of random processes and then provides an introduction to information theory. The following is a rough outline of the material covered:
1. Entropy, relative entropy, and mutual information,
2. The asymptotic equipartition property,
3. Entropy rates of stochastic processes,
4. Data compression (Huffman coding, Shannon codes, the Kraft inequality, etc.),
5. Information theory and gambling,
6. Channel capacity,
7. Differential entropy,
8. Gaussian channels.
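As a small illustration of how the entropy and data-compression topics above connect, here is a hedged Python sketch (not course material, just an example): it computes the Shannon entropy of a distribution and the expected length of a binary Huffman code for it. The distribution `p` is a hypothetical example chosen so the probabilities are dyadic, in which case the Huffman code meets the entropy bound exactly.

```python
import heapq
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given distribution."""
    # Heap entries: (probability, tie-breaker, symbol indices in this subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        # Each merge pushes every symbol in the merged subtrees one level deeper
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

# Hypothetical dyadic distribution, for illustration only
p = [0.5, 0.25, 0.125, 0.125]
H = entropy(p)
L = sum(pi * li for pi, li in zip(p, huffman_lengths(p)))
# Source coding theorem: H <= L < H + 1; here L equals H (1.75 bits)
```

For general (non-dyadic) distributions the expected Huffman length `L` falls strictly between `H` and `H + 1`, which is the bound studied in the data-compression part of the course.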