ECE Course Outline

ECE6605

Information Theory (3-0-3)

Prerequisites
ECE 3075
Corequisites
None
Catalog Description
Introduces the mathematical theory of communication. Emphasis is placed on Shannon's theorems and their use in the analysis and design of communication systems.
Textbook(s)
Cover and Thomas, Elements of Information Theory (2nd edition), Wiley-Interscience, 2006. ISBN 9780471241959 (required)

Topical Outline
Entropy and Mutual Information
-Joint Entropy, Conditional Entropy
-Data Processing Theorem
-Fano's Inequality
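
The identities in this unit can be checked numerically; a minimal sketch (the 2x2 joint distribution is illustrative, not from the course):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical joint distribution p(x, y) over a 2x2 alphabet.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

px = [sum(row) for row in joint]                              # marginal of X
py = [sum(joint[i][j] for i in range(2)) for j in range(2)]   # marginal of Y
hxy = entropy([p for row in joint for p in row])              # joint entropy H(X,Y)

# Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y).
mi = entropy(px) + entropy(py) - hxy
```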

Asymptotic Equipartition Property
-Typical Sequences
-Entropy, Source Coding and the AEP
-Joint Typicality (Neuhoff/Forney notes)

Entropy Rate
-Conditional Independence and Markov Chains
-Entropy Rate
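
For a stationary Markov chain the entropy rate reduces to H(X2 | X1); a two-state sketch (the transition matrix is illustrative):

```python
import math

def entropy_rate(P):
    """Entropy rate (bits/symbol) of a stationary 2-state Markov chain with
    transition matrix P, computed as H(X2 | X1) under the stationary law."""
    a, b = P[0][1], P[1][0]            # flip probabilities out of each state
    pi = [b / (a + b), a / (a + b)]    # stationary distribution (closed form)
    def h2(p):                          # binary entropy function
        return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return pi[0] * h2(a) + pi[1] * h2(b)

rate = entropy_rate([[0.9, 0.1], [0.3, 0.7]])
```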

Lossless Source Coding
-Kraft Inequality
-Shannon and Huffman Codes
-Shannon-Fano-Elias Codes
-Arithmetic Codes
-Lempel-Ziv Codes
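
A minimal Huffman-code sketch tying this unit to the Kraft inequality (the pmf is illustrative; only codeword lengths are tracked):

```python
import heapq

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given pmf.
    Min-heap sketch; ties broken by an insertion counter."""
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    count = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for sym in s1 + s2:          # every symbol in a merged set gains one bit
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, count, s1 + s2))
        count += 1
    return lengths

lengths = huffman_lengths([0.5, 0.25, 0.125, 0.125])
kraft = sum(2 ** -l for l in lengths)   # Kraft sum; equals 1 for a complete code
```

For this dyadic pmf the Huffman lengths meet the entropy bound with equality.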

Channel Capacity
-Symmetric Channels
-Discrete Memoryless Channels and Their Capacity
-Arimoto-Blahut Algorithm
-Proof of the Channel Coding Theorem
-Converse of the Channel Coding Theorem
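
A sketch of the Blahut-Arimoto iteration for DMC capacity, checked on a binary symmetric channel where the closed form C = 1 - H2(p) is known (the channel and iteration count are illustrative):

```python
import math

def blahut_arimoto(W, iters=200):
    """Capacity estimate (bits/use) of a DMC with W[x][y] = p(y|x),
    via the Blahut-Arimoto alternating maximization."""
    nx, ny = len(W), len(W[0])
    r = [1.0 / nx] * nx                  # input distribution, start uniform
    for _ in range(iters):
        # output distribution induced by r
        col = [sum(r[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        # update: r(x) proportional to exp( sum_y W[x][y] log q(x|y) ),
        # where q(x|y) = r(x) W[x][y] / col[y] is the input posterior
        logw = [sum(W[x][y] * math.log(r[x] * W[x][y] / col[y])
                    for y in range(ny) if W[x][y] > 0) for x in range(nx)]
        m = max(logw)
        w = [math.exp(l - m) for l in logw]
        tot = sum(w)
        r = [v / tot for v in w]
    # evaluate I(r, W) at the final input distribution
    col = [sum(r[x] * W[x][y] for x in range(nx)) for y in range(ny)]
    return sum(r[x] * W[x][y] * math.log2(W[x][y] / col[y])
               for x in range(nx) for y in range(ny) if W[x][y] > 0)

# BSC with crossover 0.1; closed form gives C = 1 - H2(0.1) ≈ 0.531
cap = blahut_arimoto([[0.9, 0.1], [0.1, 0.9]])
```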

Differential Entropy
-Entropy, Mutual Information, AEP for Continuous rv's

Gaussian Channel
-Capacity of AWGN, Bandlimited AWGN Channels
-Capacity of Nonwhite Channels: Water-Filling
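
The water-filling allocation over parallel Gaussian subchannels can be sketched by bisecting on the water level (the noise levels and power budget below are illustrative):

```python
def water_fill(noise, power, tol=1e-12):
    """Water-filling over parallel Gaussian subchannels with noise levels
    noise[i] and total power budget `power`. Bisection on the water level
    mu, with allocation P_i = max(mu - N_i, 0)."""
    lo, hi = min(noise), max(noise) + power
    while hi - lo > tol:
        mu = (lo + hi) / 2
        used = sum(max(mu - n, 0.0) for n in noise)
        if used > power:
            hi = mu    # level too high: spending more than the budget
        else:
            lo = mu
    mu = (lo + hi) / 2
    return [max(mu - n, 0.0) for n in noise]

# three subchannels; the noisiest one receives no power at this budget
alloc = water_fill([1.0, 2.0, 4.0], power=3.0)
```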

Rate Distortion Theory
-Quantization
-Rate Distortion Functions
-Vector Quantization
-Vector Quantization Gains
-Vector Quantization Design
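
Quantizer design in this unit is typically via the Lloyd algorithm; a one-dimensional sketch on hypothetical sample data:

```python
def lloyd(samples, k=2, iters=50):
    """One-dimensional Lloyd algorithm sketch: alternate nearest-codeword
    partition and centroid (mean) update to design a k-level quantizer."""
    samples = sorted(samples)
    # initialize the codebook with evenly spaced sample quantiles
    code = [samples[(i * len(samples)) // k] for i in range(k)]
    for _ in range(iters):
        cells = [[] for _ in range(k)]
        for s in samples:                     # nearest-codeword partition
            j = min(range(k), key=lambda i: (s - code[i]) ** 2)
            cells[j].append(s)
        # centroid update; keep the old codeword if a cell is empty
        code = [sum(c) / len(c) if c else code[i] for i, c in enumerate(cells)]
    return code

# two well-separated clusters; the codebook converges to their means
codebook = lloyd([0.0, 0.1, 0.2, 1.0, 1.1, 1.2], k=2)
```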

Multiuser Information Theory (as time allows)

Information Theory and Statistics (as time allows)