

ECE6605 Course Syllabus


Information Theory (3-0-3)

Technical Interest

Prerequisites
ECE 3075


Catalog Description
To introduce the mathematical theory of communications. Emphasis will be placed on Shannon's theorems and their use in the analysis and design of communication systems.

Textbook(s)
Cover and Thomas, Elements of Information Theory (2nd edition), Wiley-Interscience, 2006. ISBN 9780471241959 (required)

Strategic Performance Indicators (SPIs)
SPIs are a subset of the abilities a student will be able to demonstrate upon successfully completing the course.

Topical Outline
Entropy and Mutual Information
-Joint Entropy, Conditional Entropy
-Data Processing Theorem
-Fano's Inequality
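
The quantities in this block can be computed directly from a probability mass function. The sketch below (function names are my own, not from the course or the textbook) evaluates entropy and mutual information in bits:

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a probability vector p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a 2-D list."""
    px = [sum(row) for row in joint]                 # marginal of X
    py = [sum(col) for col in zip(*joint)]           # marginal of Y
    hxy = entropy([p for row in joint for p in row]) # joint entropy H(X,Y)
    return entropy(px) + entropy(py) - hxy

# A fair coin has exactly one bit of entropy.
print(entropy([0.5, 0.5]))  # 1.0
# Independent X and Y carry zero mutual information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))
```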

Asymptotic Equipartition Principle
-Typical Sequences
-Entropy, Source Coding and the AEP
-Joint Typicality (Neuhoff/Forney notes)
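
The AEP can be checked numerically for small block lengths: the sketch below (my own illustration, assuming an i.i.d. Bernoulli source) enumerates all length-n binary sequences and totals the probability of the eps-typical set, which approaches 1 as n grows:

```python
import itertools
import math

def typical_mass(p, n, eps):
    """Total probability of the eps-typical set for n i.i.d. Bernoulli(p) draws."""
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # source entropy H(p)
    mass = 0.0
    for seq in itertools.product([0, 1], repeat=n):
        k = sum(seq)
        prob = p ** k * (1 - p) ** (n - k)
        # A sequence is typical if its per-symbol log-probability is within eps of H.
        if abs(-math.log2(prob) / n - h) <= eps:
            mass += prob
    return mass

# The typical set captures most of the probability even at modest n.
print(typical_mass(0.3, 16, 0.2))
```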

Entropy Rate
-Conditional Independence and Markov Chains
-Entropy Rate
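
For a stationary Markov chain the entropy rate is H = -sum_i pi_i sum_j P_ij log2 P_ij, with pi the stationary distribution. A minimal sketch (my own, using plain power iteration rather than an eigen-solver):

```python
import math

def markov_entropy_rate(P, iters=1000):
    """Entropy rate in bits/symbol of a stationary Markov chain with transition matrix P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):  # power iteration toward the stationary distribution
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return -sum(pi[i] * P[i][j] * math.log2(P[i][j])
                for i in range(n) for j in range(n) if P[i][j] > 0)

# Symmetric two-state chain: the rate equals the binary entropy of the flip probability.
P = [[0.9, 0.1], [0.1, 0.9]]
print(markov_entropy_rate(P))  # ≈ 0.469 bits/symbol
```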

Lossless Source Coding
-Kraft Inequality
-Shannon and Huffman Codes
-Shannon, Fano, Elias Codes
-Arithmetic Codes
-Lempel Ziv Codes
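
The Huffman construction and the Kraft inequality from this block can be illustrated together; the sketch below (my own helper, tracking only codeword lengths rather than the codewords themselves) builds a binary Huffman code and verifies that the Kraft sum equals 1 for the resulting complete prefix code:

```python
import heapq

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given pmf."""
    # Each heap entry: (probability, unique id for tie-breaking, symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:  # merging deepens every leaf below the new node by one
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

lengths = huffman_lengths([0.4, 0.3, 0.2, 0.1])
# Kraft inequality holds with equality for a complete prefix code.
print(sum(2 ** -l for l in lengths))  # 1.0
```

For this pmf the expected length is 1.9 bits/symbol, within one bit of the entropy (about 1.85 bits), as the source coding theorem guarantees.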

Channel Capacity
-Symmetric Channels
-Discrete Memoryless Channels and Their Capacity
-Arimoto-Blahut Algorithm
-Proof of the Channel Coding Theorem
-Converse of Channel Coding Theorem
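
The Arimoto-Blahut algorithm listed above computes the capacity of a discrete memoryless channel by alternating updates of the input distribution. A compact sketch (my own implementation, assuming every output symbol is reachable so no division by zero occurs):

```python
import math

def blahut_arimoto(W, iters=200):
    """Capacity in bits of a DMC with transition matrix W[x][y] = p(y|x)."""
    nx, ny = len(W), len(W[0])
    p = [1.0 / nx] * nx  # start from the uniform input distribution
    for _ in range(iters):
        # Output distribution q(y) induced by the current input distribution.
        q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        # c[x] = exp of the divergence D(W(.|x) || q), in nats.
        c = [math.exp(sum(W[x][y] * math.log(W[x][y] / q[y])
                          for y in range(ny) if W[x][y] > 0))
             for x in range(nx)]
        z = sum(p[x] * c[x] for x in range(nx))
        p = [p[x] * c[x] / z for x in range(nx)]
    return math.log2(z)  # at convergence, log z is the capacity

# BSC with crossover 0.1: C = 1 - H(0.1) ≈ 0.531 bits.
print(blahut_arimoto([[0.9, 0.1], [0.1, 0.9]]))
```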

Differential Entropy
-Entropy, Mutual Information, AEP for Continuous Random Variables

Gaussian Channel
-Capacity of AWGN, Bandlimited AWGN Channels
-Capacity of Nonwhite Channels: Water Filling
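
Water filling allocates power P_i = max(mu - N_i, 0) across parallel subchannels with noise levels N_i, choosing the water level mu so the powers sum to the budget. A sketch using bisection on mu (my own illustration, not a prescribed method):

```python
def water_filling(noise, total_power, tol=1e-12):
    """Power allocation P_i = max(mu - N_i, 0) with sum P_i = total_power."""
    lo, hi = 0.0, max(noise) + total_power
    while hi - lo > tol:  # bisect on the water level mu
        mu = (lo + hi) / 2
        used = sum(max(mu - n, 0.0) for n in noise)
        if used > total_power:
            hi = mu
        else:
            lo = mu
    return [max(lo - n, 0.0) for n in noise]

# Three parallel channels: power goes to the quietest subchannels first.
alloc = water_filling([1.0, 2.0, 4.0], 3.0)
print(alloc)  # roughly [2.0, 1.0, 0.0]
```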

Rate Distortion Theory
-Rate Distortion Functions
-Vector Quantization
-Vector Quantization Gains
-Vector Quantization Design
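
The canonical closed-form rate distortion function is the Gaussian case, R(D) = max(0, (1/2) log2(sigma^2/D)) bits/sample under squared-error distortion. A one-function sketch (my own):

```python
import math

def gaussian_rate_distortion(var, D):
    """R(D) = max(0, 0.5 * log2(var / D)) bits/sample for a Gaussian source."""
    return max(0.0, 0.5 * math.log2(var / D))

# Each extra bit of rate cuts achievable distortion by a factor of 4.
print(gaussian_rate_distortion(1.0, 0.25))  # 1.0
```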

Multiuser Information Theory (as time allows)

Information Theory and Statistics (as time allows)