Information Theory
Applied Mathematics
Master Shannon's mathematical theory of information: entropy, mutual information, channel capacity, source coding, rate-distortion theory, and the deep connections to statistics, physics, and computation.
Learning Objectives
- Compute entropy and mutual information for discrete and continuous distributions
- Apply Shannon's source coding theorem to lossless compression
- Analyze channel capacity and apply the noisy channel coding theorem
- Understand rate-distortion theory and its role in lossy compression
- Work with differential entropy and Gaussian channel capacity
- Connect Kolmogorov complexity to Shannon entropy
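The first two objectives above can be sketched numerically. Below is a minimal illustration (not part of the course materials) that computes Shannon entropy H(X) = -Σ p(x) log₂ p(x) and mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y); the function names are ours, chosen for clarity.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # by convention, 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), given the joint pmf as a matrix."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1)  # marginal of X (rows)
    py = joint.sum(axis=0)  # marginal of Y (columns)
    return entropy(px) + entropy(py) - entropy(joint.ravel())

print(entropy([0.5, 0.5]))                               # fair coin: 1.0 bit
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # independent: 0.0
```

A fair coin carries exactly one bit of uncertainty, and independent variables share zero information, which makes these two cases convenient sanity checks.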
Lessons
1. Entropy & Measures of Information (35 min)
2. Mutual Information & KL Divergence (35 min)
3. Source Coding & Data Compression (35 min)
4. Channel Capacity & Shannon's Channel Coding Theorem (35 min)
5. Rate-Distortion Theory & Lossy Compression (35 min)
6. Differential Entropy & Gaussian Channels (35 min)
7. Algorithmic Information Theory & Applications (35 min)