User:EvilPizza/Books/Information Theory

EEE 551

 * Information theory


 * Entropy, Relative Entropy, and Mutual Information
 * Entropy (information theory)
 * Joint entropy
 * Conditional entropy
 * Mutual information
 * Conditional mutual information
 * Kullback–Leibler divergence
 * Uniform boundedness
 * Data processing inequality
 * Markov chain
 * Fano's inequality
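The quantities listed in this chapter (entropy, KL divergence, mutual information) can be illustrated with a short sketch. This is a minimal illustration, not taken from any of the listed articles; the function names and the nested-list joint-pmf representation are my own choices.

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits; terms with p(x) = 0 contribute 0."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

def mutual_information(joint):
    """I(X;Y) from a joint pmf (nested list), computed as
    D(p(x,y) || p(x)p(y))."""
    px = [sum(row) for row in joint]                # marginal of X
    py = [sum(col) for col in zip(*joint)]          # marginal of Y
    return sum(
        joint[i][j] * math.log2(joint[i][j] / (px[i] * py[j]))
        for i in range(len(px))
        for j in range(len(py))
        if joint[i][j] > 0
    )
```

For example, a fair coin has `entropy([0.5, 0.5]) == 1.0` bit, and an independent joint pmf such as `[[0.25, 0.25], [0.25, 0.25]]` has zero mutual information, while the perfectly correlated `[[0.5, 0.0], [0.0, 0.5]]` has one full bit.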


 * Asymptotic Equipartition Property
 * Asymptotic equipartition property
 * Typical set
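The asymptotic equipartition property says that for an i.i.d. source, −(1/n) log₂ p(X₁,…,Xₙ) converges to the entropy H(X), which is what makes the typical set well defined. A quick empirical check (a sketch of my own, assuming a Bernoulli source; not from the listed articles):

```python
import math
import random

def binary_entropy(p):
    """H(p) for a Bernoulli(p) source, in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def empirical_aep(p_heads=0.3, n=100_000, seed=0):
    """Draw x^n i.i.d. Bernoulli(p_heads) and return -(1/n) log2 p(x^n).
    By the AEP this should be close to binary_entropy(p_heads)."""
    rng = random.Random(seed)
    log_p = 0.0
    for _ in range(n):
        heads = rng.random() < p_heads
        log_p += math.log2(p_heads if heads else 1 - p_heads)
    return -log_p / n
```

With `n = 100_000` the sample value lands within a few hundredths of a bit of `binary_entropy(0.3) ≈ 0.881`, so almost every sequence drawn is "typical" in exactly the sense the typical-set article formalizes.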


 * Data Compression
 * Data compression
 * Shannon's source coding theorem
 * Prefix code
 * Kraft's inequality
 * Shannon coding
 * Huffman coding
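The compression chapter's topics fit together in one small example: Huffman's algorithm builds an optimal prefix code, and the resulting codeword lengths satisfy Kraft's inequality with equality. A minimal sketch (my own implementation on top of `heapq`, not code from the listed articles):

```python
import heapq

def huffman_code(freqs):
    """Build a binary Huffman code for a {symbol: weight} dict.
    Returns {symbol: bitstring}. The integer tiebreaker keeps the
    heap from ever comparing two dicts when weights are equal."""
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tick = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)   # two lightest subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (w1 + w2, tick, merged))
        tick += 1
    return heap[0][2]
```

For the dyadic source `{'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}` this yields codeword lengths 1, 2, 3, 3 (exactly the ideal lengths −log₂ p, so the expected length meets the source coding theorem's entropy bound), and the Kraft sum Σ 2^(−lᵢ) equals 1.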


 * Channel Capacity
 * Channel capacity
 * Noisy-channel coding theorem
 * Joint source and channel coding
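As a concrete instance of channel capacity, the binary symmetric channel with crossover probability p has the closed form C = 1 − H(p), where H is the binary entropy function; the noisy-channel coding theorem says rates below this C are achievable with vanishing error. A small sketch (my own helper names, assuming the standard BSC formula):

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) in bits, with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover
    probability p: C = 1 - H(p) bits per channel use."""
    return 1.0 - binary_entropy(p)
```

A noiseless channel (p = 0) gives the full 1 bit per use, a pure-noise channel (p = 0.5) gives capacity 0, and p ≈ 0.11 gives roughly half a bit per use.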