# Advanced topics in information theory


Summer 2009

## Topics

• Rate distortion theory
• Network information theory
• Kolmogorov complexity
• Quantum information theory

## Sessions

### July 7: Organization. Recap of CS-683

• Basic organization, presentation assignments.
• Review of Information theory ideas
• Entropy, AEP, Compression and Capacity

The entropy of a random variable X is given by

$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x).$
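A minimal Python sketch of this formula, using base-2 logarithms so the entropy comes out in bits (the function name and the test distributions are illustrative, not from the course):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits.

    Terms with p(x) = 0 are skipped, following the convention 0 log 0 = 0.
    """
    return -sum(px * math.log2(px) for px in p if px > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))   # biased coin: about 0.469 bits
```

As expected, the uniform distribution maximizes the entropy: any bias away from (0.5, 0.5) lowers it.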

The capacity of a channel is defined by

$\mathcal{C} = \max_{p(x)} I(X; Y).$
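The maximization over input distributions can be made concrete for the binary symmetric channel, where $I(X;Y) = H(Y) - H(Y|X)$ and $H(Y|X)$ equals the binary entropy of the crossover probability. A sketch, with the grid search and function names being illustrative:

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mi_bsc(px1, eps):
    """I(X;Y) for a binary symmetric channel with crossover eps and P(X=1) = px1.

    Uses I(X;Y) = H(Y) - H(Y|X); for the BSC, H(Y|X) = h2(eps).
    """
    py1 = px1 * (1 - eps) + (1 - px1) * eps
    return h2(py1) - h2(eps)

eps = 0.1
# Maximize I(X;Y) over the input distribution by grid search
C = max(mi_bsc(k / 1000, eps) for k in range(1001))
print(C)             # about 0.531 bits per channel use
print(1 - h2(eps))   # closed form: the uniform input achieves C = 1 - h2(eps)
```

The grid search recovers the known closed form because the uniform input (P(X=1) = 0.5) makes the output uniform, maximizing H(Y).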

Compression and capacity determine the two fundamental information-theoretic limits on the rate R of data transmission: $H \leq R \leq \mathcal{C}.$

• A review of Gaussian channels and their capacities.
• Let us take this analysis one step further. How much do you lose when you cross these barriers?
• We saw one such situation: what happens when you try to transmit above capacity. By Fano's inequality,

$H(X|Y) \leq H(E) + P_e \log(|\mathcal{X}|-1)$
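The right-hand side of the inequality is easy to evaluate numerically, which shows how a large conditional entropy H(X|Y) forces the error probability up. A sketch (function names and the example values are illustrative):

```python
import math

def h2(p):
    """Binary entropy function in bits; H(E) for an error indicator E with P(E=1) = p."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def fano_bound(pe, alphabet_size):
    """Right-hand side of Fano's inequality, H(E) + Pe * log2(|X| - 1), in bits."""
    return h2(pe) + pe * math.log2(alphabet_size - 1)

# Example: with |X| = 4 and error probability Pe = 0.1, H(X|Y) can be at most:
print(fano_bound(0.1, 4))   # about 0.627 bits
```

Read in the other direction: if H(X|Y) exceeds this value, no estimator of X from Y can achieve error probability as low as 0.1.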

• Rate distortion: A theory for lossy data compression.