# Advanced topics in information theory


Summer 2009

## Topics

* Rate distortion theory
* Network information theory
* Kolmogorov complexity
* Quantum information theory

## Sessions

### July 7: Organization. Recap of CS-683

* Basic organization, presentation assignments.
* Review of information theory ideas
* Entropy, AEP (asymptotic equipartition property), compression, and capacity

The entropy of a random variable is given by $H(X) = -\sum_x p(x) \log p(x)$.
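The definition above can be sketched directly in a few lines of code; this is an illustration added to these notes, using base-2 logarithms so that entropy comes out in bits:

```python
import math

def entropy(pmf):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits.

    `pmf` is a sequence of probabilities; terms with p(x) = 0
    contribute nothing (0 log 0 is taken to be 0 by convention).
    """
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# A fair coin carries exactly one bit of entropy,
# and a uniform 4-outcome source carries two.
print(entropy([0.5, 0.5]))                # 1.0
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

Note the `if p > 0` guard: it implements the standard convention $0 \log 0 = 0$, which also keeps `math.log2` from raising on zero-probability outcomes.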

The capacity of a channel is defined by $C = \max_{p(x)} I(X; Y)$.
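As a concrete instance of this maximization (an example added here, not from the session notes): for the binary symmetric channel with crossover probability $p$, the maximum of $I(X;Y)$ is attained by the uniform input distribution, giving the closed form $C = 1 - H(p)$:

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p.

    The maximum of I(X; Y) over input distributions is attained at the
    uniform input, yielding C = 1 - H(p).
    """
    return 1.0 - h2(p)

print(bsc_capacity(0.0))  # 1.0: noiseless channel, one bit per use
print(bsc_capacity(0.5))  # 0.0: output is independent of input
```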

Compression and capacity determine the two fundamental information-theoretic limits of data transmission: $H \leq R \leq C$.

* A review of Gaussian channels and their capacities.
* Let us take this analysis one step further: how much do you lose when you cross these barriers?

### July 14: Rate distortion theory - I

### July 21: Rate distortion theory - II

### July 28: Network information theory - I

### Aug 04: Network information theory - II

### Aug 11: Wireless networks, cognitive radios

### Aug 18: Multiple access channels, network coding techniques
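The Gaussian-channel review above rests on one formula worth having at hand (added here as a sketch; the capacity of the AWGN channel with power constraint $P$ and noise variance $N$ is $C = \tfrac{1}{2}\log_2(1 + P/N)$ bits per channel use):

```python
import math

def awgn_capacity(snr):
    """Capacity of the additive white Gaussian noise channel.

    `snr` is the signal-to-noise ratio P/N (a plain ratio, not dB);
    the capacity is C = (1/2) * log2(1 + P/N) bits per channel use.
    """
    return 0.5 * math.log2(1.0 + snr)

print(awgn_capacity(3.0))  # 1.0: (1/2) log2(4) is exactly one bit per use
```

Note the logarithmic growth: quadrupling the SNR from 3 to 15 adds only one more bit per use, which is one way to quantify "how much you lose" when operating near these barriers.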