Advanced topics in information theory

Reading Group: Advanced Topics in Information Theory

Summer 2009

Participants

Topics

  • Rate distortion theory
  • Network information theory
  • Kolmogorov complexity
  • Quantum information theory

Sessions

July 7: Organization. Recap of CS-683

  • Basic organization, presentation assignments.
  • Review of information theory ideas
  • Entropy, AEP, Compression and Capacity

The entropy of a random variable X is given by

H(X) = -\sum_x p(x) \log p(x).
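
As a quick numerical check (a standard textbook illustration, not part of the original session notes): a fair coin has H(X) = -(0.5 \log_2 0.5 + 0.5 \log_2 0.5) = 1 bit, while a coin with bias p = 0.9 has H(X) \approx 0.47 bits, so it carries less than half a bit of surprise per toss.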

The capacity of a channel is defined by C = \max_{p(x)} I(X; Y).
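
For instance (again a standard illustration rather than material from the session), a binary symmetric channel with crossover probability p has C = 1 - H_b(p), where H_b is the binary entropy function; at p = 0.11 this gives C \approx 0.5 bits per channel use.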

Compression and capacity determine the two fundamental information-theoretic limits on the rate R of reliable data transmission: H \leq R \leq C.
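
The two quantities that bracket R can be computed directly. Below is a minimal Python sketch (an illustration added for this page, not part of the original notes; the helper names entropy and bsc_capacity are ours) that evaluates the entropy of a small source and the capacity of a binary symmetric channel.

    # Minimal sketch (illustrative): compute the two quantities that bracket
    # the transmission rate R in  H <= R <= C.
    import math

    def entropy(p):
        """Shannon entropy in bits of a discrete distribution given as a list of probabilities."""
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    def bsc_capacity(eps):
        """Capacity in bits per use of a binary symmetric channel with crossover probability eps."""
        return 1.0 - entropy([eps, 1.0 - eps])

    source = [0.5, 0.25, 0.125, 0.125]   # hypothetical source distribution
    print(entropy(source))               # 1.75 bits/symbol: lossless compression cannot beat this on average
    print(bsc_capacity(0.11))            # ~0.50 bits/use: reliable transmission is impossible above this rate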

  • A review of Gaussian channels and their capacities (a worked formula follows this list).
  • Let us take this analysis one step further: how much do you lose when you cross these barriers?
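
For the Gaussian case mentioned above, the standard result (stated here for reference, not derived in these notes) is that an average-power-limited Gaussian channel with signal power P and noise variance N has capacity C = \frac{1}{2} \log_2 (1 + P/N) bits per channel use; for example, a signal-to-noise ratio P/N = 15 gives C = 2 bits per use.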


July 14: Rate distortion theory - I

July 21: Rate distortion theory - II

July 28: Network information theory - I

Aug 04: Network information theory - II

Aug 11: Wireless networks, cognitive radios

Aug 18: Multiple access channels, network coding techniques
