
## Revision as of 17:37, 4 July 2009

# Reading Group: Advanced Topics in Information Theory

Summer 2009

## Participants

## Topics

- Rate distortion theory
- Network information theory
- Kolmogorov complexity
- Quantum information theory

## Sessions

### July 7: Organization. Recap of CS-683

- Basic organization, presentation assignments.

- Review of Information theory ideas

- Entropy, AEP, Compression and Capacity

Entropy of a random variable is given by

<math>H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x).</math>
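The definition <math>H(X) = -\sum_x p(x) \log_2 p(x)</math> is easy to check numerically. A minimal Python sketch (the `entropy` helper and the example distributions are ours, not from the session):

```python
import math

def entropy(pmf):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# A fair coin carries exactly 1 bit of uncertainty.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, hence lower entropy.
print(entropy([0.9, 0.1]))   # ≈ 0.47
```

Note the `p > 0` guard: terms with zero probability contribute nothing, by the convention 0 log 0 = 0.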

The capacity of a channel is defined by

<math>\mathcal{C} = \max_{p(x)} I(X; Y).</math>
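For a concrete channel, the maximization <math>\max_{p(x)} I(X;Y)</math> can be carried out directly. A small sketch, assuming a binary symmetric channel with crossover probability `eps` (the helper names are ours); a grid search over the input distribution recovers the known closed form <math>\mathcal{C} = 1 - H(\epsilon)</math>:

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_info(q, eps):
    """I(X;Y) for a BSC with crossover eps and input P(X=1) = q."""
    py1 = q * (1 - eps) + (1 - q) * eps   # P(Y=1)
    return h2(py1) - h2(eps)              # I(X;Y) = H(Y) - H(Y|X)

eps = 0.1
# Maximize over the input distribution by grid search; the optimum
# is the uniform input q = 0.5.
cap = max(bsc_mutual_info(q / 1000, eps) for q in range(1001))
print(cap)   # ≈ 0.531 = 1 - h2(0.1)
```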

Compression and capacity determine the two fundamental information-theoretic limits of data transmission,

<math>H \leq R \leq \mathcal{C}.</math>

- A review of Gaussian channels and their capacities.
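The Gaussian-channel capacity from that review, <math>\mathcal{C} = \tfrac{1}{2}\log_2(1 + P/N)</math> bits per channel use, can be evaluated directly. A minimal sketch (the `awgn_capacity` name and the SNR values are ours):

```python
import math

def awgn_capacity(snr):
    """Capacity of the discrete-time Gaussian channel,
    C = (1/2) log2(1 + P/N) bits per channel use,
    where snr = P/N is the signal-to-noise ratio."""
    return 0.5 * math.log2(1 + snr)

print(awgn_capacity(1.0))    # 0.5 bits/use at 0 dB SNR
print(awgn_capacity(15.0))   # 2.0 bits/use
```

The logarithmic growth in SNR is the point: doubling power buys ever-smaller capacity gains.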

- Let us take this analysis one step further. How much do you lose when you cross these limits?

- We saw one situation: what happens when you try to transmit above capacity. By Fano's inequality,

<math>H(X|Y) \leq H(E) + P_e \log(|\mathcal{X}|-1).</math>
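Fano's inequality can be verified on a small example. A numerical sketch (the joint distribution and helper names are ours; we use the MAP estimator, whose error probability enters the bound):

```python
import math

def fano_check(joint):
    """Return (H(X|Y), Fano bound H(Pe) + Pe*log2(|X|-1))
    for a joint pmf joint[x][y], using the MAP estimator of X from Y."""
    nx, ny = len(joint), len(joint[0])
    hxy = 0.0   # conditional entropy H(X|Y)
    pe = 0.0    # error probability of the MAP estimator
    for y in range(ny):
        py = sum(joint[x][y] for x in range(nx))
        if py == 0:
            continue
        cond = [joint[x][y] / py for x in range(nx)]
        hxy += py * -sum(p * math.log2(p) for p in cond if p > 0)
        pe += py * (1 - max(cond))   # MAP errs with prob 1 - max_x p(x|y)
    he = 0.0 if pe in (0.0, 1.0) else -pe * math.log2(pe) - (1 - pe) * math.log2(1 - pe)
    return hxy, he + pe * math.log2(nx - 1)

joint = [[0.4, 0.05, 0.05],
         [0.05, 0.2, 0.05],
         [0.05, 0.05, 0.1]]
hxy, bound = fano_check(joint)
print(hxy, bound)   # H(X|Y) stays below the Fano bound
```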

- Rate distortion: A theory for lossy data compression.
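As a preview of the next session, one rate-distortion function has a clean closed form: a Bernoulli(p) source under Hamming distortion has <math>R(D) = H(p) - H(D)</math> for <math>0 \leq D \leq \min(p, 1-p)</math>, and zero beyond. A minimal sketch (helper names are ours):

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion_bernoulli(p, d):
    """R(D) = H(p) - H(D) for a Bernoulli(p) source with Hamming
    distortion, valid for 0 <= D <= min(p, 1-p); R(D) = 0 beyond."""
    if d >= min(p, 1 - p):
        return 0.0
    return h2(p) - h2(d)

# D = 0 is lossless compression: the rate is the full entropy H(p).
print(rate_distortion_bernoulli(0.5, 0.0))   # 1.0
# Tolerating 10% bit errors lowers the required rate.
print(rate_distortion_bernoulli(0.5, 0.1))   # ≈ 0.531
```

Note how the lossy rate at D = 0.1 matches the BSC capacity at crossover 0.1; this source–channel duality recurs throughout rate distortion theory.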