Advanced topics in information theory


Revision as of 09:23, 22 August 2009 by Abubakr
Courtesy. Elements of Information theory, by Cover and Thomas.

Reading Group: Advanced Topics in Information Theory

Calendar: Summer 2009

Venue: LUMS School of Science & Engineering

Organizer: Abubakr Muhammad

This group meets every week at LUMS to discuss some advanced topics in information theory.

This is a continuation of our formal course at LUMS, CS-683: Information theory (offered most recently in Spring 2008). We hope to cover some advanced topics in information theory as well as its connections to other fundamental disciplines such as statistics, mathematics, physics and technology.


Participants:
  • Mubasher Beg
  • Shahida Jabeem
  • Qasim Maqbool
  • Muhammad Bilal
  • Muzammad Baig
  • Hassan Mohy-ud-Din
  • Zartash Uzmi
  • Shahab Baqai
  • Abubakr Muhammad


Topics include the following, but are not limited to:

  • Rate distortion theory
  • Network information theory
  • Kolmogorov complexity
  • Quantum information theory


July 7: Organization. Recap of CS-683

Information theory as extremal points of communication. Courtesy. Elements of Information theory, by Cover and Thomas.

  • Basic organization, presentation assignments.
  • Review of Information theory ideas
  • Entropy, AEP, Compression and Capacity

The entropy of a random variable X is given by

H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x).
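As a quick numerical check, the sum above can be evaluated directly from a probability mass function; the small sketch below is illustrative (the function name entropy is ours, not from the course notes):

```python
import math

def entropy(pmf):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits.
    Terms with p(x) = 0 are skipped, using the convention 0 log 0 = 0."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# A fair coin carries exactly one bit of uncertainty.
print(entropy([0.5, 0.5]))  # -> 1.0
```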

The capacity of a channel is defined by

\mathcal{C} = \max_{p(x)} I(X; Y).
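For specific channels the maximization over p(x) can be carried out in closed form. For the binary symmetric channel with crossover probability p, the uniform input distribution is optimal and the maximum works out to C = 1 - H(p), where H is the binary entropy function. A minimal sketch (function names are ours):

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p.
    The maximizing input distribution is uniform, giving C = 1 - H(p)."""
    return 1.0 - h2(p)

print(bsc_capacity(0.11))  # roughly 0.5 bits per channel use
```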

Compression and capacity determine the two fundamental information-theoretic limits of data transmission: H \leq R \leq \mathcal{C}.

  • A review of Gaussian channels and their capacities.
  • Let us take this analysis one step further. How much do you lose when you cross these barriers?
  • We saw one situation where we try to transmit above capacity. By Fano's inequality,

P_e \geq 1 - \frac{C}{R} - \frac{1}{nR}
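The bound is easy to evaluate numerically: for rates above capacity the error probability stays bounded away from zero as the block length n grows. A small illustrative sketch (the function name is ours):

```python
def fano_error_lower_bound(C, R, n):
    """Lower bound on block error probability when transmitting at rate R
    over a channel of capacity C with block length n, from Fano's inequality:
    P_e >= 1 - C/R - 1/(n R)."""
    return 1.0 - C / R - 1.0 / (n * R)

# Transmitting at twice capacity with long blocks: P_e cannot fall below ~1/2.
print(fano_error_lower_bound(C=0.5, R=1.0, n=1000))  # -> 0.499
```

Note that for R below C the bound becomes negative and therefore vacuous, which is consistent with the direct part of the coding theorem: below capacity, arbitrarily small error probability is achievable.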

  • Rate distortion: A theory for lossy data compression.


  • Elements of Information Theory by Cover and Thomas.

July 14: Rate distortion theory - I

  • Rate–distortion theory provides the theoretical foundation for lossy data compression.
  • We try to answer the following question: given an acceptable level of distortion, what is the minimal information that must be sent over a channel so that the source can be reconstructed (up to that level of distortion) at the receiver?
  • Quantization for a single random variable. Given the distribution of a random variable, what are the optimal quantization levels? The answer is Lloyd's algorithm (closely related to k-means clustering).
  • What about multiple random variables, treated at the same time? Even if the RVs are i.i.d., quantizing them in sequences can yield better performance (stated without proof).
  • Define the distortion D. When is a rate–distortion pair (R, D) achievable?
  • The rate distortion function R(D) is the minimum rate R such that (R, D) is achievable for a given D.
  • There is also an information-theoretic definition:

R^{(I)}(D) = \min_{p(\hat{x} \mid x) : \mathbb{E}[d(X, \hat{X})] \leq D} I(X; \hat{X}).

  • We can show that the two definitions are equivalent.
  • The proof closely follows the treatment of channel capacity.
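The Lloyd's algorithm mentioned above can be sketched in one dimension on empirical samples (this makes it essentially 1-D k-means): alternate between assigning each sample to its nearest codeword and moving each codeword to the centroid of its cell, which is the optimal reconstruction point under squared-error distortion. A minimal sketch, not code from the reading group:

```python
import random

def lloyd_quantizer(samples, k, iters=50):
    """Empirical Lloyd's algorithm for 1-D scalar quantization under
    squared-error distortion. Returns k reconstruction levels, sorted."""
    codebook = random.sample(samples, k)  # initialize from the data
    for _ in range(iters):
        # Step 1: nearest-codeword assignment (partition into cells).
        cells = [[] for _ in range(k)]
        for x in samples:
            j = min(range(k), key=lambda i: (x - codebook[i]) ** 2)
            cells[j].append(x)
        # Step 2: centroid condition -- each codeword becomes its cell's mean.
        codebook = [sum(c) / len(c) if c else codebook[i]
                    for i, c in enumerate(cells)]
    return sorted(codebook)

random.seed(0)
data = [random.gauss(0, 1) for _ in range(5000)]
print(lloyd_quantizer(data, 4))  # a 4-level quantizer for a standard Gaussian
```

For a standard Gaussian source the levels should come out roughly symmetric about zero, near the known optimal 4-level quantizer (approximately ±0.45 and ±1.51).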

July 21: Rate distortion theory - II

July 28: Network information theory - I

Aug 04: Network information theory - II

Aug 11: Wireless networks, cognitive radios

Aug 18: Multiple access channels, network coding techniques
