CS-683


CS-683: Information theory

Course outline (PDF): Media:CS683-outline-Spring-2009.pdf

Instructor: Dr. Abubakr Muhammad

Year: 2008-09

Office: 313 (PDC Bldg)

Email: abubakr@lums.edu.pk

Quarter: Spring

Office Hours: TBA

Category: Graduate

Course Code: CS 683

Course Title: Information theory

Units: 3

Course Description:

Introduction to information theory and coding; limits of data communication and data compression; channel and source coding; rate distortion theory.
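The central quantity behind all of these topics is Shannon entropy. As an informal illustration only (not part of the course materials), here is a short Python sketch computing the entropy H(X) = -Σ p(x) log₂ p(x) of a discrete distribution, in bits; the function name is hypothetical:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p log2 p, in bits.

    Terms with p = 0 contribute nothing (0 log 0 := 0 by convention).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty per toss:
print(entropy([0.5, 0.5]))        # 1.0
# A biased coin is more predictable, hence lower entropy:
print(entropy([0.9, 0.1]))        # about 0.469
```

The second value illustrates the compression limit studied in the course: a source emitting 9-to-1 biased coin flips can, in principle, be compressed to about 0.47 bits per symbol.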

Course Status:

Elective course for graduate students in CS and CmpE.

Pre-requisites:

Stochastic Systems-I, Random Processes, or an equivalent background in advanced probability

Goals

  • Introduce basic concepts in information theory.
  • Understand the fundamental limits of data communication and compression.
  • Provide a theoretical treatment of channel and source coding schemes.

Textbook:

Elements of Information Theory by Cover and Thomas

Lectures and Examinations:

  1. Two 75-minute lectures per week (Mondays and Wednesdays)
  2. One in-class midterm examination
  3. Comprehensive final examination
  4. Homework assignments and quizzes

Policies

  • Assignments are due by 5:00 pm on the due date.
  • No late submissions will be accepted without prior permission.
  • Re-grading may be requested after grades are reported, within the following time limits:
    1. HW and quizzes: 2 days
    2. Midterm and final: 3 days


TA for the course:

TBA

Grading Scheme:

Homework/Quiz 25%

Midterm 35%

Final 40%
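For illustration, the weighted total implied by this scheme can be computed as below; the function name and sample scores are hypothetical, only the weights come from the scheme above:

```python
def course_grade(hw_quiz, midterm, final):
    """Weighted course total: 25% HW/quiz, 35% midterm, 40% final.

    All inputs are percentage scores in [0, 100].
    """
    return 0.25 * hw_quiz + 0.35 * midterm + 0.40 * final

# e.g. 80 on homework/quizzes, 70 on the midterm, 90 on the final:
print(course_grade(80, 70, 90))   # 80.5
```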


Tentative course schedule and topics (Monday and Wednesday lectures, 75 min each):

Week 1 (March 23)
  Mon: Overview. Entropy as a measure of information [Ch. 1, 2]

Week 2 (March 30)
  Mon: Joint and conditional entropy. Mutual information [Ch. 2]
  Wed: Relative entropy, Jensen's inequality [Ch. 2]

Week 3 (April 6)
  Mon: Data processing inequality, 2nd law of thermodynamics [Ch. 2]
  Wed: Sufficient statistics, Fano's inequality [Ch. 2]

Week 4 (April 13)
  Mon: Asymptotic equipartition principle (AEP) [Ch. 3]
  Wed: AEP (contd.), entropy rates [Ch. 3, 4]

Week 5 (April 20)
  Mon: Markov chains, hidden Markov models [Ch. 4]
  Wed: Review

Week 6 (April 27)
  Mon: Midterm
  Wed: Intro to data compression, Kraft's inequality [Ch. 5]

Week 7 (May 4)
  Mon: Optimal codes, Huffman coding [Ch. 5]
  Wed: Arithmetic codes, optimality analysis [Ch. 5]

Week 8 (May 11)
  Mon: Channel models, channel capacity [Ch. 8]
  Wed: Channel coding theorem [Ch. 8]

Week 9 (May 18)
  Mon: Channel coding theorem (contd.) [Ch. 8]
  Wed: Differential entropy, AEP for continuous random variables [Ch. 9]

Week 10 (May 25)
  Mon: Capacity of Gaussian channels [Ch. 10]
  Wed: Capacity of Gaussian channels (contd.) [Ch. 10]

Week 11 (June 1)
  Review session