CSCI-B 659
Information Theory and Inference
Fall 2011


[Photo: Claude E. Shannon]


Instructor
Esfandiar Haghverdi
Lindley Hall 330C
E-mail: ehaghver@indiana.edu

Office Hours: By appointment


Lectures
Tuesday, Thursday
2:30-3:45 p.m.
Room: LH 008




Description: This is a first course in information theory. I will try to cover the basics of the subject, as outlined, for example, in the first ten chapters of the textbook below. However, my personal bias will be toward the connections between this material and applications in statistical inference. The connections between information theory and statistics were observed and developed back in the 1950s in the work of Kullback and Leibler, but there are several new applications of information theory in machine learning and other areas where inference plays a significant role. My basic plan for this course is to cover the basics of information theory first, since I think this way students will get almost all the material they need to tackle their own problems; however, I will try to find time to discuss applications in statistics. I would like to end this brief description with a quote from the father of information theory, Claude E. Shannon (from IRE Transactions on Information Theory, 1956, page 3).

Indeed, the hard core of information theory is, essentially, a branch of mathematics, a strictly deductive system. A thorough understanding of the mathematical foundation and its communication application is surely a prerequisite to other applications.
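
For orientation, the central quantities the course will develop can be stated briefly. The definitions below are the standard ones, following the textbook's notation (Cover and Thomas, Chapter 2), with base-2 logarithms:

\[
H(X) = -\sum_{x} p(x)\log p(x), \qquad
D(p\,\|\,q) = \sum_{x} p(x)\log\frac{p(x)}{q(x)}, \qquad
I(X;Y) = D\big(p(x,y)\,\|\,p(x)\,p(y)\big).
\]

Here \(H\) is the entropy, \(D\) is the relative entropy (the Kullback-Leibler divergence mentioned above), and \(I\) is the mutual information.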


Prerequisites:
Prior exposure to probability theory, some linear algebra, mathematical maturity, and a willingness to think.


Tentative Syllabus:

  • Information, inference, and learning
  • Probability review
  • Entropy, joint entropy, conditional entropy
  • Relative entropy, mutual information
  • Inequalities and their applications
  • The law of large numbers and asymptotic equipartitioning
  • Markov chains, entropy rate, the second law of thermodynamics
  • Data compression
  • Channel capacity
  • Large deviations
  • Parameter estimation
  • Iterative algorithms
  • Universal source coding


Textbooks:

  • Elements of Information Theory, second edition, Thomas M. Cover and Joy A. Thomas, Wiley-Interscience, 2006.

Handouts and Homework: All handouts and homework assignments will be posted on Oncourse.

Grading:

  • Homework assignments: 20%
    • There will be biweekly homework assignments.
    • Solutions must be written LEGIBLY.
    • You are encouraged to discuss the problem sets with others, but everyone must turn in their own write-up.
  • Miniprojects: 20%
    • These miniprojects might consist of problem sets to be completed, essays to be written, or reports on research articles. All miniprojects will be completed in groups of two to three students. In addition, all groups will share a common miniproject: writing an NSF-style grant proposal (with the appropriate format and all its parts) for a project on information theory or its applications.
  • Midterm: 20%
    • The midterm is scheduled for October 20, 2011.
  • Final take-home exam: 40%

Ground rules:

  • Collaborative work:
    One of the best ways to learn new material is to collaborate in groups. You may discuss the homework problems with your classmates, and in this way make the learning process more enjoyable. However, the homework you hand in must be your own work, written in your own words and with your own explanations. For projects completed in groups, the contributions of each member must be clearly explained on a separate page.
  • Here is the link to the Code of Student Conduct.