Information theory is the science of operations on data, such as compression, storage, and communication. It is among the few disciplines fortunate enough to have a precise date of birth: 1948, with the publication of Claude E. Shannon's paper “A Mathematical Theory of Communication”.
This course is about how to measure, represent, and communicate information effectively: why bits have become the universal currency for information exchange; how information theory bears on the design and operation of modern-day systems such as smartphones and the Internet; what entropy and mutual information are, and why they are so fundamental to data representation, communication, and inference; practical compression and error correction; and relations and applications to probability, statistics, machine learning, biological and artificial neural networks, genomics, quantum information, and blockchains.
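As a small taste of one of these quantities, the sketch below (our illustration, not part of the course materials) computes the Shannon entropy of a biased coin; the helper name `entropy` is our own:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of information per flip...
print(entropy([0.5, 0.5]))   # 1.0
# ...while a heavily biased coin carries noticeably less.
print(entropy([0.9, 0.1]))
```

The second value is below one bit: the more predictable the outcome, the less information each flip conveys, which is exactly why entropy sets the limit for data compression.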
Lectures will focus on intuition, applications, and the ways in which communication and representation of information manifest in various areas. The material will be explored in more depth and rigor via videos of additional lectures (by the course instructors), made available to those interested. Homework and projects will be tailored to students’ backgrounds, interests, and goals. There will also be a fun outreach component.
We encourage everyone, from the techies to the literature majors, to enroll. Guaranteed learning, fun, contribution to social good, and new friendships with people from departments and schools other than your own. We’ll assume you’ve been exposed to basic probability at the level of a first undergraduate course, or that you have the motivation to dedicate the first few weeks of the quarter to acquainting yourself (under our guidance) with this material.