Information theory is the science of operations on data, such as compression, storage, and communication. It is among the few disciplines fortunate to have a precise date of birth: 1948, with the publication of Claude E. Shannon's paper "A Mathematical Theory of Communication".
Our course will explore the basic concepts of information theory. It is a prerequisite for research in this area, and is highly recommended for students planning to delve into communications, data compression, and statistical signal processing. The intimate acquaintance we will gain with measures of information and uncertainty, such as entropy, relative entropy, and mutual information, will also be invaluable for students, researchers, and practitioners in fields ranging from neuroscience to machine learning. Students of statistics and probability are likewise encouraged to enroll; they will gain an appreciation for the interplay between information theory, combinatorics, probability, and statistics.
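As an informal taste of the measures mentioned above (a minimal sketch, not part of the course material), the entropy of a distribution, the relative entropy between two distributions, and the mutual information of a joint distribution can each be computed in a few lines; all function and variable names here are illustrative choices:

```python
import math

def entropy(p):
    # Shannon entropy in bits: H(p) = -sum_x p(x) log2 p(x)
    return -sum(px * math.log2(px) for px in p if px > 0)

def relative_entropy(p, q):
    # KL divergence D(p || q) in bits; assumes q(x) > 0 wherever p(x) > 0
    return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

def mutual_information(joint):
    # I(X;Y) = D( p(x,y) || p(x)p(y) ), from a joint p.m.f. given as a matrix
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# A fair coin carries exactly one bit of uncertainty
print(entropy([0.5, 0.5]))                                # 1.0
# A biased coin is less uncertain
print(entropy([0.9, 0.1]))
# Independent variables share zero information
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))   # 0.0
```

The fair-coin value of exactly one bit, and the vanishing mutual information of an independent pair, are the kind of sanity checks that recur throughout the course.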