What is information? In 1948, Claude Shannon published a paper giving a quantitative answer that founded the field of information theory, forming the basis for the science of information as well as for the design and understanding of modern communication systems. Through lectures, invited talks, and visits to various labs, we'll explore the elements of the science of information and their manifestation in various domains. We'll learn how information can be measured, represented, and communicated effectively; why bits have become the universal currency for information exchange; and how these ideas bear on the design and operation of modern systems such as smartphones, DVDs, and the Internet. We will also get a glimpse of ways in which elements of the science of information emerge in domains as varied as the neural codes of the brain, cryptographic codes for keeping secrets, genetics and the genetic code, quantum information, and entertainment.
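Shannon's measure of information, entropy, can be sketched in a few lines of Python. This is a minimal illustration, not course material; the coin distributions below are hypothetical examples.

```python
from math import log2

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin flip carries exactly 1 bit of information.
print(entropy([0.5, 0.5]))  # → 1.0

# A biased coin is more predictable, so each flip carries less than 1 bit.
print(entropy([0.9, 0.1]))
```

The more predictable a source, the less information each outcome conveys; this is the sense in which bits quantify information.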
Students will be guided through the creation of a podcast on one of the topics explored in the course, with assistance from Irena Fischer-Hwang, a PhD student in the electrical engineering department. Irena researches new methods for analyzing text-based and biological data, and shares her passion for scientific communication as a host and writer on the podcast Goggles Optional.