Curricular information is subject to change
- Understand the general relevance of Shannon's Information Theory in the Information Age.
- Review essential probability theory.
- Become acquainted with fundamental information-theoretic concepts such as entropy, mutual information, and relative entropy, along with Jensen's inequality, the log-sum inequality, the data-processing inequality, sufficient statistics, and Fano's inequality.
- Understand the centrality of the asymptotic equipartition property in Information Theory and the notion of the typical set.
- Understand the fundamentals of data compression: Kraft inequality, optimal codes, Huffman codes, Shannon-Fano-Elias coding.
- Understand the concept of channel capacity: symmetric channels, channel coding theorem, elementary channel coding techniques (repetition, Hamming codes).
- Acquire basic insight into the connection between Information Theory and statistics.
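To make the first outcomes concrete, here is a minimal sketch (not part of the module materials) computing entropy and mutual information for an assumed toy joint distribution:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability distribution given as a list."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Joint distribution of (X, Y) as a matrix p[x][y]; the values are an
# illustrative assumption: two independent fair coin flips.
joint = [[0.25, 0.25],
         [0.25, 0.25]]

px = [sum(row) for row in joint]            # marginal of X
py = [sum(col) for col in zip(*joint)]      # marginal of Y
hx = entropy(px)
hy = entropy(py)
hxy = entropy([p for row in joint for p in row])
mutual_info = hx + hy - hxy                 # I(X;Y) = H(X) + H(Y) - H(X,Y)

print(hx)           # 1.0
print(mutual_info)  # 0.0, since X and Y are independent here
```

For independent variables the mutual information vanishes, matching the identity I(X;Y) = H(X) + H(Y) − H(X,Y).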
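The data-compression outcome centres on Huffman codes; a compact sketch of the greedy merge algorithm (an illustration, not a course handout) is:

```python
import heapq
from collections import Counter

def huffman_codes(freqs):
    """Build a Huffman code (dict: symbol -> bitstring) from a frequency map.

    Assumes at least two distinct symbols. Heap entries are
    (weight, tiebreaker, partial codebook); the unique integer tiebreaker
    keeps tuple comparison from ever reaching the dicts.
    """
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)   # two lightest subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

text = "abracadabra"
codes = huffman_codes(Counter(text))
encoded = "".join(codes[ch] for ch in text)
print(len(encoded))  # 23 bits, versus 88 bits at 8 bits per character
```

Frequent symbols receive short codewords, and the resulting code is prefix-free, so the encoded bitstream decodes unambiguously.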
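The channel-coding outcome mentions Hamming codes; as an illustration, a sketch of the classic Hamming(7,4) code, which corrects any single bit flip:

```python
def hamming74_encode(d):
    """Encode 4 data bits (list of 0/1) into a 7-bit Hamming codeword.

    Bit positions are 1..7 with parity bits at positions 1, 2, 4 and
    data bits at positions 3, 5, 6, 7.
    """
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Correct up to one flipped bit; return (codeword, recovered data bits)."""
    c = c[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the error, 0 if none
    if syndrome:
        c[syndrome - 1] ^= 1
    return c, [c[2], c[4], c[5], c[6]]

codeword = hamming74_encode([1, 0, 1, 1])
corrupted = codeword[:]
corrupted[2] ^= 1                          # flip one bit in transit
fixed, data = hamming74_correct(corrupted)
print(data)  # [1, 0, 1, 1] - the original data bits are recovered
```

The three parity checks give a syndrome that directly names the position of a single error, in contrast to a repetition code, which needs far more redundancy for the same protection.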
| Student Effort Type | Hours |
| --- | --- |
| Lectures | 24 |
| Tutorial | 16 |
| Autonomous Student Learning | 80 |
| Total | 120 |
Working knowledge of basic calculus and algebra.
Learning Recommendations: Knowledge of probability theory would be helpful, although the course is self-contained in this respect.
| Description | Timing | Component Scale | Must Pass Component | % of Final Grade |
| --- | --- | --- | --- | --- |
| Assignment: Three sets of exercises and questions to be completed individually by the students. | Throughout the Trimester | Alternative linear conversion grade scale 40% | No | 75 |
| Multiple Choice Questionnaire: Midterm online examination. | Unspecified | Alternative linear conversion grade scale 40% | No | 25 |
| Resit In | Terminal Exam |
| --- | --- |
| Spring | Yes - 2 Hour |
- Individual feedback to students, post-assessment
Not yet recorded.