Module Number | INF3460 |
Module Title | Information Theory |
Type of Module | Elective Compulsory |
---|---|
ECTS | 6 |
Workload | 180 h total: 60 h contact time (4 SWS) + 120 h self-study |
Duration | 1 Semester |
Frequency | Winter semester |
Language of instruction | English |
Type of Exam | Regular exercises are provided (with solutions released afterwards), but there is no continuous assessment. Assessment is by a final written exam. The exam is closed book, but you may bring one double-sided A4 sheet of notes. |
Lecture type(s) | Lecture, Tutorial |
Content | As the name indicates, this is a theory course and is essentially mathematical, albeit heavily motivated by practical information-processing problems. There will be no programming assignments. We will work through some proofs in lectures, in particular proving the source and channel coding theorems. Some exercises and exam questions will require you to construct (simple) proofs, and you will need to be able to carry out a range of mathematical calculations by hand. |
Objectives | The overall goal of the course is to present the basics of the theory of information. Concretely, this means the source and channel coding theorems, which characterize how one can compress and transmit information. We will also meet some of the connections between information theory and machine learning. Depending on the pace I can run at, we may also cover some material on Kolmogorov complexity (information theory for finite sequences). You will learn about some practical compression schemes (Huffman and arithmetic coding) and simple block codes for channel coding, and you will meet some of the ideas of Bayesian inference. In doing all this you will learn some of the core notions of information theory, including how to mathematically define information in the first place, entropy, relative entropy, and the asymptotic equipartition property, which is related to the law of large numbers and which we will use to prove the key theorems of the course (these quantities are sketched briefly after this table). |
Allocation of credits / grading | Type of Class / Status / SWS / Credits / Type of Exam / Exam duration / Evaluation / Calculation of Module (%) |
Prerequisite for participation | INF2021 (BIOINFM2021) Mathematics for Computer Science 4: Stochastics (Stochastics) |
Lecturer / Other | Williamson |
Literature | Text: David MacKay, Information Theory, Inference, and Learning Algorithms. Freely available online at https://www.inference.org.uk/itprnn/book.pdf -- Prerequisite: Probability theory. You need to know (elementary) probability theory; by “elementary” I do not mean having only an elementary understanding, but rather the style of probability theory usually learned by engineers, i.e. without the measure-theoretic machinery. If you have passed the course 'INF2021 Stochastik' you should be fine. If you have not taken it, but reckon you know the material anyway, I provide a self-administered test to help you judge your degree of preparedness. |
Last offered | Winter semester 2022 |
Planned for | Summer semester 2025 |
Assigned Study Areas | BIOINFM2510, INFM2510, INFM3410, MDZINFM2510, MEINFM3210 |
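For orientation only, here is a minimal sketch of the central quantities named under Objectives (entropy, relative entropy, and the asymptotic equipartition property). The discrete random variable X, its probability mass function p, and the comparison distribution q are notational assumptions for this sketch, not part of the official module description.

```latex
% Entropy of a discrete random variable X with probability mass function p
H(X) = -\sum_{x} p(x) \log_2 p(x)

% Relative entropy (Kullback-Leibler divergence) between distributions p and q
D(p \,\|\, q) = \sum_{x} p(x) \log_2 \frac{p(x)}{q(x)}

% Asymptotic equipartition property: for X_1, \dots, X_n i.i.d. with pmf p,
% the per-symbol log-probability converges in probability to the entropy
-\frac{1}{n} \log_2 p(X_1, \dots, X_n) \;\to\; H(X) \quad \text{as } n \to \infty
```

These match the standard textbook definitions used, for example, in MacKay's book cited under Literature.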