Registration for this course is open until Monday, 11.05.2026 23:59.

News

Welcome & Getting ready

Written on 08.04.26 (last change on 08.04.26) by Isabel Valera

Dear all,

Welcome to the Machine Learning lecture of SoSe 26. As you may know, the lecture will start at 2:15 pm in the GHH on April 13, when I will also share the organizational details of the course.

In the meantime, we have uploaded some background materials for you to revise and get ready for the course content. You can also see the tentative schedule for the course at the bottom of the page. 

See you on Monday,

Prof. Isabel Valera

Machine Learning

In this course we will introduce the mathematical foundations of machine learning (ML). In particular, we will focus on understanding the theoretical aspects that have made ML successful in a wide range of applications, such as bioinformatics, computer vision, information retrieval, computational linguistics, and robotics.

The course gives a broad introduction to machine learning methods from a theoretical point of view. After the course, students should be able to analyze and solve learning problems. The lecture is based on the previous machine learning courses offered by Matthias Hein and Peter Ochs.

The tentative list of topics covers:

  • Probability theory
  • Maximum Likelihood/Maximum A Posteriori Estimators
  • Bayesian decision theory
  • Linear classification and regression
  • Model selection and evaluation
  • Convex Optimization
  • Kernel methods
  • Societal Impact of Machine Learning
  • Unsupervised learning (Clustering, Dimensionality Reduction)
  • Introduction to Deep Learning

Prerequisites: The course is targeted at students in computer science, bioinformatics, mathematics, and related areas with a mathematical background. Students should know linear algebra and have a good basic knowledge of statistics, for example from having taken Mathematics for Computer Scientists I and II (for linear algebra) and Statistics Lab or Mathematics for Computer Scientists III (for statistics). In addition, prior attendance of machine-learning-related courses, e.g., Elements of Machine Learning, is considered useful additional background.

Organizational Information

Lectures:

  • Mondays at 14:15h  -- Building E2.2, Günter-Hotz Lecture hall I (0.01) 
  • Thursdays at 12:15h -- Building E2.2, Günter-Hotz Lecture hall I (0.01) 

Lectures will start on April 13th! 

Tutorials (choose one):

  • Mondays at 12:15h -- Building E1.3, Room HS002
  • Mondays at 16:15h -- Building E1.3, Room HS001
  • Tuesdays at 10:15h -- Building E1.3, Room HS001
    • 05.05.26 and 07.07.26 at Building E2.2, Room GHH
  • Tuesdays at 12:15h -- Building E1.3, Room HS002
  • Wednesdays at 14:15h -- Building E1.3, Room HS002
    • 06.05.26 at Building E1.3, Room HS001
  • Wednesdays at 16:15h -- Building E1.3, Room HS002

Tutorials will start on April 20th!

Exams: 

  • Mid-term (admission) exam: 21.05.2026
  • Main exam: 04.08.2026
  • Re-exam: 28.09.2026

Tentative Schedule:

  Week  Date    Lecture Nr.  Title

1. Introduction to ML

   1    13-Apr   1           Introduction
   1    16-Apr   2           Bayesian Decision Theory
   2    20-Apr   3           Empirical Risk Minimization
   2    23-Apr   --          No Class

2. Linear Supervised ML

   3    27-Apr   4           Linear Regression I
   3    30-Apr   5           Linear Regression II
   4    04-May   6           Linear Classification
   4    07-May   7           Performance Measures I
   5    11-May   8           Performance Measures II
   5    14-May   Holiday     No Class
   6    18-May   --          Introduction to Project
   6    21-May   --          Mid-term Exam
   7    25-May   Holiday     No Class

3. Unsupervised Learning

   7    28-May   9           Clustering
   8    01-Jun  10           Dimensionality Reduction
   8    04-Jun   Holiday     No Class

4. SVM and kernel methods

   9    08-Jun  11           Convex Optimization I
   9    11-Jun  12           Convex Optimization II
  10    15-Jun  13           Linear SVM
  10    18-Jun  14           Intro to Kernels
  11    22-Jun  15           Learning with Kernels
  11    25-Jun   --          No Class / Maybe Q&A

5. Deep learning

  12    29-Jun  16           Deep Learning I - Feedforward Nets + BP
  12    02-Jul  17           Deep Learning II - CNNs
  13    06-Jul  18           Deep Learning III - Transformers
  13    09-Jul  19           Beyond Supervised DL, and Q&A

6. Societal impact

  14    13-Jul  20           Societal Impact
  14    16-Jul  21           Explainability & Interpretability

Bibliography

[Bishop] Bishop, C. M. Pattern Recognition and Machine Learning. Springer, 2006.

[DHS] Duda, R. O., Hart, P. E., and Stork, D. G. Pattern Classification (2nd edition). Wiley-Interscience, 2000.

[Boyd] Boyd, S. and Vandenberghe, L. Convex Optimization. Cambridge University Press, 2004.

[ESL] Hastie, T., Tibshirani, R., and Friedman, J. The Elements of Statistical Learning. Springer, 2009.

[Kernels] Smola, A. J. and Schölkopf, B. Learning with Kernels. GMD-Forschungszentrum Informationstechnik, 1998.

[DL1] Goodfellow, I., Courville, A., and Bengio, Y. Deep Learning. MIT Press, 2016.

[DL2] Prince, S. J. D. Understanding Deep Learning. MIT Press, 2023.

[DL3] Bishop, C. M. and Bishop, H. Deep Learning: Foundations and Concepts. Springer, 2024.
