News

Written on 09.04.25 by Armin Beck

Tutorials

Due to the public holiday on April 21st, there will be no tutorials on that day. Instead, we offer an additional tutorial on Tuesday, April 22nd, at 8:15 in lecture hall HS001.
We kindly ask all students from the Monday tutorials to attend one of the other available tutorial sessions during the week.

SMART Hour

Please note the following changes regarding the SMART Hour:

  • Due to the public holiday, the SMART Hour originally scheduled for April 18th will be moved to April 16th at 12:15 in the Günter-Hotz lecture hall.

  • On April 25th, the SMART Hour will take place in lecture hall HS II in building E.2.5.

Machine Learning

In this course we will introduce the foundations of machine learning (ML). In particular, we will focus on understanding the theoretical aspects of ML that have made it successful in a wide range of applications such as bioinformatics, computer vision, information retrieval, computational linguistics, robotics, etc.

The course gives a broad introduction to machine learning methods from a theoretical point of view. After the course, students should be able to solve and analyze learning problems.

The tentative list of topics covers the following (a brief illustrative example follows the list):

  • Probability theory
  • Maximum Likelihood/Maximum A Posteriori Estimators
  • Bayesian decision theory
  • Linear classification and regression
  • Model selection and evaluation
  • Convex Optimization
  • Kernel methods
  • Societal Impact of Machine Learning
  • Unsupervised learning (Clustering, Dimensionality Reduction)
  • Introduction to Deep Learning
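
To give a rough flavour of the first few topics on this list, here is a minimal, self-contained Python sketch. It is illustrative only, not official course material, and the data and numbers in it are made up: it computes the maximum likelihood estimates of a Gaussian's mean and variance from synthetic samples.

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=1000)        # synthetic i.i.d. Gaussian samples

# For N(mu, sigma^2), the maximum likelihood estimates are the sample mean
# and the sample variance with denominator n (the biased variant).
mu_hat = x.mean()
sigma2_hat = np.mean((x - mu_hat) ** 2)

print(f"ML estimate of mu:      {mu_hat:.3f}")       # close to 2.0
print(f"ML estimate of sigma^2: {sigma2_hat:.3f}")   # close to 1.5**2 = 2.25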

Prerequisites

The course is aimed at students in computer science, bioinformatics, mathematics, and related areas with a mathematical background. Students should know linear algebra and calculus and have a good knowledge of statistics, for example from having taken Mathematics for Computer Scientists I and II together with the Statistics Lab, or Mathematics for Computer Scientists III (for statistics). The course is accompanied by programming exercises in Python, so programming skills are expected (see the short sketch below). In addition, prior attendance of machine-learning-related courses, e.g., Elements of Machine Learning, is helpful but not required.
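
As a rough indication of the expected Python level (this is an assumption about the exercise style, not an official exercise), fitting an ordinary least-squares regression with NumPy should feel routine; all names and values below are invented for illustration.

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))                        # 200 samples, 3 features
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=200)          # noisy linear targets

X_aug = np.hstack([X, np.ones((200, 1))])            # append a bias column
w_hat, *_ = np.linalg.lstsq(X_aug, y, rcond=None)    # least-squares fit

print("estimated weights and bias:", np.round(w_hat, 3))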

Organizational Information

Lectures (start on April 10th):

  • Mondays: 14:15h at Building E2.2 -- Günter-Hotz Lecture hall I (0.01)
  • Thursdays: 12:15h at Building E2.2 -- Günter-Hotz Lecture hall I (0.01)

Tutorials (choose one of the sessions below; tutorials start on April 21st, and participation is not mandatory):

  • Mondays: 12:15h at Building E1.3 -- HS002
  • Mondays: 16:15h at Building E1.3 -- HS002
  • Tuesdays: 16:15h at Building E1.3 -- HS003
  • Wednesdays: 08:15h at Building E1.3 -- HS003
  • Wednesdays: 16:15h at Building E1.3 -- HS003

SMART Hour (Student Mentoring and Assistance in Real Time, start on April 18th):

  • Fridays: 14:15h at Building E2.2 -- Günter-Hotz Lecture hall I (0.01) (except for 18.04., 25.04. and 06.06.!)

Exams (no qualification requirement apart from registration in LSF):

  • Main exam: 21.07. from 9 am to 12 pm
  • Re-exam: 26.09. from 9 am to 12 pm

 

Bibliography

[Bach] Bach, F. Learning Theory from First Principles. Lecture notes, available online, 2024.

[Bishop] Bishop, C. M. Pattern Recognition and Machine Learning. Springer, 2006.

[DHS] Duda, R. O., Hart, P. E., and Stork, D. G. Pattern Classification (2nd edition). Wiley-Interscience, 2000.

[Boyd] Boyd, S., and Vandenberghe, L. Convex Optimization. Cambridge University Press, 2004.

[eML] Hastie, T., Tibshirani, R., and Friedman, J. The Elements of Statistical Learning. Springer, 2009.

[Kernels] Smola, A. J., and Schölkopf, B. Learning with Kernels. GMD-Forschungszentrum Informationstechnik, 1998.

[DL] Goodfellow, I., Bengio, Y., and Courville, A. Deep Learning. MIT Press, 2016.

[UDL] Prince, S. J. D. Understanding Deep Learning. MIT Press, 2023.

[Klenke] Klenke, A. Probability Theory: A Comprehensive Course. Springer, 2006.

[Kallenberg] Kallenberg, O. Foundations of Modern Probability. Springer, 2021.

 
