News: Currently, no news is available.
In this course we will introduce the foundations of machine learning (ML). In particular, we will focus on the theoretical aspects that have made ML successful in a wide range of applications, such as bioinformatics, computer vision, information retrieval, computational linguistics, and robotics.
The course gives a broad introduction to machine learning methods. After the course, students should be able to solve and analyze learning problems. The lecture is based on the previous machine learning courses offered by Matthias Hein and Peter Ochs.
The tentative list of topics covers:
- Probability theory
- Maximum likelihood and maximum a posteriori estimators
- Bayesian decision theory
- Linear classification and regression
- Model selection and evaluation
- Convex optimization
- Kernel methods
- Societal impact of machine learning
- Unsupervised learning (clustering, dimensionality reduction)
- Introduction to deep learning
Prerequisites: The course is targeted at students of computer science, bioinformatics, mathematics, and related areas with a mathematical background. Students should know linear algebra and have a good basic knowledge of statistics, for example from having taken Mathematics for Computer Scientists I and II (for linear algebra) and Statistics Lab or Mathematics for Computer Scientists III (for statistics). In addition, prior attendance of machine-learning-related courses, e.g., Elements of Machine Learning, is useful additional background.
For questions, please email us at: firstname.lastname@example.org
Lectures will start on April 19th (one week after the semester starts)!
In the lectures we will cover the main theoretical aspects of the course, and in the tutorials we will guide students through solving exercises. There will be both theoretical and coding exercises. The main programming language used in the course will be Python (with PyTorch). The exercises are intended to help students prepare for the exam; there won't be assignments for students to hand in. Details on the tentative planning of the course are given below.
Lectures*: Mondays at 14:15 and Wednesdays at 16:15 (on Zoom)
Tutorials*: (tentatively) Thursdays at 12:15 (on Zoom)
Lectures and tutorials will be recorded and made available on YouTube.
Privacy disclaimer: We have decided to use Zoom for both lectures and tutorials, as it provides superior functionality and usability for lecturing, including seamless live interaction and smooth whiteboard integration. We therefore encourage you to join the Zoom meeting with your real name and your camera on, and to ask questions verbally. However, this is of course voluntary: if you are concerned about privacy, you may enter the meeting under a nickname or pseudonym and use only the text chat for communication.
The course will be evaluated in a final exam (most likely in written form). There will be no intermediate evaluations and no admission requirements for the exam. However, over the semester we will use the tutorials to solve exercises jointly, in order to help students prepare for the exam.
[Bishop] Bishop, C. M. Pattern Recognition and Machine Learning. Springer, 2006.
[Boyd] Boyd, S. and Vandenberghe, L. Convex Optimization. Cambridge University Press, 2004.
[eML] Hastie, T., Tibshirani, R., and Friedman, J. The Elements of Statistical Learning. Springer, 2009.
[Kernels] Smola, A. J. and Schölkopf, B. Learning with Kernels. GMD-Forschungszentrum Informationstechnik, 1998.
[DL] Goodfellow, I., Bengio, Y., and Courville, A. Deep Learning. MIT Press, 2016.