News

Permanent Room for the Exercises

Written on 24.10.25 by Milan Bacchetta

The tutorials will take place in SR1 in E2.5 from next week on.

Provisional Exercise Room

Written on 22.10.25 by Milan Bacchetta

We have not found a permanent room yet, but SR15 in E1.3 is free, so we will have our first exercise class there tomorrow at 8:30. 

Information on the Exercises

Written on 21.10.25 by Milan Bacchetta

  • As the survey results indicate, the exercise class will take place in the 8:00–10:00 time block.
  • We will start at 8:30 and end at 10:00.
  • We are still looking for a suitable room and will update you as soon as one is confirmed.
  • There will be a class this Thursday with an introduction.
  • The first sheet has been uploaded and is due next Tuesday.

Lecture hall today

Written on 21.10.25 by Peter Ochs

Dear all, 

the lecture today takes place in HSIII, E2.5, as originally announced, and starts at 12:30.

Best,

NSO25-Team

Survey for the Date of the Exercises

Written on 16.10.25 by Milan Bacchetta

A survey has been set up in the Forum and will run for 5 days. We hope to find a time that works for everyone and will let you know on Tuesday which slot has been chosen.

First lecture on Thursday

Written on 13.10.25 by Peter Ochs

Dear students, 

the lecture starts on Thursday at 10:15 am in Lecture Hall III in E2.5.

Best, 

NSO25-Team


Non-smooth Analysis and Optimization in Data Science

In many Data Science applications, non-smooth features arise naturally, e.g., in the analysis of sparsity, low-rank structure, or low-complexity regularization in general. Constrained optimization problems can be reformulated as the minimization of non-smooth objective functions. Envelope functions, dual Lagrangian relaxations, and many other (algorithmically) important functions are inherently non-smooth. Non-smoothness is therefore not just an artifact; it is a feature that usually comes with a certain structure, and the key is to recognize structures that can be leveraged. Ignoring the non-smoothness leads to inaccurate solutions or slow algorithms.
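
For illustration (the notation here is ours, not necessarily that of the lecture): a constraint set C can be absorbed into the objective via its indicator function, which turns a constrained problem into an unconstrained non-smooth one, and sparsity is typically promoted by the non-smooth ℓ1 norm, as in

  \min_{x \in C} f(x) \;=\; \min_{x \in \mathbb{R}^n} f(x) + \delta_C(x),
  \qquad
  \delta_C(x) := \begin{cases} 0, & x \in C, \\ +\infty, & x \notin C, \end{cases}

  \min_{x \in \mathbb{R}^n} \tfrac{1}{2} \| A x - b \|_2^2 + \lambda \| x \|_1, \qquad \lambda > 0.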

The course consists of two parts: (1) basic and advanced concepts of non-smooth analysis and (2) algorithms for non-smooth non-convex optimization that are widely used in data science (Machine Learning, Computer Vision, Image Processing, Statistics, ...). The first milestone in Part 1 is the exploration of Fermat's Rule, a first-order optimality condition for non-smooth functions. This requires us to introduce generalized derivatives and allows us to recover commonly employed Lagrange multiplier rules and KKT conditions as special cases. In order to study algorithms for non-smooth non-convex optimization problems in Part 2, we first introduce the class of semi-algebraic and tame functions, which comprises virtually all practically relevant functions but excludes pathological ones. This leads to insights into recent breakthroughs in non-smooth and non-convex optimization, which yield strong convergence results to a stationary point based on the so-called Kurdyka-Łojasiewicz inequality (previewed informally at the end of this description). Besides many classical applications, this lecture presents several research topics in optimization for machine learning, for example,

  • the convergence of SGD, a popular stochastic gradient method, to stationary points,
  • the mathematical formulation of non-smooth automatic differentiation as implemented in PyTorch (see the sketch after this list),
  • and bilevel optimization, which can be seen as a formalization of parameter learning problems.
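
As a small illustration of the automatic-differentiation item above (a minimal sketch, assuming a standard PyTorch installation; the precise mathematical conventions are developed in the lecture): at kinks such as 0, PyTorch's backward pass returns one particular value compatible with a generalized derivative rather than failing.

  import torch

  # relu is non-smooth at 0; its (Clarke) subdifferential there is the interval [0, 1].
  x = torch.tensor(0.0, requires_grad=True)
  torch.relu(x).backward()
  print(x.grad)   # tensor(0.) -- PyTorch returns the particular element 0

  # |.| is non-smooth at 0; its subdifferential there is the interval [-1, 1].
  z = torch.tensor(0.0, requires_grad=True)
  torch.abs(z).backward()
  print(z.grad)   # tensor(0.) -- again one particular choice, namely 0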

After taking the course, students know the most relevant concepts of non-smooth analysis and optimization beyond convexity and are able to read and understand related scientific literature. They can assess the difficulty of optimization problems arising in applications in machine learning or computer vision and select an efficient algorithm accordingly. Moreover, they are able to perform the convergence analysis of non-convex optimization algorithms, and they develop basic skills in solving practical problems with Python.
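
For orientation, and stated informally here (with \partial f denoting a suitable generalized subdifferential, e.g., the limiting subdifferential), Fermat's Rule and the Kurdyka-Łojasiewicz (KL) property mentioned above take roughly the following forms:

  x^* \text{ is a local minimizer of } f \quad \Longrightarrow \quad 0 \in \partial f(x^*),

  \varphi'\big( f(x) - f(\bar{x}) \big) \, \operatorname{dist}\big( 0, \partial f(x) \big) \;\ge\; 1
  \quad \text{for all } x \text{ near } \bar{x} \text{ with } f(\bar{x}) < f(x) < f(\bar{x}) + \eta,

where \varphi : [0, \eta) \to [0, +\infty) is a continuous, concave desingularizing function with \varphi(0) = 0 and \varphi' > 0 on (0, \eta). Precise definitions and assumptions are part of the lecture.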

Prerequisites: Basics of Mathematics
(e.g. Linear Algebra 1-2, Analysis 1-3, Mathematics 1-3 for Computer Science)

Organizational Information

Lectures: 

  • Tuesday 12-14 c.t., HSIII in E2.5.
  • Thursday 10-12 c.t., HSIII in E2.5.

Oral Exams: 

  • Main exam: 05.02.2026
  • Re-exam: 09.04.2026.

Qualification requirements:

  • Submit solutions for the weekly exercise sheets.
  • Work in groups of 2 or 3 people.
  • Present your solutions (at least twice).
  • Obtain at least 60% of the total points.
  • Register for the exam in LSF.

Literature:

  • R. T. Rockafellar and R. J.-B. Wets: Variational Analysis. Springer-Verlag Berlin Heidelberg, 1998.
  • R. T. Rockafellar: Convex Analysis. Princeton University Press, 1970.
  • Y. Nesterov: Introductory Lectures on Convex Optimization - A Basic Course. Kluwer Academic Publishers, 2004.
  • D. P. Bertsekas: Convex Analysis and Optimization. Athena Scientific, 2003.
  • S. Boyd and L. Vandenberghe: Convex Optimization. Cambridge University Press, 2004.
  • H. H. Bauschke and P. L. Combettes: Convex Analysis and Monotone Operator Theory in Hilbert Spaces. Springer, 2011.

 
