Registration for this course is open until Sunday, 12.04.2026 23:59.

News

Currently, no news is available.

Continuous Optimization

Lecture: Tuesday 8-10 c.t. in HS003, E1.3
Lecture: Thursday 8-10 c.t. in HS003, E1.3
Date of First Lecture: Tuesday, April 7, 2026

Core Lecture for Mathematics and Computer Science
Language: English
Prerequisites: Basics of Mathematics
(e.g. Linear Algebra 1-2, Analysis 1-3, Mathematics 1-3 for Computer Science)


Description:

Optimization methods, or algorithms, seek a state of a system that is optimal with respect to an objective function. Depending on the properties of the objective function, different strategies may be used to find such an optimal state (or point). Knowing which classes of functions can be optimized, which optimization strategies are available, and what properties the optimal points have is crucial for appropriately modeling practical real-world problems as optimization problems. An exact model of the real world that cannot be solved is as useless as an overly simplistic model that can be solved easily.

This lecture introduces the basic algorithms, concepts, and analysis tools for several fundamental classes of continuous optimization problems. It covers generic Descent Methods, Gradient Descent, Newton's Method, Quasi-Newton Methods, the Gauss-Newton Method, the Conjugate Gradient Method, linear programming, and non-linear programming, as well as optimality conditions for unconstrained and constrained optimization problems. These may be considered the classical topics of continuous optimization. Some of these methods will be implemented and explored on practical problems in the tutorials.
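
To give a flavour of what such an implementation looks like, the following is a minimal, illustrative Python sketch of gradient descent with an Armijo backtracking line search. It is not course material; the function name and the quadratic test problem are chosen here purely for illustration.

    import numpy as np

    def gradient_descent(f, grad, x0, alpha0=1.0, beta=0.5, c=1e-4,
                         tol=1e-8, max_iter=1000):
        """Gradient descent with Armijo backtracking line search."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:  # approximate first-order optimality
                break
            t, fx = alpha0, f(x)
            # Shrink the step until the sufficient-decrease condition holds.
            while f(x - t * g) > fx - c * t * (g @ g):
                t *= beta
            x = x - t * g
        return x

    # Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
    # whose unique minimizer solves the linear system A x = b.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    print(gradient_descent(f, grad, x0=np.zeros(2)))  # approx. np.linalg.solve(A, b)

The Armijo condition guarantees a sufficient decrease of the objective in each step; the Newton-type methods covered in the lecture replace the negative gradient by a better-scaled search direction.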

After taking this course, students will have an overview of classical optimization methods and analysis tools for continuous optimization problems, which will allow them to model and solve practical problems. Moreover, in the tutorials they will gain hands-on experience in implementing and numerically solving such problems.


The table of contents of the lecture is as follows:

  1. Introduction
    • Mathematical Optimization
    • Applications
    • Performance of Numerical Methods
    • Existence of a Solution
    • The Class of Convex Optimization Problems

  2. Unconstrained Optimization
    • Optimality Conditions
    • Descent Methods
    • Gradient Descent Method
    • Conjugate Gradient Method
    • Newton’s Method
    • Quasi-Newton Methods
    • Gauss-Newton Method
    • Computing Derivatives

  3. Constrained Optimization
    • Motivation
    • Optimality Conditions for Constrained Problems
    • Method of Feasible Directions
    • Linear Programming
    • Quadratic Programming
    • Sequential Quadratic Programming (SQP)
    • Penalty and Barrier Methods
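
As an illustration of the penalty approach listed under constrained optimization above, here is a small, self-contained Python sketch of the quadratic penalty method. This is an illustrative toy example, not course code: the helper name quadratic_penalty, the use of SciPy's BFGS routine for the inner unconstrained solves, and the test problem are all assumptions made here.

    import numpy as np
    from scipy.optimize import minimize

    def quadratic_penalty(f, h, x0, mu0=1.0, rho=10.0, outer_iters=8):
        """Quadratic penalty method for: minimize f(x) subject to h(x) = 0.

        Solves a sequence of unconstrained problems
            minimize_x  f(x) + (mu/2) * ||h(x)||^2
        with increasing penalty parameter mu, warm-starting each solve.
        """
        x, mu = np.asarray(x0, dtype=float), mu0
        for _ in range(outer_iters):
            penalized = lambda x, mu=mu: f(x) + 0.5 * mu * np.sum(h(x) ** 2)
            x = minimize(penalized, x, method="BFGS").x
            mu *= rho
        return x

    # Example: minimize x1 + x2 subject to x1^2 + x2^2 = 1.
    # The solution -(1, 1)/sqrt(2) follows from the KKT conditions.
    f = lambda x: x[0] + x[1]
    h = lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 1.0])
    print(quadratic_penalty(f, h, x0=np.array([0.5, 0.0])))

Increasing mu gradually, rather than starting with a large value, keeps the inner subproblems well conditioned; this trade-off is one of the topics treated under Penalty and Barrier Methods.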


Examination:

Qualification conditions will be announced in the first lecture.

Depending on the number of participants, the exam will be either oral or written.


Literature:

The lecture is based on the following literature, which is available via the library:

  • J. Nocedal and S. J. Wright: Numerical Optimization. Springer, 2006.
  • F. Jarre and J. Stoer: Optimierung. Springer, 2004.
  • D. Bertsekas: Nonlinear Programming. Athena Scientific, 1999.
  • Y. Nesterov: Introductory Lectures on Convex Optimization: A Basic Course. Kluwer Academic Publishers, 2004.
  • R. T. Rockafellar and R. J.-B. Wets: Variational Analysis. Springer, 1998.

