Hyperparameter Optimisation for Machine Learning, Summer Term 2024


Overview

Hyperparameter Optimisation (HPO), also known as Hyperparameter Tuning, is the process of searching for a configuration of hyperparameters that yields good performance for a given model or learning algorithm. This process is challenging, not least because many machine learning algorithms are expensive to train and evaluate.
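To make this concrete, the following minimal sketch shows what evaluating a single hyperparameter configuration might look like, assuming scikit-learn is available; the dataset, the SVM model, and the chosen values of C and gamma are illustrative assumptions, not part of the lab specification.

    # A minimal sketch of evaluating one hyperparameter configuration.
    # Dataset and model are illustrative choices only.
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)

    # One configuration of the SVM's hyperparameters.
    config = {"C": 1.0, "gamma": 0.01}

    # HPO searches over many such configurations; each evaluation
    # requires training the model, which is what makes HPO expensive.
    score = cross_val_score(SVC(**config), X, y, cv=5).mean()
    print(f"Mean 5-fold CV accuracy for {config}: {score:.3f}")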

In this lab, you will get hands-on experience with different HPO methods. Starting with simple grid and random search, you will learn how to implement and analyse HPO methods from scratch. The lab then introduces more sophisticated model-free and model-based approaches, such as Bayesian optimisation. Throughout, you will focus on writing clean, robust, and efficient code.
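As a taste of the from-scratch exercises, here is a sketch of random search over a small hyperparameter space; the search space, the log-uniform sampling ranges, and the evaluate function are illustrative assumptions, not the official lab interface.

    # A sketch of random search implemented from scratch.
    # Search space and evaluation function are illustrative assumptions.
    import random

    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)

    def evaluate(config):
        """Mean 5-fold CV accuracy of an SVM under the given configuration."""
        return cross_val_score(SVC(**config), X, y, cv=5).mean()

    def random_search(n_trials, seed=0):
        rng = random.Random(seed)
        best_config, best_score = None, float("-inf")
        for _ in range(n_trials):
            # Sample C and gamma log-uniformly, a common choice for
            # scale-sensitive hyperparameters.
            config = {
                "C": 10 ** rng.uniform(-2, 2),
                "gamma": 10 ** rng.uniform(-4, 0),
            }
            score = evaluate(config)
            if score > best_score:
                best_config, best_score = config, score
        return best_config, best_score

    best_config, best_score = random_search(n_trials=20)
    print(f"Best configuration: {best_config} (score {best_score:.3f})")

Grid search differs only in how configurations are generated: instead of sampling, it enumerates the Cartesian product of a few fixed values per hyperparameter.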

The lab will be graded according to several assessed elements.

The lab is fully conducted in English.

Prerequisites

Good knowledge of the Python programming language. Familiarity with basic ML models (e.g. linear regression, decision trees, random forests, SVMs, etc.).

Registration

Registration for the lab is handled via the SuPra system.

Organisers

Prof. Dr. Holger H. Hoos, Chair Holder, Alexander von Humboldt Professor

E-mail: hh[at]aim[dot]rwth-aachen[dot]de
Phone: +49 241 80 21451

M.Sc. Nick Kocher, PhD Student

E-mail: kocher[at]aim[dot]rwth-aachen[dot]de

M.Sc. Hadar Shavit, PhD Student

E-mail: shavit[at]aim[dot]rwth-aachen[dot]de

M.Sc. Wadie Skaf, PhD Student

E-mail: skaf[at]aim[dot]rwth-aachen[dot]de

M.Sc. Thijs Snelleman, Research Programmer

E-mail: snelleman[at]aim[dot]rwth-aachen[dot]de