Universiteit Leiden


Lecture

Joint Lectures on Evolutionary Algorithms (JoLEA)

Date
Wednesday 15 June 2022
Location
Online

Talk 1: The impact of cost metrics on algorithm configuration
Finding the best hyperparameter configuration of an algorithm for a given optimization problem is an important task in evolutionary computation. When performing algorithm configuration, we need to choose the objective, i.e., the cost metric. Depending on the cost metric, configurators can perform differently and return different configurations. It is therefore interesting to investigate the impact of cost metrics on the configurators. In this talk, I will present our results for four different configuration approaches applied to a family of genetic algorithms on 25 diverse pseudo-Boolean optimization problems. The results suggest that even when we are interested in expected running time (ERT), it may be preferable to use an anytime performance measure (AUC) for the configuration task. We also observe that tuning for expected running time is much more sensitive to the budget that is allocated to the target algorithms.
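As an illustration, here is a minimal Python sketch of how the two cost metrics can be computed from best-so-far run data on a maximization problem. The data structures and function names are hypothetical, not the IOHprofiler implementation.

# ERT and AUC for a set of runs on one problem, assuming each run is a list
# of best-so-far fitness values per evaluation (maximization).

def ert(runs, target, budget):
    # Expected running time: evaluations spent across all runs, divided by
    # the number of runs that reached the target; unsuccessful runs count
    # their full budget.
    evals, successes = 0, 0
    for trace in runs:
        hit = next((i + 1 for i, f in enumerate(trace) if f >= target), None)
        if hit is not None:
            evals += hit
            successes += 1
        else:
            evals += budget
    return float("inf") if successes == 0 else evals / successes

def auc(runs, targets, budget):
    # Anytime performance: area under the ECDF of hit (run, target) pairs
    # over the evaluation budget; higher is better.
    hits = 0
    for trace in runs:
        for t in targets:
            first = next((i for i, f in enumerate(trace) if f >= t), None)
            if first is not None:
                hits += budget - first  # a hit target stays hit for the rest of the run
    return hits / (len(runs) * len(targets) * budget)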

Furong Ye is a postdoctoral researcher at the Leiden Institute of Advanced Computer Science (LIACS), where he also completed his PhD. His thesis is titled “Benchmarking discrete optimization heuristics: From building a sound experimental environment to algorithm configuration”. He is part of the core development team of IOHprofiler, with a focus on IOHexperimenter. His research interests are the empirical analysis of algorithm performance and (dynamic) algorithm configuration.

Talk 2: Benchmarking as a stepping stone to dynamic algorithm selection
When comparing optimization heuristics, we typically benchmark them on a predefined set of problems and check which one performs better. However, the benchmarking procedure gives us much more information than just which algorithm works best in a particular context. The performance profile of an algorithm tells us something about its underlying behavior, and this knowledge can potentially be exploited. By combining the performance trajectories of two different algorithms, we can obtain a theoretical dynamic algorithm that performs a single switch during the optimization and outperforms both of its component algorithms. In this talk, we will discuss how detailed benchmark data was used to show the potential of dynamic algorithm selection, and what further challenges remain.
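For intuition, here is a minimal Python sketch of how benchmark data for two algorithms might be combined into a single-switch estimate. The names and the additive-ERT approximation are illustrative assumptions, not the talk's exact method.

# Given ERT values of algorithms A and B at a grid of intermediate targets,
# estimate the cost of running A until a switch target and B afterwards.
def best_single_switch(ert_a, ert_b, targets, final_target):
    # ert_a, ert_b: dicts mapping target value -> ERT for A and B.
    best_target, best_cost = None, ert_a[final_target]  # baseline: never switch
    for t in targets:
        # Time for A to reach t, plus B's remaining time from t to the final
        # target, approximated as ERT_B(final) - ERT_B(t).
        combined = ert_a[t] + max(ert_b[final_target] - ert_b[t], 0)
        if combined < best_cost:
            best_target, best_cost = t, combined
    return best_target, best_cost

If the best estimated cost is lower than both ert_a[final_target] and ert_b[final_target], the benchmark data suggests that a single well-timed switch would outperform either algorithm on its own.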

Diederick Vermetten is a PhD candidate at the Leiden Institute of Advanced Computer Science (LIACS). His research interests include benchmarking of optimization heuristics, dynamic algorithm selection and configuration, and hyperparameter optimization. He is part of the core development team of IOHprofiler, with a focus on IOHanalyzer.

The series is organized by a team from four universities, initiated by prof. dr. Thomas Bäck (Leiden University), prof. dr. Peter A.N. Bosman (CWI), prof. dr. Gusz Eiben (Vrije Universiteit Amsterdam), and dr. ir. Dirk Thierens (Utrecht University).
