One promising approach constructs explicit regression models to describe the dependence of target algorithm performance on parameter settings;

however, this approach has so far been limited to the optimization of few numerical algorithm parameters on single instances.

In this paper, we extend this paradigm for the first time to general algorithm configuration problems, allowing many categorical parameters and optimization for sets of instances.

We experimentally validate our new algorithm configuration procedure by optimizing a local search and a tree search solver for the propositional satisfiability problem (SAT)

as well as the commercial mixed integer programming (MIP) solver CPLEX.

In these experiments, our procedure yielded state-of-the-art performance, and in many cases outperformed the previous best configuration approach.

Limitations of SMBO:

These limitations include a focus on deterministic target algorithms;

use of costly initial experimental designs;

reliance on computationally expensive models;

and the assumption that all target algorithm runs have the same execution costs.


Three limitations prevent its use for general algorithm configuration tasks:

(1) it only supports numerical parameters;

(2) it only optimizes target algorithm performance for single instances; and

(3) it lacks a mechanism for terminating poorly performing target algorithm runs early.


The main contribution of this paper is to remove the first two of these SMBO limitations, and thus to make SMBO applicable to general algorithm configuration problems with many categorical parameters and sets of benchmark instances.
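To make this concrete, a general configuration problem of the kind targeted here pairs a parameter space containing categorical (and possibly numerical) parameters with a whole set of benchmark instances. The snippet below is only an illustrative way of writing such a scenario down; all parameter names, values, and instance paths are invented for the example.

```python
# Hypothetical configuration scenario with categorical and numerical
# parameters plus a set of benchmark instances (all names are made up).
config_space = {
    "branching_rule": ["most-constrained", "random", "activity"],  # categorical
    "restart_strategy": ["luby", "geometric", "none"],             # categorical
    "clause_decay": [0.9, 0.95, 0.99, 0.999],                      # numerical (discretized)
}
instances = [f"benchmarks/instance_{i}.cnf" for i in range(100)]   # instance set
```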


The two new procedures are the simple model-free Random Online Aggressive Racing (ROAR) procedure and the more sophisticated Sequential Model-based Algorithm Configuration (SMAC) method.

These methods do not yet implement an early termination criterion for poorly performing target algorithm runs;

thus, for now we expect them to perform poorly on some configuration scenarios with large captimes.

We studied the components of this automated procedure and demonstrated that its intensification mechanism mattered most.

SPO: Sequential Parameter Optimization

We also eliminated the need for a costly initial design by interleaving randomly selected parameters throughout the optimization process.

The resulting time-bounded SPO variant, TB-SPO, is the first practical SMBO method for parameter optimization given a user-specified time budget.
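One way to picture this interleaving: rather than spending time on a large up-front design, configurations drawn uniformly at random are simply mixed into the stream of configurations the optimizer evaluates, so random exploration happens throughout the run. The sketch below illustrates that idea with a simple alternation rule; the rule itself is an assumption made for illustration, not the exact mechanism used by TB-SPO.

```python
import random

def interleave_random(model_suggestions, config_space, rng=random):
    """Alternate model-suggested configurations with configurations drawn
    uniformly at random, so exploration is spread over the whole run
    instead of being concentrated in an initial design.
    (Illustrative alternation rule only.)"""
    for suggestion in model_suggestions:
        yield suggestion  # exploit the model's suggestion
        yield {name: rng.choice(values)  # explore uniformly at random
               for name, values in config_space.items()}
```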

More formally, as an instantiation of the SMBO framework, ROAR is completely specified by the four components Initialize, FitModel, SelectConfigurations, and Intensify. Initialize performs a single run with the target algorithm's default parameter configuration (or a random configuration if no default is available) on an instance selected uniformly at random.
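To make the four-component structure concrete, the following sketch lays out an SMBO-style loop around Initialize, FitModel, SelectConfigurations, and Intensify. It is a toy illustration under several assumptions: the target algorithm is replaced by a fake cost function, FitModel and SelectConfigurations are given the trivial model-free behaviour suggested by ROAR's description (no model, uniform random proposals), and Intensify is reduced to a single head-to-head comparison, which is far simpler than the actual intensification mechanism.

```python
import random
import time

def run_target(config, instance, seed):
    """Stand-in for one run of the target algorithm; returns a fake cost."""
    rng = random.Random(hash((frozenset(config.items()), instance, seed)))
    return rng.uniform(0.1, 10.0)

def initialize(default_config, config_space, instances):
    """Single run of the default configuration (or a random one if no
    default is given) on an instance selected uniformly at random."""
    if default_config is None:
        default_config = {n: random.choice(v) for n, v in config_space.items()}
    inst = random.choice(instances)
    return default_config, [(default_config, inst, run_target(default_config, inst, 0))]

def fit_model(run_history):
    """Model-free variant: no model is fit."""
    return None

def select_configurations(model, config_space):
    """Model-free variant: propose a configuration uniformly at random."""
    return [{name: random.choice(values) for name, values in config_space.items()}]

def intensify(challengers, incumbent, run_history, instances):
    """Crude stand-in: compare each challenger to the incumbent on one
    randomly chosen instance and keep the better of the two."""
    for challenger in challengers:
        inst = random.choice(instances)
        inc_cost = run_target(incumbent, inst, 1)
        chal_cost = run_target(challenger, inst, 1)
        run_history += [(incumbent, inst, inc_cost), (challenger, inst, chal_cost)]
        if chal_cost < inc_cost:
            incumbent = challenger
    return incumbent

def smbo(default_config, config_space, instances, time_budget):
    """Skeleton of the SMBO loop: Initialize, then repeat FitModel,
    SelectConfigurations, and Intensify until the time budget is spent."""
    incumbent, run_history = initialize(default_config, config_space, instances)
    start = time.time()
    while time.time() - start < time_budget:
        model = fit_model(run_history)
        challengers = select_configurations(model, config_space)
        incumbent = intensify(challengers, incumbent, run_history, instances)
    return incumbent

# Example usage with an invented configuration space and instance set:
space = {"heuristic": ["a", "b", "c"], "restarts": [10, 100, 1000]}
best = smbo({"heuristic": "a", "restarts": 100}, space, list(range(20)), time_budget=0.1)
```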