Package 'mlrintermbo'

Title: Model-Based Optimization for 'mlr3' Through 'mlrMBO'
Description: The 'mlrMBO' package cannot ordinarily be used for optimization within 'mlr3' because of incompatibilities between their respective class systems. 'mlrintermbo' offers a compatibility interface that provides 'mlrMBO' as an 'mlr3tuning' 'Tuner' object for tuning machine learning algorithms within 'mlr3', as well as a 'bbotk' 'Optimizer' object for optimization of general objective functions using the 'bbotk' black-box optimization framework. The control parameters of 'mlrMBO' are faithfully reproduced as a 'paradox' 'ParamSet'.
Authors: Martin Binder [aut, cre]
Maintainer: Martin Binder <[email protected]>
License: LGPL-3
Version: 0.5.1-2
Built: 2024-11-23 06:05:45 UTC
Source: https://github.com/mb706/mlrintermbo

Help Index


mlrintermbo: Model-Based Optimization for 'mlr3' Through 'mlrMBO'

Description

The 'mlrMBO' package cannot ordinarily be used for optimization within 'mlr3' because of incompatibilities between their respective class systems. 'mlrintermbo' offers a compatibility interface that provides 'mlrMBO' as an 'mlr3tuning' 'Tuner' object for tuning machine learning algorithms within 'mlr3', as well as a 'bbotk' 'Optimizer' object for optimization of general objective functions using the 'bbotk' black-box optimization framework. The control parameters of 'mlrMBO' are faithfully reproduced as a 'paradox' 'ParamSet'.

Author(s)

Maintainer: Martin Binder [email protected]

See Also

Useful links:

  • https://github.com/mb706/mlrintermbo


Create Surrogate Learner

Description

Creates the default mlrMBO surrogate learners as an mlr3::Learner.

This imitates the behaviour of mlrMBO when no learner argument is given to mbo() / initSMBO().

Usage

makeMlr3Surrogate(
  is.numeric = TRUE,
  is.noisy = TRUE,
  has.dependencies = !is.numeric
)

Arguments

is.numeric

(logical(1))
Whether only numeric parameters are present. If so, a LearnerRegrKM (DiceKriging package) is constructed. Otherwise a LearnerRegrRanger (random forest from the ranger package) is constructed. Default is TRUE.

is.noisy

(logical(1))
Whether to use nugget estimation. Only considered when is.numeric is TRUE. Default is TRUE.

has.dependencies

(logical(1))
Whether to anticipate missing values in the surrogate model design. This adds out-of-range imputation to the model. If more elaborate imputation is desired, set this to FALSE and perform custom imputation using mlr3pipelines instead. Default is !is.numeric.

Examples

# DiceKriging Learner:
makeMlr3Surrogate()

# mlr3pipelines Graph: imputation %>>% 'ranger' (random forest):
makeMlr3Surrogate(is.numeric = FALSE)

# just the 'ranger' Learner:
makeMlr3Surrogate(is.numeric = FALSE, has.dependencies = FALSE)
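As mentioned under has.dependencies, a surrogate with custom imputation can also be assembled by hand using mlr3pipelines. A minimal sketch, assuming the mlr3learners and mlr3pipelines packages are installed (the choice of imputation PipeOp is illustrative):

```r
library("mlr3")
library("mlr3learners")   # provides lrn("regr.ranger")
library("mlr3pipelines")

# out-of-range imputation followed by a ranger random forest,
# wrapped into a single mlr3::Learner
surrogate <- as_learner(po("imputeoor") %>>% lrn("regr.ranger"))
```

The resulting Learner can then be passed to the tuner / optimizer in place of the default surrogate constructed by makeMlr3Surrogate().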

OptimInstanceMultiCrit Class

Description

bbotk's OptimInstanceMultiCrit class. Re-exported since bbotk will change the name.


OptimInstanceSingleCrit Class

Description

bbotk's OptimInstanceSingleCrit class. Re-exported since bbotk will change the name.


Optimizer Class

Description

bbotk's Optimizer class. Re-exported since bbotk will change the name.


Tuner and Optimizer using mlrMBO

Description

mlrMBO tuning object.

mlrMBO must not be loaded directly into R when using mlr3, because of the class-system incompatibilities described above. TunerInterMBO and OptimizerInterMBO take care that this does not happen.

Format

R6::R6Class object inheriting from Tuner (mlr3tuning package) or Optimizer (bbotk package).

Construction

To optimize an objective (using the bbotk package), use the OptimizerInterMBO object, ideally obtained through the bbotk::opt() function: opt("intermbo").

To tune a machine learning method represented by an mlr3::Learner object, use the TunerInterMBO object, ideally obtained through mlr3tuning::tnr(): tnr("intermbo").

Both have the following optional arguments:

  • n.objectives :: integer(1)
    Number of objectives to optimize. Default is 1 for ordinary ("single objective") optimization, but can be greater than 1 for multi-objective optimization. See mlrMBO::setMBOControlMultiObj() for details on multi-objective optimization in mlrMBO.

  • on.surrogate.error :: character(1)
    What to do when fitting or predicting the surrogate model fails. One of "stop" (throw an error), "warn", and "quiet" (ignore and propose a random point).
    The surrogate model may fail sometimes, for example when the size of the initial design is too small or when the objective function returns constant values. In practice this is usually safe to ignore for single iterations (i.e. "warn" or "quiet"), but be aware that MBO effectively degrades to random search when the surrogate model fails for all iterations.

Configuration Parameters

The ParamSet of the optimizer / tuner reflects the possible configuration options of mlrMBO. The control parameters map directly to the arguments of mlrMBO::makeMBOControl(), mlrMBO::setMBOControlInfill(), mlrMBO::setMBOControlMultiObj(), mlrMBO::setMBOControlMultiPoint(), and mlrMBO::setMBOControlTermination().
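The available parameters can be listed from the ParamSet itself. A short sketch, assuming the bbotk package is attached (final.method is one such parameter; the others follow the corresponding mlrMBO argument names):

```r
library("bbotk")

optser <- opt("intermbo")

# list the mlrMBO control parameters exposed by the ParamSet
head(optser$param_set$ids())

# set one of them, e.g. the method used to determine the final result
optser$param_set$values$final.method <- "best.predicted"
```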

Examples

library("paradox")
library("bbotk")

# silly example function: minimize x^2 for -1 < x < 1
domain <- ps(x = p_dbl(lower = -1, upper = 1))
codomain <- ps(y = p_dbl(tags = "minimize"))
objective <- ObjectiveRFun$new(function(xs) list(y = xs$x^2), domain, codomain)

# initialize instance
instance <- OptimInstanceSingleCrit$new(objective, domain, trm("evals", n_evals = 6))

# use intermbo optimizer
#
# Also warn on surrogate model errors
# (this is the default and can be omitted)
optser <- opt("intermbo", on.surrogate.error = "warn")

# optimizer has hyperparameters from mlrMBO
optser$param_set$values$final.method <- "best.predicted"

# optimization happens here.
optser$optimize(instance)

instance$result
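The same machinery can be used for tuning a learner via mlr3tuning, as described under Construction. A minimal sketch, assuming the mlr3, mlr3tuning, and rpart packages are installed (the task, learner, search range, and budget are illustrative):

```r
library("mlr3")
library("mlr3tuning")

# tune rpart's complexity parameter with MBO on a toy task
learner <- lrn("classif.rpart", cp = to_tune(1e-4, 0.1, logscale = TRUE))

instance <- tune(
  tuner = tnr("intermbo"),
  task = tsk("sonar"),
  learner = learner,
  resampling = rsmp("holdout"),
  measures = msr("classif.ce"),
  term_evals = 10
)

instance$result
```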

TuningInstanceMultiCrit Class

Description

mlr3tuning's TuningInstanceMultiCrit class. Re-exported since mlr3tuning will change the name.


TuningInstanceSingleCrit Class

Description

mlr3tuning's TuningInstanceSingleCrit class. Re-exported since mlr3tuning will change the name.