

Search algorithms decide which configurations Tune evaluates next. Random search is the default; for higher-dimensional or expensive search spaces, prefer a model-based algorithm.

Random search (default)

Random search runs when no search_alg is set; num_samples controls how many configurations are sampled:
tune.TuneConfig(num_samples=50)

Grid search

To exhaustively evaluate every combination instead, use tune.grid_search in the search space:
search_space = {
    "lr": tune.grid_search([1e-3, 1e-4, 1e-5]),
    "wd": tune.grid_search([0.0, 1e-4]),
}
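A minimal sketch of how a search space like this plugs into a run, assuming a toy train function that reports a loss metric (exact reporting API names vary slightly across Ray versions):

```python
from ray import tune


def train(config):
    # Hypothetical objective: derive a loss from the sampled values.
    loss = config["lr"] + config["wd"]
    tune.report({"loss": loss})


search_space = {
    "lr": tune.grid_search([1e-3, 1e-4, 1e-5]),
    "wd": tune.grid_search([0.0, 1e-4]),
}

tuner = tune.Tuner(
    train,
    param_space=search_space,
    tune_config=tune.TuneConfig(metric="loss", mode="min"),
)
results = tuner.fit()
print(results.get_best_result().config)
```

With grid_search entries, every combination (here 3 × 2 = 6) is evaluated.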

Optuna

Tree-structured Parzen Estimator and other Optuna samplers.
from ray.tune.search.optuna import OptunaSearch

OptunaSearch(metric="loss", mode="min")
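A hedged sketch of wiring OptunaSearch into a run; the train function and its quadratic loss are placeholders. When metric and mode are set on TuneConfig, OptunaSearch() can be constructed without arguments:

```python
from ray import tune
from ray.tune.search.optuna import OptunaSearch


def train(config):
    # Placeholder objective with a minimum near lr = 1e-3.
    tune.report({"loss": (config["lr"] - 1e-3) ** 2})


tuner = tune.Tuner(
    train,
    param_space={"lr": tune.loguniform(1e-5, 1e-1)},
    tune_config=tune.TuneConfig(
        search_alg=OptunaSearch(),
        metric="loss",
        mode="min",
        num_samples=50,
    ),
)
tuner.fit()
```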

Ax / BoTorch

Bayesian optimization with Gaussian processes.
from ray.tune.search.ax import AxSearch

AxSearch(metric="loss", mode="min")

BayesOpt

Classical Bayesian optimization for continuous spaces.
from ray.tune.search.bayesopt import BayesOptSearch

BayesOptSearch(metric="loss", mode="min", random_search_steps=10)
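Because BayesOptSearch only handles continuous parameters, the search space should use continuous distributions such as tune.uniform. A sketch with a placeholder objective:

```python
from ray import tune
from ray.tune.search.bayesopt import BayesOptSearch


def train(config):
    # Placeholder quadratic objective with a minimum at x = 0.3.
    tune.report({"loss": (config["x"] - 0.3) ** 2})


tuner = tune.Tuner(
    train,
    param_space={"x": tune.uniform(0.0, 1.0)},
    tune_config=tune.TuneConfig(
        # random_search_steps seeds the Gaussian process before it
        # starts proposing points itself.
        search_alg=BayesOptSearch(random_search_steps=10),
        metric="loss",
        mode="min",
        num_samples=30,
    ),
)
tuner.fit()
```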

BOHB

Combines Bayesian optimization with HyperBand-style early stopping.
from ray.tune.search.bohb import TuneBOHB
from ray.tune.schedulers import HyperBandForBOHB

TuneBOHB()
HyperBandForBOHB(time_attr="training_iteration", max_t=100)
TuneBOHB must be paired with the HyperBandForBOHB scheduler; other schedulers are not supported.
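A sketch of the full pairing, passing the searcher and scheduler together; the iterative train loop is a placeholder, and TuneBOHB additionally requires the ConfigSpace package to be installed:

```python
from ray import tune
from ray.tune.search.bohb import TuneBOHB
from ray.tune.schedulers import HyperBandForBOHB


def train(config):
    # Placeholder training loop reporting loss once per iteration,
    # so HyperBand can stop poor trials early.
    for step in range(100):
        tune.report({"loss": 1.0 / (step + 1) + config["lr"]})


tuner = tune.Tuner(
    train,
    param_space={"lr": tune.loguniform(1e-5, 1e-1)},
    tune_config=tune.TuneConfig(
        search_alg=TuneBOHB(),
        scheduler=HyperBandForBOHB(time_attr="training_iteration", max_t=100),
        metric="loss",
        mode="min",
        num_samples=50,
    ),
)
tuner.fit()
```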

HyperOpt

Tree-structured Parzen Estimators with support for conditional search spaces.
from ray.tune.search.hyperopt import HyperOptSearch

HyperOptSearch(metric="loss", mode="min")
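Conditional spaces are defined with hyperopt's own primitives and passed via the space argument. A sketch where the names and ranges are illustrative:

```python
from hyperopt import hp
from ray.tune.search.hyperopt import HyperOptSearch

# Each optimizer choice carries its own conditional lr range;
# only the branch that hp.choice selects is sampled.
space = {
    "opt": hp.choice(
        "opt",
        [
            {"name": "sgd", "lr": hp.loguniform("sgd_lr", -10, -2)},
            {"name": "adam", "lr": hp.loguniform("adam_lr", -10, -2)},
        ],
    ),
}

search_alg = HyperOptSearch(space=space, metric="loss", mode="min")
```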

Nevergrad

Evolutionary and other gradient-free optimizers.
from ray.tune.search.nevergrad import NevergradSearch
import nevergrad as ng

NevergradSearch(optimizer=ng.optimizers.OnePlusOne, metric="loss", mode="min")

Combine with a scheduler

Most search algorithms work alongside a scheduler. ASHA + Optuna is a strong default:
from ray.tune.search.optuna import OptunaSearch
from ray.tune.schedulers import ASHAScheduler

tune.TuneConfig(
    search_alg=OptunaSearch(),
    scheduler=ASHAScheduler(max_t=100, grace_period=2),
    num_samples=200,
    metric="loss",
    mode="min",
)
Some scheduler/search-alg combinations require specific pairings — for example, BOHB only works with HyperBandForBOHB.

Next steps

Schedulers

Pair a search algorithm with a scheduler.

Distributed tuning

Run searches across the cluster.