This walkthrough builds up a Ray Tune job step by step.
Install
pip install -U "ray[tune]"
Define an objective
The objective function takes a config dict and reports metrics either by calling tune.report or by returning a metrics dict.
from ray import tune

def objective(config):
    score = config["a"] ** 2 + config["b"]
    return {"score": score}
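Because the objective is plain Python, you can sanity-check it by hand before handing it to Tune (no Ray needed):

```python
def objective(config):
    score = config["a"] ** 2 + config["b"]
    return {"score": score}

# Evaluate one config directly: a=2 squared plus b=1 gives 5.
result = objective({"a": 2, "b": 1})
print(result)  # {'score': 5}
```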
Pick a search space
search_space = {
    "a": tune.uniform(-1, 1),
    "b": tune.choice([1, 2, 3]),
}
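Conceptually, each trial gets one sample drawn from this space: tune.uniform(-1, 1) yields a float in that range and tune.choice([1, 2, 3]) picks one element. A plain-Python stand-in for that sampling (illustration only, not Tune's internals):

```python
import random

# Mimic one draw from the search space above.
def sample_config():
    return {
        "a": random.uniform(-1, 1),   # like tune.uniform(-1, 1)
        "b": random.choice([1, 2, 3]),  # like tune.choice([1, 2, 3])
    }

cfg = sample_config()
print(cfg)
```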
Run the tuner
tuner = tune.Tuner(
    objective,
    param_space=search_space,
    tune_config=tune.TuneConfig(num_samples=20, metric="score", mode="min"),
)
results = tuner.fit()
best = results.get_best_result()
print(best.config, best.metrics)
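With no search algorithm or scheduler configured, this is essentially random search: draw num_samples configs, evaluate each, and keep the one that minimizes the metric. A self-contained sketch of that loop (illustration only; real Tune runs trials in parallel and tracks much more state):

```python
import random

def objective(config):
    return {"score": config["a"] ** 2 + config["b"]}

random.seed(0)
results = []
for _ in range(20):  # num_samples=20
    config = {"a": random.uniform(-1, 1), "b": random.choice([1, 2, 3])}
    results.append((config, objective(config)))

# mode="min" on metric="score": keep the smallest score seen.
best_config, best_metrics = min(results, key=lambda r: r[1]["score"])
print(best_config, best_metrics)
```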
Train models, not toy functions
Replace the toy objective with a real training function. Use tune.report to push metrics during training, then let the function return to mark the trial complete.
def train(config):
    model = build_model(config)
    for epoch in range(10):
        loss = train_one_epoch(model)
        tune.report({"loss": loss, "epoch": epoch})
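Here build_model and train_one_epoch stand for your own training code. To see the reporting pattern run standalone, the sketch below stubs both out and substitutes a plain list for tune.report, so you can inspect exactly what Tune would receive each epoch:

```python
# `reported` plays the role of tune.report: it collects one metrics dict
# per epoch. build_model / train_one_epoch are fake stand-ins.
reported = []

def build_model(config):
    return {"lr": config.get("lr", 0.1)}

def train_one_epoch(model):
    # Pretend the loss shrinks as more epochs are reported.
    return 1.0 / (len(reported) + 1)

def train(config, report=reported.append):
    model = build_model(config)
    for epoch in range(10):
        loss = train_one_epoch(model)
        report({"loss": loss, "epoch": epoch})

train({"lr": 0.01})
print(reported[-1])
```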
Run with early stopping
Add a scheduler so bad trials get stopped early.
from ray.tune.schedulers import ASHAScheduler
tuner = tune.Tuner(
    train,
    param_space=search_space,
    tune_config=tune.TuneConfig(
        metric="loss",
        mode="min",
        scheduler=ASHAScheduler(max_t=10, grace_period=1, reduction_factor=2),
        num_samples=50,
    ),
)
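The idea behind ASHA is successive halving: every trial gets at least grace_period epochs, then at each rung the worse fraction of trials is stopped and the survivors train longer, up to max_t. A synchronous sketch of that pruning logic (ASHA itself promotes trials asynchronously, which is what makes it scale):

```python
# Toy stand-in for training: pretend a lower trial id means a lower loss,
# so the ranking is deterministic and easy to follow.
def run_epochs(trial_id, epochs):
    return trial_id / (epochs + 1)

trials = list(range(8))
epochs = 1  # grace_period=1
while len(trials) > 1 and epochs < 10:  # max_t=10
    scored = sorted(trials, key=lambda t: run_epochs(t, epochs))
    trials = scored[: len(scored) // 2]  # stop the worse half early
    epochs *= 2  # reduction_factor=2: survivors train twice as long

print(trials)  # only the best trial reaches the top rung
```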
Use a smarter search algorithm
from ray.tune.search.optuna import OptunaSearch
tuner = tune.Tuner(
    train,
    param_space=search_space,
    tune_config=tune.TuneConfig(
        metric="loss",
        mode="min",
        search_alg=OptunaSearch(),
        num_samples=50,
    ),
)
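What separates a search algorithm like Optuna's from plain random sampling is that later suggestions are informed by earlier results. The sketch below is not Optuna's actual TPE algorithm, just a minimal illustration of that feedback loop: after a random warm-up, new configs are jittered around the best one seen so far.

```python
import random

def objective(config):
    return config["a"] ** 2 + config["b"]

random.seed(1)
history = []
for i in range(50):
    if i < 10 or not history:
        # Warm-up: explore the space uniformly at random.
        config = {"a": random.uniform(-1, 1), "b": random.choice([1, 2, 3])}
    else:
        # Exploit: perturb the best config found so far, clamped to the space.
        best = min(history, key=lambda h: h[1])[0]
        a = max(-1.0, min(1.0, best["a"] + random.gauss(0, 0.1)))
        config = {"a": a, "b": best["b"]}
    history.append((config, objective(config)))

best_config, best_score = min(history, key=lambda h: h[1])
print(best_score)
```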
Where to go next
Search space: uniform, log-uniform, choice, conditional, and grid searches.
Schedulers: ASHA, HyperBand, PBT, PB2.