

A stopper ends one or more trials when a condition is satisfied. Unlike a scheduler, which compares trials against each other and stops the underperformers, a stopper evaluates absolute conditions on per-trial or global state, not relative trial performance.

Built-in stoppers

import ray.train
from ray import tune
from ray.tune.stopper import (
    MaximumIterationStopper,
    TimeoutStopper,
    ExperimentPlateauStopper,
    TrialPlateauStopper,
    CombinedStopper,
)

# CombinedStopper fires as soon as ANY of its member stoppers fires.
stop = CombinedStopper(
    MaximumIterationStopper(max_iter=100),
    TimeoutStopper(timeout=60 * 60),  # seconds of wall-clock time
    TrialPlateauStopper(metric="loss", num_results=5, std=1e-3),
)

tuner = tune.Tuner(..., run_config=ray.train.RunConfig(stop=stop))

Stopper                     When it fires
MaximumIterationStopper     After max_iter reported iterations.
TimeoutStopper              After elapsed wall-clock time.
TrialPlateauStopper         When a trial's metric stops improving.
ExperimentPlateauStopper    When the best metric across trials stops improving.
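To make the plateau condition concrete, here is a small pure-Python sketch (not Ray's implementation) of the check TrialPlateauStopper performs: flag a plateau once the standard deviation of the last num_results values of the metric falls below std.

```python
from collections import deque
from statistics import pstdev


def make_plateau_check(num_results: int, std: float):
    """Return a callable that flags a plateau once the last
    `num_results` metric values have a std-dev below `std`."""
    window = deque(maxlen=num_results)

    def check(value: float) -> bool:
        window.append(value)
        if len(window) < num_results:
            return False  # not enough results yet
        return pstdev(window) < std

    return check


check = make_plateau_check(num_results=3, std=1e-3)
losses = [0.9, 0.5, 0.30, 0.3001, 0.3002]
flags = [check(v) for v in losses]
# Only the final window [0.30, 0.3001, 0.3002] has negligible spread,
# so only the last check returns True.
```

The sliding window is why num_results matters: a brief flat stretch shorter than the window never triggers the stop.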

Dict stop conditions

For simple stops, pass a dict:
ray.train.RunConfig(stop={"loss": 0.01, "training_iteration": 50})
Stops when any condition is met.
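The any-condition semantics can be sketched in a few lines of plain Python. The helper name is hypothetical, not part of Ray, and it assumes the common "stop once the reported value reaches the threshold" comparison; check Ray's docs for the exact rule per metric.

```python
def should_stop(result: dict, stop: dict) -> bool:
    # Assumed semantics: stop once ANY listed metric reaches its threshold.
    return any(
        result.get(metric, float("-inf")) >= bound
        for metric, bound in stop.items()
    )


stop = {"training_iteration": 50}
early = should_stop({"training_iteration": 10}, stop)  # threshold not reached
done = should_stop({"training_iteration": 50}, stop)   # threshold reached
```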

Custom stoppers

Subclass Stopper and implement __call__ and stop_all:
from ray.tune.stopper import Stopper

class MyStopper(Stopper):
    def __init__(self):
        self.best = float("inf")

    def __call__(self, trial_id, result):
        # Called for every reported result; returning True stops that trial.
        loss = result.get("loss", float("inf"))
        self.best = min(self.best, loss)
        return loss < 1e-3

    def stop_all(self):
        # Checked after each result; returning True ends the whole experiment.
        return self.best < 1e-3

__call__ decides whether to stop a single trial; stop_all stops the entire experiment.
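To see that call protocol in action without launching Ray, the loop below simulates how the driver consults a stopper after each reported result. The driver loop and result stream are invented for illustration; only the __call__/stop_all method shapes mirror the interface.

```python
class ThresholdStopper:
    """Same shape as ray.tune.stopper.Stopper, minus the base class."""

    def __init__(self, threshold: float = 1e-3):
        self.threshold = threshold
        self.best = float("inf")

    def __call__(self, trial_id, result):
        # Stop this trial once its loss drops below the threshold.
        loss = result.get("loss", float("inf"))
        self.best = min(self.best, loss)
        return loss < self.threshold

    def stop_all(self):
        # Stop the experiment once any trial has reached the threshold.
        return self.best < self.threshold


stopper = ThresholdStopper()
stream = [
    ("t1", {"loss": 0.1}),
    ("t1", {"loss": 0.01}),
    ("t1", {"loss": 5e-4}),  # below threshold: stops here
]
stopped_at = None
for i, (trial_id, result) in enumerate(stream):
    if stopper(trial_id, result) or stopper.stop_all():
        stopped_at = i
        break
```

Note that Tune calls __call__ once per reported result, so a stopper can keep cheap running state (like best above) but should not do heavy work there.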

Next steps

Schedulers

Per-trial early stopping.

Distributed tuning

Run experiments across the cluster.