This guide walks through Ray's primary entry points: tasks for stateless computations, actors for stateful services, and the built-in libraries for data, training, tuning, and serving.
Make sure you’ve installed Ray before running the snippets below.
Start Ray
ray.init() starts a Ray runtime on your local machine if one isn’t already running. To connect to an existing cluster, pass an address: ray.init(address="auto").
Tasks: scale Python functions
Convert any Python function into a remote task with the @ray.remote decorator. Calls return object references that you resolve with ray.get.
Actors: stateful workers
Actors are remote classes that hold state across method calls. Use them for accumulators, models loaded into memory once, or any service that benefits from warm state.
Object store
Pass large objects between tasks efficiently with the distributed object store. ray.put writes an object to the local object store once; downstream tasks read it zero-copy on the same node and over the network on other nodes.
Try the AI libraries
Ray ships with high-level libraries that build on tasks and actors.
Ray Data
Distributed data loading, transformation, and batch inference.
Ray Train
Distributed training for PyTorch, Lightning, Transformers, and more.
Ray Tune
Hyperparameter tuning at scale.
Ray Serve
Production model serving and online inference.
Run on a cluster
The same code runs unchanged when you connect to a Ray cluster. See Ray Clusters for cluster lifecycle management on Kubernetes, VMs, or on-premises hardware.
Next steps
Ray Core walkthrough
Tour the full Ray Core API.
Key concepts
Tasks, actors, objects, placement groups, and resources.