lairv|4 months ago
Ray:

import ray

@ray.remote(num_gpus=1)
class Example:
    def say_hello(self, txt):
        return f"hello {txt}"

actors = [Example.remote() for _ in range(8)]
hello_object_refs = [a.say_hello.remote("world") for a in actors]
ray.get(hello_object_refs)
porridgeraisin|4 months ago
Also, it has RDMA. Last I checked, Ray did not support RDMA. There are probably other differences as well, but the lack of RDMA immediately splits the world into things you can do with Ray and things you cannot do with Ray.
unnah|4 months ago
There's also Dask, which can do distributed pandas and NumPy operations, etc. However, it was originally developed for traditional HPC systems and has only limited support for GPU computing. https://www.dask.org/
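A minimal sketch of the distributed-array style the comment describes (assumes the `dask` package is installed; the array shape and chunk size are illustrative, not from the thread):

import dask.array as da

# Build a 1000x1000 array of ones, split into 100x100 chunks
# that Dask can process in parallel (locally or on a cluster).
x = da.ones((1000, 1000), chunks=(100, 100))

# Operations build a lazy task graph; .compute() executes it.
total = (x + x.T).sum().compute()  # 2 * 1_000_000 ones -> 2000000.0

`dask.dataframe` offers the same lazy, chunked model over pandas DataFrames.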
disattention|4 months ago
https://pytorch.org/blog/pytorch-foundation-welcomes-ray-to-...