Ion Stoica, co-founder of Databricks and Anyscale, predicts 2023 will be the year of distributed AI frameworks. He has already had a hand in creating such a tool: Ray, an open-source platform from Anyscale. Ray helps power OpenAI’s ChatGPT and is used to scale machine learning workloads in this new era of generative AI.
In an interview with me, Stoica explained that Ray is an “ecosystem as a service” that has been designed to support machine learning workloads since 2016. This, he argues, makes it far superior to Spark, which was also built at Berkeley for distributed data processing but lacks native GPU support, leading to poor performance on deep learning models.
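To make the GPU point concrete, here is a minimal sketch of how Ray lets ordinary Python functions run as parallel tasks that each reserve a GPU. The function name and workload are hypothetical; only `ray.init`, the `@ray.remote` decorator, `.remote()`, and `ray.get` are part of Ray’s core API, and the example assumes the cluster actually has GPU nodes available.

```python
import ray

ray.init()  # start or connect to a Ray cluster

# Ask Ray's scheduler to place each task on a node with one free GPU.
@ray.remote(num_gpus=1)
def fine_tune(shard_id):
    # Hypothetical per-shard training work would run here on the assigned GPU.
    return f"shard {shard_id} done"

# Launch the tasks in parallel across the cluster and collect the results.
results = ray.get([fine_tune.remote(i) for i in range(4)])
print(results)
```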
The Anyscale team behind Ray has grown more ambitious over time, adding support for reinforcement learning while also exploring broader possibilities through “sky computing,” a term they proposed back in 2021. Sky computing describes a new form of cloud computing built on interoperability and distributed computing, which could deliver services such as MLaaS (Machine Learning as a Service) or DaaS (Distributed Data Services).
At its core, Ray is about scaling machine learning workloads for the generative AI era without compromising on performance or memory consumption, something most data processing engines, including Apache Spark, have struggled to achieve.
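As a rough sketch of the memory angle, Ray’s shared object store lets many tasks read a single copy of a large object instead of each task receiving its own duplicate. The array and scoring task below are hypothetical; `ray.put` and passing object references into remote tasks are standard Ray features.

```python
import numpy as np
import ray

ray.init()

# Put a large array (e.g., model weights) into Ray's shared object store once.
weights_ref = ray.put(np.zeros((1000, 1000)))

@ray.remote
def score(weights, batch_id):
    # Each task reads the same stored copy rather than serializing its own.
    return float(weights.sum()) + batch_id

print(ray.get([score.remote(weights_ref, i) for i in range(3)]))
```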
Read more at thenewstack.io