
Scaling AI & Python with Ray: v2.0 release

source link: https://devm.io/python/ray-scaling-ai-python

Ray, an open source distributed framework for scaling Python and artificial intelligence applications, has received a 2.0 update that adds a new beta feature and improvements for handling larger, more complex AI workloads.

The Ray framework helps simplify ML developers' workloads by abstracting away distributed machine learning compute and scaling their projects out to a cluster.

Scaling for Python projects

The 2.0 release brings, among other things, more capabilities for simplifying heavy workloads and introduces Ray AI Runtime (AIR).

Developers can use Ray to scale their large, compute-intensive workloads and make them leaner and more efficient, and it can be used in any kind of application written in Python.

From the website’s landing page:

“Modern workloads like deep learning and hyperparameter tuning are compute-intensive, and require distributed or parallel execution. Ray makes it effortless to parallelize single machine code — go from a single CPU to multi-core, multi-GPU or multi-node with minimal code changes.”
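
In practice, that parallelization comes down to a decorator and a couple of calls from Ray's core API. The following is a minimal sketch (the square function is purely illustrative) of what turning ordinary Python code into distributed tasks looks like:

```python
import ray

ray.init()  # start Ray locally; on a cluster this connects to the head node

# A regular Python function becomes a distributed task with one decorator.
@ray.remote
def square(x):
    return x * x

# Schedule the tasks in parallel; ray.get collects the results.
futures = [square.remote(i) for i in range(8)]
print(ray.get(futures))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The same code runs unchanged on a laptop or on a multi-node cluster, which is what the "minimal code changes" claim refers to.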

According to its GitHub repo, Ray can run on any machine. It also has a growing ecosystem of library integrations, including Dask, Flambe, Classy Vision, and Hugging Face Transformers, with more libraries to come.

What’s new in v2.0?

Version 2.0 is the first major update in two years and adds several new features and improvements to the framework. It was unveiled at the Ray Summit event which took place in San Francisco.

The largest new addition arriving with the 2.0 release is Ray AI Runtime (AIR). The Ray AIR API gives developers and data scientists a single, unified machine learning toolkit for scaling individual workloads.
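
As a rough illustration of the kind of workflow AIR targets, the sketch below trains a distributed XGBoost model. It assumes the XGBoostTrainer and ScalingConfig interfaces from the 2.x documentation and uses scikit-learn's breast-cancer dataset purely as stand-in training data; details may differ while the API is in beta.

```python
import ray
import pandas as pd
from sklearn.datasets import load_breast_cancer
from ray.air.config import ScalingConfig
from ray.train.xgboost import XGBoostTrainer

# Build a Ray Dataset from an in-memory pandas DataFrame (stand-in data).
df = load_breast_cancer(as_frame=True).frame
train_dataset = ray.data.from_pandas(df)

# One trainer object describes the model, the data, and how to scale it.
trainer = XGBoostTrainer(
    scaling_config=ScalingConfig(num_workers=2, use_gpu=False),
    label_column="target",
    num_boost_round=20,
    params={"objective": "binary:logistic"},
    datasets={"train": train_dataset},
)

result = trainer.fit()
print(result.metrics)
```

Scaling up is then a matter of changing the ScalingConfig rather than rewriting the training code.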

AIR is still in beta, and interested users must first fill out a form before it becomes generally available for everyone to test. Check out the getting started guide for information on how to apply for access and try out the beta version.

View the AIR ecosystem map for more information about its current state, what’s stable, what’s maintained by the community, and track what’s currently in progress.

Version 2.0 also improves Ray's ability to handle larger, more complex AI workloads.

The latest version can be installed on Windows (currently in beta), macOS, and Linux.
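
For a standard PyPI setup, installation is done with pip, for example pip install -U "ray[air]" to pull in Ray together with the AIR extras (assuming the extras name used in the 2.x documentation); consult the installation guide for platform-specific notes.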

Further info

Review the official documentation for Ray at docs.ray.io.

Naturally, since Ray is open source, it’s possible to get involved with the project and help out. Find out more on GitHub about how you can lend a hand: file bug reports, answer other users’ questions, or join the monthly meetup group.

As the popularity of Python increases and machine learning becomes more commonplace, Ray will likely continue to have a presence as a powerful AI tool. Have you added it to your toolkit?
