Tech Brief

On-Demand Distributed Compute

Develop Complex Models Faster

Distributed software frameworks and compute clusters help data science teams solve the most complex machine learning problems — those that require large amounts of data and enormous processing power.

However, manually configuring clusters requires specialized DevOps skills and can be very time-consuming. That’s why Domino provides self-service access to three of the most popular distributed compute frameworks — Spark, Ray, and Dask. Teams can select the best framework for the job at hand and quickly spin up clusters, so they can test more ideas and develop better models faster.

Domino automatically scales compute clusters based on the workload to simplify provisioning, optimize utilization, and manage computing costs — so you can maximize both the productivity of your teams and the return on your computing investments.

Related Resources

A Guide To Enterprise MLOps

Spark, Dask, and Ray: Choosing the Right Framework

Considerations for Using Spark in Your Data Science Stack