Why 'the Cloud' Isn't Enough Anymore: Domino Showcases Hybrid/Multi-Cloud MLOps at NVIDIA GTC
By David Schulman | 2022-09-13 | 6 min read
Update 9/23/2022: See what we announced at NVIDIA GTC: "Domino Data Lab Announces New Ecosystem Solutions with NVIDIA to Accelerate Hybrid and Multi-cloud MLOps Journey"
Are AI and machine learning “drifting away from the cloud,” as a recent Protocol article suggests? That’s a popular question these days, one also raised in a recent VentureBeat article, which noted that “one of the cloud ecosystem’s biggest weaknesses is the struggle to address the on-premise opportunity.” Another data point, consistent with these first two: challenging macroeconomic conditions and increased competition have firms like Netflix — once a poster-child for driving scale through cloud adoption — re-evaluating cloud costs (as the Wall Street Journal recently reported).
Domino worked with NVIDIA and Ventana Research on a new white paper that explores the changes ahead. In Top 5 AI Considerations for Chief Data and Analytics Executives, we explore the critical factors next-generation AI infrastructure must address to effectively govern AI/ML workloads across complex, hybrid/multi-cloud environments. These topics are also front-and-center next week in Domino's sessions at NVIDIA's free, virtual GTC conference, September 19-22, 2022.
Hybrid/Multi-Cloud Data Science Controls Costs, Boosts Performance
Cloud adoption, particularly the instant availability of scalable, on-demand compute, has driven the democratization of machine learning (ML). But as the data science field evolves, the promise of cloud computing alone no longer meets the demands of sophisticated companies training computationally heavy AI and deep-learning models. Kate Kaye from Protocol notes:
“Cloud computing isn’t going anywhere, but some companies are shifting their machine learning data and models to their own machines they manage in-house. Adopters are spending less money and getting better performance.”
Model-training compute demand fluctuates wildly based on data volume, model type, and the extent of hyperparameter optimization, as well as the work schedules of data scientists, adds Domino's Kjell Carlsson. A hybrid approach that pairs on-prem compute with the ability to "burst" to the cloud for additional capacity gives organizations the cost controls to right-size environments for different workloads.
Data Sovereignty and Model Latency Drive Co-location of Data and Compute
Control over compute cost isn't the only driver of this trend. Data ingress and egress introduce yet another cost consideration: extracting and analyzing a petabyte of data could cost as much as $50,000, notes Ventana Research. Beyond cost, data sovereignty and privacy regulations often lock data sets within a single geography, while data-gravity performance considerations (such as co-locating data and compute for real-time predictions in distributed or edge use cases) also come into play. As David Menninger, SVP & Research Director at Ventana Research, puts it:
“Through 2026, nearly all multinational organizations will invest in local data processing infrastructure and services to mitigate against the risks associated with data transfer.”
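To put the transfer-cost figure above in concrete terms, here is a back-of-envelope sketch that derives an implied per-gigabyte rate from the ~$50,000-per-petabyte estimate cited by Ventana Research. The flat rate is an assumption for illustration; real cloud egress pricing is tiered and varies by provider and region.

```python
# Back-of-envelope data-egress cost estimate.
# Assumes a flat rate implied by the ~$50,000-per-petabyte figure
# cited in the article; actual cloud pricing is tiered.
PETABYTE_GB = 1_000_000      # 1 PB in GB (decimal units)
COST_PER_PB = 50_000.0       # USD, per the Ventana Research estimate

cost_per_gb = COST_PER_PB / PETABYTE_GB  # implied per-GB rate

def egress_cost(gigabytes: float) -> float:
    """Estimated egress cost in USD for moving `gigabytes` out of a region."""
    return gigabytes * cost_per_gb

print(f"Implied rate: ${cost_per_gb:.2f}/GB")           # $0.05/GB
print(f"100 TB egress: ${egress_cost(100_000):,.0f}")   # $5,000
```

Even at this rough rate, repeatedly moving large training sets between environments adds up quickly, which is one reason co-locating data and compute matters.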
Domino Showcases Hybrid/Multi-Cloud MLOps at NVIDIA GTC
Last June, Domino announced its new Nexus Hybrid Enterprise MLOps platform, backed by NVIDIA. Since then, Domino has been hard at work making seamless hybrid/multi-cloud MLOps a reality, and we’re eager to reveal exciting innovations at NVIDIA GTC, the leading developer conference for the era of AI and the metaverse.
Domino is a diamond-level sponsor at NVIDIA GTC, a free, virtual conference running September 19-22. Don't miss our sessions:
AI Anywhere: How Johnson & Johnson Scales Data Science in a Hybrid and Multi-cloud Computing Environment
Tuesday, September 20 | 2:00 PM - 2:50 PM PDT
Engage directly with experts from Johnson & Johnson, Domino, and NVIDIA in a live session with a chat-based Q&A. Vikash Varma, Global Head of the Cloud Program Office at Johnson & Johnson, will dive deep into key considerations for orchestrating data science workloads across hybrid and multi-cloud environments, alongside Thomas Robinson, VP of Strategic Partnerships & Corporate Development at Domino, and Rory Kelleher, Director and Global Head of Business Development, Healthcare and Life Sciences at NVIDIA. See the session on-demand here.
Next-Generation AI Infrastructure for the Hybrid Cloud Future (co-presented with NetApp and NVIDIA)
Available On-Demand on September 19
Domino, NetApp, and NVIDIA have been collaborating on AI Centers of Excellence and the holistic solutions needed for enterprise-wide data science governance, cross-team collaboration, and operational efficiency. We'll showcase our latest solutions for seamless data science and data management across hybrid environments, so don't miss this session with product experts. See the session on-demand here.
Want a sneak preview of what we'll be unveiling at NVIDIA GTC? Download Ventana Research's Top 5 AI Considerations for Chief Data and Analytics Executives to learn about key considerations for scaling enterprise data science in the hybrid/multi-cloud.
David Schulman is a data and analytics ecosystem enthusiast in Seattle, WA. As Head of Partner Marketing at Domino Data Lab, he works closely with other industry-leading ecosystem partners on joint solution development and go-to-market efforts. Prior to Domino, David led Technology Partner marketing at Tableau and spent years as a consultant defining partner program strategy and execution for clients around the world.
Subscribe to the Domino Newsletter
Receive data science tips and tutorials from leading data science practitioners, right to your inbox.