Subject archive for "machine-learning," page 3

Model Management

High-standard ML validation with Deepchecks

We've blogged before about the importance of model validation, a process that ensures a model performs the way it was intended and solves the problem it was designed to solve. Validations and tests are key elements of building machine learning pipelines you can trust. We've also talked about incorporating tests in your pipeline, which many data scientists find problematic. The issues stem from the fact that not all data scientists feel confident about traditional code-testing methods, but more importantly, data science is much more than just code. When validating pipelines we need to think about verifying data integrity, inspecting distributions, validating data splits, and evaluating and comparing models. But how can we deal with such complexity and maintain consistency in our pipelines? Enter Deepchecks - an open-source Python package for testing and validating machine learning models and data.
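The teaser above names several kinds of checks such a suite covers. As a minimal hand-rolled sketch (deliberately not the Deepchecks API), the same ideas - integrity, distribution, and split checks - look like this:

```python
# Illustrative sketch (not the Deepchecks API): the kinds of checks a
# validation suite automates, written by hand for a toy tabular dataset.

def check_integrity(rows, required):
    """Data integrity: return indices of rows with missing required fields."""
    return [i for i, r in enumerate(rows) if any(r.get(k) is None for k in required)]

def check_drift(train_vals, test_vals, tol=0.5):
    """Distribution check: compare a feature's mean between splits."""
    mean = lambda xs: sum(xs) / len(xs)
    return abs(mean(train_vals) - mean(test_vals)) <= tol

def check_leakage(train_ids, test_ids):
    """Split validation: train and test must not share samples."""
    return set(train_ids).isdisjoint(test_ids)

train = [{"id": 1, "x": 1.0}, {"id": 2, "x": 1.2}]
test = [{"id": 3, "x": 1.1}, {"id": 4, "x": None}]

bad = check_integrity(test, required=["x"])                      # -> [1]
ok_drift = check_drift([r["x"] for r in train], [1.1])           # -> True
no_leak = check_leakage([r["id"] for r in train],
                        [r["id"] for r in test])                 # -> True
```

A library like Deepchecks bundles dozens of such checks into ready-made suites and reports, so you don't have to write and maintain them by hand.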

By Noam Bressler · 14 min read

Machine Learning

Lightning-fast CPU-based image captioning pipelines with Deep Learning and Ray

Translation between skeletal formulae and InChI labels is a challenging image captioning problem, especially when it involves large amounts of noisy data. In this blog post we share our experience from the BMS molecular translation challenge and show that CPU-based distributed learning can significantly shorten model training times.

By Jennifer Davis · 14 min read

Machine Learning

Everything You Need to Know about Feature Stores

Features are the inputs to machine learning models. The most efficient way to use them across an organization is a feature store that automates the data transformations, stores the results, and makes them available for training and inference.
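As a toy illustration of that idea (a hypothetical in-memory store, not any real feature-store API), registering a transformation once lets training and inference read the exact same materialized value:

```python
# Hypothetical in-memory sketch of the feature-store idea: register a
# transformation once, then serve the same feature to training and inference.

class FeatureStore:
    def __init__(self):
        self._transforms = {}   # feature name -> transformation function
        self._values = {}       # (feature name, entity id) -> materialized value

    def register(self, name, fn):
        self._transforms[name] = fn

    def ingest(self, name, entity_id, raw):
        # Apply the registered transformation and store the result.
        self._values[(name, entity_id)] = self._transforms[name](raw)

    def get(self, name, entity_id):
        # Same lookup path whether you are building a training set
        # or serving an online prediction.
        return self._values[(name, entity_id)]

store = FeatureStore()
store.register("avg_order_value", lambda orders: sum(orders) / len(orders))
store.ingest("avg_order_value", entity_id=42, raw=[10.0, 20.0, 30.0])
print(store.get("avg_order_value", 42))  # 20.0
```

Production feature stores add what this sketch omits: versioning, point-in-time correctness, and a low-latency online serving layer alongside the offline one.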

By Artem Oppermann · 11 min read

Machine Learning

Transformers - Self-Attention to the rescue

If the mention of "Transformers" brings to mind the adventures of autonomous robots in disguise, you are probably, like me, a child of the 80s: playing with Cybertronians who blend in as trucks, planes, and even microcassette recorders or dinosaurs. As much as I would like to talk about that kind of transformer, the subject of this blog post is the transformers proposed by Vaswani and team in the 2017 paper titled "Attention Is All You Need". We will cover what transformers are and how the idea of self-attention works. This will help us understand why transformers are taking over the world of machine learning - and doing so not in disguise.
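As a taste of the post's subject, here is a minimal pure-Python sketch of scaled dot-product self-attention (softmax(QK^T / sqrt(d_k)) V), with the simplifying assumption that queries, keys, and values are the raw inputs rather than learned projections:

```python
import math

# Minimal sketch of scaled dot-product self-attention (Vaswani et al., 2017).
# Simplification: Q, K and V are the inputs themselves; real transformers
# learn separate projection matrices for each.

def softmax(xs):
    m = max(xs)                           # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(X):
    d_k = len(X[0])
    out = []
    for q in X:                           # each position attends to every position
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in X]
        weights = softmax(scores)         # attention weights sum to 1
        out.append([sum(w * v[j] for w, v in zip(weights, X))
                    for j in range(d_k)])
    return out

X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]  # three positions, two dimensions
Y = self_attention(X)                     # each row is a convex mix of the inputs
```

Because the weights sum to one, every output vector is a convex combination of the input vectors - that is the "each token looks at all the others" intuition the post builds on.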

By Dr J Rogel-Salazar · 14 min read

Machine Learning

A Hands-on Tutorial for Transfer Learning in Python

Fitting complex neural network models is a computationally heavy process that requires access to large amounts of data. In this article we introduce transfer learning - a method for leveraging a pre-trained model, which speeds up fitting and removes the need to process large amounts of data. We provide guidance on transfer learning for NLP and computer vision use cases and show simple implementations of transfer learning via Hugging Face.
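To make the recipe concrete, here is a toy pure-Python sketch of the core idea: a frozen "pre-trained" feature extractor plus a small task-specific head fitted on new data. The stub backbone and closed-form head fit are illustrative assumptions - in practice you would load a real pre-trained network, e.g. from Hugging Face.

```python
def pretrained_features(x):
    # Frozen "backbone": stands in for representations learned on a
    # previous task; it is never updated during transfer.
    return [x, x * x]

def fit_head(data):
    # Fit only the task head: 2x2 least squares on top of the frozen
    # features, solved in closed form with Cramer's rule.
    a00 = a01 = a11 = b0 = b1 = 0.0
    for x, y in data:
        f0, f1 = pretrained_features(x)
        a00 += f0 * f0
        a01 += f0 * f1
        a11 += f1 * f1
        b0 += f0 * y
        b1 += f1 * y
    det = a00 * a11 - a01 * a01
    return [(b0 * a11 - a01 * b1) / det,
            (a00 * b1 - a01 * b0) / det]

# New task: y = 2x + x^2 is exactly linear in the frozen features,
# so the head alone can learn it from very little data.
data = [(0.1 * i, 2 * (0.1 * i) + (0.1 * i) ** 2) for i in range(1, 6)]
w = fit_head(data)  # close to [2.0, 1.0]
```

Because the backbone already provides useful features, only a handful of head parameters need fitting - which is why transfer learning is fast and data-efficient compared with training from scratch.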

By Dr Behzad Javaheri · 15 min read
