New Features for Optimizing MLOps Efficiency and Resource Utilization

Tarek Oraby

We are excited to announce significant enhancements to Valohai's MLOps platform, focusing on further boosting the efficiency and speed of ML workflows. These improvements target critical areas such as model iteration speed, efficient resource utilization, and dataset management, directly addressing the challenges faced by data science teams in accelerating time-to-market and optimizing operational costs.

Resource alerts for cost savings

A key highlight of this release is a new notification system that alerts users when their ML workloads underutilize resources. The system monitors CPU, GPU, and memory usage and sends alerts (in-app, or via email and Slack) whenever a machine operates below 50% capacity. By surfacing underutilized machines, it empowers teams to reduce operational costs and reallocate spare capacity to other tasks.

The view of underutilization alerts, peak use per machine, and other details in the Valohai MLOps platform.
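
For illustration, here is a minimal Python sketch of the kind of underutilization check such alerts are based on: it flags any resource whose average usage stays below the 50% threshold. The sample data and function names are hypothetical and are not part of the Valohai platform or its API.

import statistics

# Hypothetical utilization samples (percent of capacity) collected during a run.
# The 50% threshold mirrors the alerting behavior described above.
UNDERUTILIZATION_THRESHOLD = 50.0

def underutilized_resources(samples: dict[str, list[float]]) -> list[str]:
    """Return the resources whose average utilization stayed below the threshold."""
    return [
        name
        for name, values in samples.items()
        if values and statistics.mean(values) < UNDERUTILIZATION_THRESHOLD
    ]

run_samples = {
    "cpu": [22.5, 31.0, 18.4],      # clearly underutilized
    "gpu": [71.0, 84.2, 90.5],      # healthy utilization
    "memory": [35.0, 41.2, 38.7],   # underutilized
}

for resource in underutilized_resources(run_samples):
    print(f"Alert: {resource} averaged below {UNDERUTILIZATION_THRESHOLD}% utilization")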

Auto caching for faster iterations

To support faster model experimentation and iteration, Valohai now automatically caches the outputs of past steps in the CI/CD pipeline. As long as a pipeline step's input data, code, and parameters remain unchanged, its previous results can be reused instantly in future pipeline runs. Skipping this redundant recomputation allows data science teams to focus on developing high-quality models and delivering business value.
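
To make the caching condition concrete, here is a minimal Python sketch of content-based output caching: a step's fingerprint is derived from its input data, code version, and parameters, and a stored result is reused whenever that fingerprint is unchanged. This is a generic illustration under assumed names, not Valohai's internal implementation.

import hashlib
import json

_cache: dict[str, object] = {}  # stands in for stored step outputs

def step_fingerprint(input_ids: list[str], code_commit: str, parameters: dict) -> str:
    # The fingerprint changes whenever inputs, code, or parameters change.
    payload = json.dumps(
        {"inputs": sorted(input_ids), "code": code_commit, "params": parameters},
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode()).hexdigest()

def run_step(input_ids, code_commit, parameters, compute):
    key = step_fingerprint(input_ids, code_commit, parameters)
    if key in _cache:          # nothing changed: reuse the cached output instantly
        return _cache[key]
    output = compute()         # something changed: recompute and store the result
    _cache[key] = output
    return output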

Advanced dataset management

Last but not least, we are introducing advanced dataset management features that allow users to effortlessly manage, search, and utilize large numbers of files. Users can now tag files with key-value pairs to categorize and organize data more effectively. This tagging system integrates seamlessly across Valohai's user interface, enabling sophisticated filtering options that combine tags using logical AND or OR operations. These enhancements simplify the organization and retrieval of files, making it easier for data science teams to find and use the data they need for their ML workflows.
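
As a rough illustration of how key-value tags and AND/OR filtering can work, here is a small Python sketch. The file records and tag names are invented for the example; in practice the filtering is done through Valohai's user interface.

# Each file carries key-value tags; filters combine tag criteria with AND or OR.
files = [
    {"name": "train_2024.csv", "tags": {"split": "train", "year": "2024"}},
    {"name": "test_2024.csv",  "tags": {"split": "test",  "year": "2024"}},
    {"name": "train_2023.csv", "tags": {"split": "train", "year": "2023"}},
]

def match_all(file, criteria):
    # AND: every tag criterion must match
    return all(file["tags"].get(key) == value for key, value in criteria.items())

def match_any(file, criteria):
    # OR: at least one tag criterion must match
    return any(file["tags"].get(key) == value for key, value in criteria.items())

criteria = {"split": "train", "year": "2024"}
print([f["name"] for f in files if match_all(f, criteria)])  # ['train_2024.csv']
print([f["name"] for f in files if match_any(f, criteria)])  # all three files match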

These new capabilities further empower data science teams to streamline their workflows, minimize operational costs, and accelerate the delivery of business value. Valohai's commitment to delivering state-of-the-art solutions enables teams to achieve their objectives with unprecedented efficiency and ease.

Book a meeting with our Customer Team here and get started with the Valohai MLOps platform.

Start your Valohai trial and try out the MLOps platform for 14 days.