One of the more exciting things we have under development (or, should we say, in the pipeline) right now is our Pipeline system. Since our mission is to enable CI/CD-style development for AI and machine learning, there's a logical next step up from just running your code in a repeatable manner with Valohai (and "just" might be the understatement of the year there).
Namely, we want you to be able to chain multiple steps: say, data validation and preprocessing, followed by training, followed by quantization or compression for mobile devices, or deployment to the cloud. Soon enough, you'll even be able to have one model figure out the best hyperparameters for training another using Valohai Pipelines!
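To make the idea concrete, here's a toy sketch of what such a chain of steps amounts to: each step is a function whose output feeds the next. The step names mirror the example above, but the code is purely illustrative – none of it is Valohai's API.

```python
# Toy pipeline: validate -> preprocess -> train -> quantize.
# Each step consumes the previous step's output.

def validate(raw):
    # Drop records that are missing a label.
    return [r for r in raw if r.get("label") is not None]

def preprocess(records):
    # Min-max scale the single feature "x" to [0, 1].
    values = [r["x"] for r in records]
    lo, hi = min(values), max(values)
    return [{**r, "x": (r["x"] - lo) / (hi - lo)} for r in records]

def train(records):
    # "Train" a trivial threshold model: midpoint between per-label means.
    ones = [r["x"] for r in records if r["label"] == 1]
    zeros = [r["x"] for r in records if r["label"] == 0]
    mid = (sum(ones) / len(ones) + sum(zeros) / len(zeros)) / 2
    return {"threshold": mid}

def quantize(model):
    # Round the parameter, standing in for model compression.
    return {"threshold": round(model["threshold"], 2)}

raw_data = [
    {"x": 1.0, "label": 0},
    {"x": 2.0, "label": None},  # dropped by the validation step
    {"x": 3.0, "label": 1},
    {"x": 4.0, "label": 1},
]

# Run the steps as a pipeline: the output of each feeds the next.
model = quantize(train(preprocess(validate(raw_data))))
print(model)
```

In a real pipeline each of these steps would be its own tracked execution, with its inputs and outputs recorded – which is exactly the lineage view described below.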
Valohai has been, and will remain, API-driven from the get-go, so our lovely, enterprising users have already been able to do this with their own scripts and glue. But as we roll out pipelines as a first-class feature, you'll get a spectacular integrated view of how your data flows through your processes and of what led to a particular model being trained the way it was – something we believe will be of particular interest to users in more regulated industries such as finance and medicine. Pipelines can naturally be triggered and introspected over our API, so whether you already have an external CI/CD system or your enterprise grows to have one, you can rely on Valohai to painlessly manage your computation and data.
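As a rough sketch of what triggering a pipeline from an external CI/CD job might look like: the snippet below only assembles the HTTP request. The URL, token header, and payload fields are illustrative assumptions, not Valohai's documented endpoints – consult the actual API reference for the real schema.

```python
import json

# Hypothetical endpoint; a real integration would use the documented API URL.
API_URL = "https://app.example.com/api/pipelines/"

def build_trigger_request(project, pipeline, parameters):
    """Assemble the headers and JSON body an external CI/CD job might POST."""
    headers = {
        "Authorization": "Token <your-api-token>",  # placeholder credential
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "project": project,        # assumed field names, for illustration
        "pipeline": pipeline,
        "parameters": parameters,
    })
    return API_URL, headers, body

url, headers, body = build_trigger_request(
    "fraud-detection", "train-and-deploy", {"learning_rate": 0.001}
)
print(url)
print(body)
```

The same request shape works from any CI system that can make an HTTP call, which is the point: the pipeline becomes just another step in your existing delivery flow.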
Pipelines are currently in closed beta – if you're interested in kicking the tires, get in touch!