HelixML Brings Enterprise-Grade CI/CD to Private GenAI as Global Companies Rush to Repatriate AI Infrastructure
Major Banks, Telcos, and CERN Already Adopting Helix's Revolutionary Private GenAI Platform
Salt Lake City, UT (KubeCon + CloudNativeCon 2024) - November 12, 2024 - HelixML, a pioneering force in local GenAI solutions, today announced the immediate availability of Helix 1.4, introducing the industry's first complete CI/CD framework for private GenAI applications, as major enterprises worldwide race to bring AI capabilities back within their own infrastructure. The release comes amid rapid adoption by global financial institutions, telecommunications giants, and research organizations such as CERN, which are leveraging Helix to deploy secure, locally hosted AI solutions at scale.
“Increasing national and regional regulation on end-user data, and the shift back to the private cloud in enterprises, are two mega-trends driving the need for fully private GenAI solutions,” said Luke Marsden, co-founder and CEO, HelixML. “Today’s Helix 1.4 release, which adds CI/CD capabilities, completes the foundational software and DevOps stack needed to deliver reliable, scalable GenAI applications fully behind the firewall.”
Secure, Local GenAI Stack
New features in Helix 1.4 include:
Enterprise-Grade Testing Framework: Native support for AI application testing (evals) with deep debugging capabilities;
GitOps-Native Deployment: Full Kubernetes operator and Flux integration for automated deployment;
Enhanced Model Support: Extended context length support for Llama 3.1/3.2, Phi 3.5, Mixtral, Gemma 2, Aya from Cohere, and more;
Production-Ready Integrations: Battle-tested JIRA integration and multi-lingual RAG capabilities; and
Hardened Scheduler: Completely rebuilt for higher-scale concurrent GPU workload management.
Read more about: Helix 1.4 Product Announcement
The CI/CD for cloud-native GenAI reference architecture, which will be showcased at KubeCon, demonstrates how users can combine Helix, Kubernetes, GitHub/GitLab, Flux, and NVIDIA GPUs to create a complete software development lifecycle for GenAI apps, including testing (using the LLM-as-a-judge pattern) and continuous delivery with GitOps, all running fully locally on your own secure private infrastructure or cloud VPC.
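The LLM-as-a-judge testing pattern mentioned above can be sketched in a few lines of Python. This is a minimal illustration, not the Helix API: `judge_model` is a local stub standing in for a call to a locally hosted model (for example, via an OpenAI-compatible endpoint), and all names here are illustrative assumptions.

```python
# Minimal sketch of the LLM-as-a-judge eval pattern.
# judge_model() is a stub: in a real CI/CD pipeline it would call a
# locally hosted LLM endpoint and return the judge's verdict.

def judge_model(prompt: str) -> str:
    # Stub heuristic: approve if the candidate answer mentions "refund".
    # A real judge would be an LLM reasoning over the criteria.
    answer = prompt.split("Candidate answer:")[1].splitlines()[0]
    return "PASS" if "refund" in answer.lower() else "FAIL"

def evaluate(app_output: str, criteria: str) -> bool:
    """Ask the judge model whether app_output satisfies the criteria."""
    prompt = (
        "You are an impartial judge. Criteria: " + criteria + "\n"
        "Candidate answer: " + app_output + "\n"
        "Reply PASS or FAIL."
    )
    return judge_model(prompt).strip().upper() == "PASS"

# Eval suite: (application output, acceptance criteria) pairs,
# run on every commit just like a conventional test suite.
test_cases = [
    ("Customers can request a refund within 30 days.", "mentions refund policy"),
    ("Our office is open 9-5.", "mentions refund policy"),
]

results = [evaluate(out, crit) for out, crit in test_cases]
```

In a GitOps setup, a failing eval would block promotion of the new app version, the same way a failing unit test blocks a conventional deploy.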
"You wouldn't ship software without tests, and you shouldn't ship AI Apps without evals," says Bartosz Świątek, Senior Engineer at AWA Network, an early Helix customer already using the platform's new testing capabilities. "Helix's CI/CD support has been crucial for ensuring our AI applications meet enterprise standards."
Major Enterprise Momentum
One of the largest banks in the Arabian Gulf selected Helix over NVIDIA's offering, citing its superior simplicity and OpenShift integration
A leading European telco deployed Helix as their core RAG and integration layer for Llama 3
CERN successfully integrated Helix with SLURM to manage thousands of GPUs
A major East Asian telco is deploying Helix to convert their GPU infrastructure into a private OpenAI alternative
HelixML will showcase its CI/CD for GenAI reference architecture at KubeCon in Salt Lake City, November 12-15, demonstrating how enterprises can leverage Kubernetes, GitHub/GitLab, Flux, and NVIDIA GPUs to create a complete AI application lifecycle, from development through testing and deployment. Come and find Chris Sterry if you are at KubeCon!
Supporting Resources
CI/CD for GenAI cloud-native reference architecture: https://github.com/helixml/genai-cicd-ref
About HelixML, Inc.
Founded in 2023, HelixML is revolutionizing enterprise AI by making open source models work at scale. The company's mission is to provide the best software for running GenAI on private infrastructure, enabling organizations to maintain full control over their AI capabilities and data. Unlike hyperscale AI providers, Helix doesn't want your data - we want you to keep it. For more information, visit helix.ml.
# # #
Contact:
Chris Sterry
chris@helix.ml