Mirantis AI Factory Reference Architecture provides a guideline for secure, composable, scalable, and sovereign platforms
CAMPBELL, Calif.: Mirantis, the Kubernetes-native AI infrastructure company enabling enterprises to build and operate scalable, secure, and sovereign AI infrastructure across any environment, today announced the industry’s first comprehensive reference architecture for IT infrastructure to support AI workloads.
The Mirantis AI Factory Reference Architecture, built on Mirantis k0rdent AI, provides a secure, composable, scalable, and sovereign platform for building, operating, and optimizing AI and ML infrastructure at scale.
“We’ve built and shared the reference architecture to help enterprises and service providers efficiently deploy and manage large-scale multi-tenant sovereign infrastructure solutions for AI and ML workloads,” said Shaun O’Meara, chief technology officer, Mirantis. “This is in response to the significant increase in the need for specialized resources (GPU and CPU) to run AI models while providing a good user experience for developers and data scientists who don’t want to learn infrastructure.”
With the reference architecture, Mirantis addresses complex issues related to high-performance computing that include remote direct memory access (RDMA) networking, GPU allocation and slicing, sophisticated scheduling requirements, performance tuning, and Kubernetes scaling. The architecture can also integrate a choice of AI Platform Services, including Gcore Everywhere Inference and the NVIDIA AI Enterprise software ecosystem.
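GPU allocation on Kubernetes is typically expressed as a resource request on a pod, with slicing exposed through alternative resource names when it is enabled on the node. The following is a minimal, hypothetical sketch using the official Kubernetes Python client; it is not Mirantis's or k0rdent AI's API, and the image and resource names are illustrative assumptions.

```python
# Minimal sketch (not Mirantis's API): requesting a GPU, or a GPU slice,
# for an AI workload pod via the official Kubernetes Python client.
# Assumes a reachable cluster with the NVIDIA device plugin and a local kubeconfig.
from kubernetes import client, config

def gpu_pod(name: str, image: str, gpu_resource: str = "nvidia.com/gpu") -> client.V1Pod:
    # gpu_resource could instead be a MIG slice profile such as
    # "nvidia.com/mig-1g.5gb" when GPU slicing is enabled on the node (assumption).
    container = client.V1Container(
        name=name,
        image=image,
        resources=client.V1ResourceRequirements(
            limits={gpu_resource: "1"},  # one whole GPU (or one slice)
        ),
    )
    spec = client.V1PodSpec(containers=[container], restart_policy="Never")
    return client.V1Pod(
        api_version="v1",
        kind="Pod",
        metadata=client.V1ObjectMeta(name=name, labels={"workload": "inference"}),
        spec=spec,
    )

if __name__ == "__main__":
    config.load_kube_config()  # load credentials from the local kubeconfig
    pod = gpu_pod("demo-inference", "nvcr.io/nvidia/pytorch:24.05-py3")
    client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```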
Cloud-native workloads, which are typically designed for scale-out, multi-core operation, differ markedly from AI workloads, which can require aggregating many GPU-based servers into a single supercomputer with pooled memory, relying on RDMA and ultra-high-performance networking.
The reference architecture leverages Kubernetes and supports multiple AI workload types (training, fine-tuning, inference) across dedicated or shared servers; virtualized environments (KubeVirt/OpenStack); public cloud or hybrid/multi-cloud; and edge locations. It addresses the novel challenges of provisioning, configuring, and maintaining AI infrastructure, and of supporting the unique needs of AI workloads, including high-performance storage and ultra-high-speed networking (Ethernet, InfiniBand, NVLink, NVSwitch, CXL) to keep up with AI data movement.
The Mirantis AI Factory Reference Architecture is designed to be composable so that users can assemble infrastructure from reusable templates across compute, storage, GPU, and networking layers tailored to their specific AI workload needs. It includes support for NVIDIA, AMD, and Intel AI accelerators.
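To make the idea of composability concrete, the sketch below shows one way an infrastructure stack could be assembled from reusable per-layer templates. The types, template names, and parameters are hypothetical illustrations, not the k0rdent AI object model.

```python
# Illustrative sketch only (hypothetical types, not the k0rdent AI API):
# composing an AI infrastructure "stack" from reusable per-layer templates.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Template:
    layer: str          # "compute" | "storage" | "gpu" | "network"
    name: str
    params: dict = field(default_factory=dict)

@dataclass
class InfraStack:
    name: str
    templates: list[Template] = field(default_factory=list)

    def add(self, template: Template) -> "InfraStack":
        self.templates.append(template)
        return self

# Assemble a training-oriented stack from reusable pieces (example values).
stack = (
    InfraStack(name="llm-training")
    .add(Template("compute", "baremetal-gpu-node", {"count": 8}))
    .add(Template("gpu", "nvidia-full-gpu"))
    .add(Template("network", "rdma-roce", {"mtu": 9000}))
    .add(Template("storage", "parallel-fs", {"capacity_tb": 200}))
)
print([t.name for t in stack.templates])
```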
Access the complete reference architecture document, along with more information.
About Mirantis
Mirantis is the Kubernetes-native AI infrastructure company, enabling organizations to build and operate scalable, secure, and sovereign infrastructure for modern AI, machine learning, and data-intensive applications. By combining open source innovation with deep expertise in Kubernetes orchestration, Mirantis empowers platform engineering teams to deliver composable, production-ready developer platforms across any environment - on-premises, in the cloud, at the edge, or in data centers. As enterprises navigate the growing complexity of AI-driven workloads, Mirantis delivers the automation, GPU orchestration, and policy-driven control needed to cost-effectively manage infrastructure with confidence and agility. Committed to open standards and freedom from lock-in, Mirantis ensures that customers retain full control of their infrastructure strategy.
Mirantis serves many of the world’s leading enterprises, including Adobe, Ericsson, Inmarsat, PayPal, and Societe Generale. Learn more at www.mirantis.com.
Source: Business Wire