By Marc Austin · Updated on March 19, 2026
SAN JOSE, Calif. — March 2026 — Hedgehog, a provider of cloud-native networking software for AI and cloud infrastructure, today announced support for NVIDIA Spectrum-X networking and alignment with the NVIDIA Reference Architecture for NVIDIA Cloud Partners (NCPs). The announcement was made at NVIDIA GTC 2026, where NVIDIA highlighted continued advances in AI infrastructure for cloud service providers.
Hedgehog support for Spectrum-X enables NVIDIA Cloud Partners to deploy AI-optimized Ethernet fabrics based on NVIDIA reference architectures while using cloud-native operational models designed for scale, automation, and multi-tenant environments.
NVIDIA Spectrum-X is an accelerated Ethernet networking platform designed to deliver predictable performance and scalability for large-scale AI workloads. By supporting Spectrum-X and the NVIDIA Cloud Partner reference architecture, Hedgehog enables cloud providers to integrate Spectrum-X–based networking into their infrastructure using declarative, Kubernetes-native networking operations.
“Hedgehog is focused on making modern infrastructure easier to deploy and operate,” said Marc Austin, CEO of Hedgehog. “Support for NVIDIA Spectrum-X and the NVIDIA Cloud Partner reference architecture allows cloud providers to adopt AI-optimized Ethernet fabrics while maintaining consistency with cloud-native operational practices.”
“NVIDIA Spectrum-X brings AI-optimized performance, scalability, and predictability to Ethernet-based cloud infrastructure,” said Amit Katz, Vice President of Networking at NVIDIA. “By supporting NVIDIA Spectrum-X and the NVIDIA Cloud Partner reference architecture, Hedgehog is enabling cloud partners to deploy and operate high-performance AI fabrics more efficiently, using cloud-native approaches that accelerate time to production for AI services.”
Hedgehog’s platform is designed to integrate with Spectrum-X–based deployments.
This approach allows NVIDIA Cloud Partners to implement NVIDIA-validated reference architectures while retaining flexibility in how networking is operated and integrated into broader cloud platforms.
NVIDIA Cloud Partners deploy infrastructure to support demanding AI workloads, including training, inference, and AI-as-a-service offerings. Hedgehog support for the NVIDIA Cloud Partner reference architecture is intended to help partners operationalize Spectrum-X–based networking using modern, API-driven workflows.
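To make the "declarative, Kubernetes-native" operational model concrete, the sketch below shows what a fabric definition could look like as a Kubernetes custom resource. The API group, `VPC` kind, and field names here are illustrative assumptions for this sketch, not Hedgehog's published API.

```yaml
# Hypothetical example of declarative, Kubernetes-native fabric
# configuration. The API group, "VPC" kind, and spec fields are
# illustrative assumptions, not Hedgehog's actual schema.
apiVersion: example.fabric.io/v1alpha1
kind: VPC
metadata:
  name: tenant-a
  namespace: fabric
spec:
  # Per-tenant isolation on the shared Ethernet fabric
  subnets:
    default:
      cidr: 10.10.1.0/24
      vlan: 1010
  # Declared intent only; a fabric controller reconciles
  # switch state to match what is written here
  permitInterSubnet: false
```

In this model, an operator applies the manifest with standard tooling (e.g. `kubectl apply -f vpc.yaml`) and a controller continuously reconciles switch configuration to the declared intent — the same workflow Kubernetes users already follow for compute and storage.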
Support for NVIDIA Spectrum-X and the NVIDIA Cloud Partner reference architecture will be available from Hedgehog in Q2 2026.
Hedgehog provides cloud-native networking software designed to simplify the deployment and operation of modern cloud and AI infrastructure. Built for Kubernetes-first environments, Hedgehog delivers open, programmable networking for cloud providers and enterprises.
Learn more at https://hedgehog.cloud.