Wedoany.com reported on March 11th that Nexthop AI, based in Santa Clara, USA, has completed a $500 million Series B funding round, valuing the company at $4.2 billion. Alongside the funding, the company launched a new AI data center switching platform and a "disaggregated spine" architecture targeting hyperscalers and NeoCloud operators. The round was led by Lightspeed Venture Partners, with participation from Andreessen Horowitz, Altimeter, and existing investors. The capital will fund development of next-generation AI networking systems optimized for hyperscale infrastructure.
Alongside the announcement, the company introduced a portfolio of switches for cloud and AI data center scale-out, cross-scale, and front-end networking. The platforms support open network operating systems such as SONiC and FBOSS, and a turnkey software option is available through Nexthop's SONiC-based network OS. According to Nexthop, the systems are designed to improve deployment efficiency, energy efficiency, and operational transparency for large AI clusters.
The centerpiece of the launch is Nexthop's disaggregated spine architecture, co-developed with hyperscale partners. The design breaks traditional chassis-based systems into functional tiers: a cross-scale leaf layer for data center fabrics and a spine layer for data center interconnect. Nexthop claims this approach can reduce costs and power consumption by approximately 30% compared with traditional chassis architectures, while supporting open network software and hyperscale-driven customization.
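As a rough illustration of what the claimed ~30% reduction means in practice, the sketch below applies that figure to a hypothetical chassis baseline. The baseline power figure is a placeholder for illustration, not a number from Nexthop.

```python
# Illustrative arithmetic only: applies the ~30% cost/power reduction
# Nexthop claims for its disaggregated spine versus a traditional
# chassis. The chassis baseline below is a hypothetical placeholder.

def disaggregated_estimate(chassis_value: float, reduction: float = 0.30) -> float:
    """Scale a chassis-based baseline by the claimed fractional reduction."""
    return chassis_value * (1.0 - reduction)

# Hypothetical chassis spine tier drawing 40 kW:
chassis_power_kw = 40.0
print(disaggregated_estimate(chassis_power_kw))  # roughly 28 kW
```

The same function applies to a cost baseline, since the article attributes the ~30% figure to both cost and power.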
The new lineup includes switches such as the NH-4010, NH-4220, and NH-5010, built on Broadcom's Tomahawk 5, Tomahawk 6, and Qumran3D chips and supporting throughput from 51.2 Tbps to 102.4 Tbps, with a design target of 15–20% energy savings. The systems integrate real-time telemetry, congestion management, load balancing, and optical monitoring, and have already begun shipping to leading hyperscale operators.
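The throughput figures map directly to port counts. A sketch of the arithmetic, assuming common 800G and 1.6T port speeds (the article does not specify per-platform port configurations):

```python
# Back-of-envelope port math for the ASIC throughputs cited in the
# article. Port speeds are illustrative assumptions; actual platform
# configurations vary.

def port_count(throughput_tbps: float, port_speed_gbps: int) -> int:
    """Number of ports of a given speed that fit within a switch
    ASIC's total throughput."""
    return int(round(throughput_tbps * 1000)) // port_speed_gbps

# Tomahawk 5-class (51.2 Tbps): 64 ports of 800G
print(port_count(51.2, 800))    # 64
# Tomahawk 6-class (102.4 Tbps): 128 ports of 800G, or 64 of 1.6T
print(port_count(102.4, 800))   # 128
print(port_count(102.4, 1600))  # 64
```

The doubling from 51.2 to 102.4 Tbps is what lets a single Tomahawk 6-class device replace two previous-generation chips at the same port speed, which is one driver of the energy-savings claims in announcements like this one.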
"Our continued focus on innovation and deep customer collaboration has driven the development of highly customized JDM solutions for the largest operators and turnkey platforms for NeoClouds," said Anshul Sadana, Founder and CEO of Nexthop AI.
Nexthop AI's funding and product launch reflect growing demand for new network architectures in AI cluster design. As GPU cluster sizes increase, hyperscale operators are increasingly exploring cross-scale and disaggregated network designs to manage power consumption and traffic. Nexthop's emphasis on open network software and collaboration with hyperscalers mirrors broader industry trends in AI infrastructure: suppliers such as Broadcom, NVIDIA, and Arista Networks have also launched platforms supporting high-speed Ethernet fabrics.









