Graphcore in Supermicro servers



The AS-1124US-TNRP, the first of Supermicro’s highest-performance, enterprise-class Ultra servers to be approved for use in IPU-POD systems, features the latest third-generation AMD EPYC™ processors.

As part of a Graphcore IPU-POD, Supermicro Ultra servers will help innovators push the boundaries of machine intelligence, developing and deploying state-of-the-art models, as well as accelerating today’s most widely used AI applications.

“Supermicro is one of the most trusted names in the business and its high-performance Ultra servers are the perfect complement to Graphcore’s made-for-AI Intelligence Processing Unit and scale-out systems,” says Graphcore’s Tom Wilson.

IPU-M2000s and host servers within an IPU-POD system can be configured in different ratios, helping to optimise TCO around the varying server requirements of specific machine intelligence workloads. Server-intensive applications such as computer vision can benefit from a higher server-to-IPU ratio than natural language processing workloads, for example.

“Graphcore has really thought through the architecture of its IPU-POD data center systems, and how to get the best out of different servers for different AI workloads,” said Raju Penumatcha, SVP and Chief Product Officer at Supermicro.

“We expect that Supermicro’s Ultra and other servers, used in conjunction with features like the variable IPU-to-server ratio, will deliver incredible results.”

IPU-POD is Graphcore’s scale-out machine intelligence solution, based around multiple instances of the IPU-M2000 – the 1 PetaFlop, 1U data center AI blade – plus a range of approved host servers and switches.

IPU-Fabric provides high-bandwidth connectivity, with compiled communications and compute managed by the Poplar software platform. IPU-PODs are currently available as POD4, POD16 and POD64.
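
To give a flavour of the software side, the sketch below is a minimal, illustrative example using PopTorch, the PyTorch front end to Graphcore’s Poplar SDK, showing how a model could be replicated across several IPUs in a POD. The model, batch sizes and replication factor are assumptions for illustration, not details from the announcement.

```python
# Minimal, illustrative PopTorch sketch (not from the article): replicate a
# small PyTorch model across several IPUs. Sizes and factors are assumptions.
import torch
import poptorch

class TinyClassifier(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(784, 256),
            torch.nn.ReLU(),
            torch.nn.Linear(256, 10),
        )

    def forward(self, x):
        return self.net(x)

opts = poptorch.Options()
opts.replicationFactor(4)   # run 4 copies of the model on 4 IPUs (illustrative)
opts.deviceIterations(16)   # process 16 batches per host/device interaction

# Compile the model for IPU and run one combined batch:
# rows = replicas (4) x device iterations (16) x micro-batch (8) = 512
model = poptorch.inferenceModel(TinyClassifier(), opts)
out = model(torch.randn(4 * 16 * 8, 784))
print(out.shape)
```

Poplar compiles both the compute and the inter-IPU communication ahead of time, which is what the “compiled communications” above refers to.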

The IPU-POD16 Direct Attach (DA), which features 4 IPU-M2000s directly attached to a host server, is ideal for AI engineers getting started with IPU evaluation, proof-of-concept development and pilots. The IPU-POD64 features 16 IPU-M2000s, 1-4 host servers and 2 switches, and is ideal for scaling out larger models and for production workloads.
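
As a rough back-of-the-envelope illustration, the aggregate AI compute of each configuration follows directly from the 1 PetaFlop per IPU-M2000 figure above. The single-M2000 IPU-POD4 figure below is an assumption for illustration; the POD16 and POD64 counts are those quoted in the article.

```python
# Back-of-the-envelope aggregate compute per IPU-POD configuration.
# IPU-M2000 counts for POD16 DA and POD64 come from the article;
# the single-M2000 POD4 figure is an assumption for illustration.
PFLOPS_PER_M2000 = 1  # each IPU-M2000 is a 1 PetaFlop, 1U blade

pods = {
    "IPU-POD4": 1,       # assumed: one IPU-M2000 attached to a host
    "IPU-POD16 DA": 4,   # 4 IPU-M2000s directly attached to a host server
    "IPU-POD64": 16,     # 16 IPU-M2000s, 1-4 host servers, 2 switches
}

for name, count in pods.items():
    print(f"{name}: {count} x IPU-M2000 = {count * PFLOPS_PER_M2000} PetaFlops")
```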

2021-03-25

David Manners