Arm-based servers have received a massive endorsement from industry benchmarking group MLCommons, acing the group’s MLPerf inference benchmarks in the data centre category.
The latest benchmarks show that NVIDIA GPU-powered Arm-based servers using Ampere Altra CPUs deliver near-equal performance to similarly configured x86-based servers for AI inference jobs — even outperforming a similar x86 system in one of the tests.
“The latest inference results demonstrate the readiness of Arm systems, powered by Arm-based CPUs and NVIDIA GPUs, for tackling a broad array of AI workloads in the data centre,” said David Lecomber, Senior Director of HPC and tools at Arm.
NVIDIA AI-powered systems topped all seven inference performance tests in the latest round. The submissions included systems from NVIDIA and partners Alibaba, Dell Technologies, Fujitsu, Gigabyte, Hewlett Packard Enterprise, Inspur, Lenovo, Nettrix, and Supermicro.
MLPerf’s inference benchmarks are based on today’s most popular AI workloads and scenarios, covering computer vision, medical imaging, natural language processing, speech recognition, recommendation systems, and more.
They give users a reliable gauge of an AI system’s performance, helping them make more informed buying decisions.
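For readers new to inference benchmarking, the sketch below illustrates, in plain Python with a dummy stand-in for a model, the two headline numbers such tests typically report: per-query latency and offline throughput. It is a simplified illustration only, not the official MLPerf harness (which uses MLCommons LoadGen) or any of its workloads.

```python
# Simplified illustration only -- NOT the MLPerf LoadGen harness.
# Shows the two quantities an inference benchmark reports:
# single-query latency and offline (batch) throughput, using a dummy "model".
import time
import statistics

def dummy_model(batch):
    # Stand-in for a real network: just burn a little CPU per sample.
    return [sum(i * i for i in range(5_000)) for _ in batch]

def measure_latency(n_queries=50):
    # Single-stream style: one query at a time, report the median latency.
    latencies = []
    for _ in range(n_queries):
        start = time.perf_counter()
        dummy_model([0])  # batch of one
        latencies.append(time.perf_counter() - start)
    return statistics.median(latencies)

def measure_throughput(n_samples=512, batch_size=32):
    # Offline style: feed large batches and report samples per second.
    start = time.perf_counter()
    for _ in range(0, n_samples, batch_size):
        dummy_model(range(batch_size))
    return n_samples / (time.perf_counter() - start)

if __name__ == "__main__":
    print(f"median latency: {measure_latency() * 1000:.2f} ms")
    print(f"offline throughput: {measure_throughput():.1f} samples/s")
```

The real MLPerf suite formalises these measurements across fixed models, datasets, accuracy targets, and serving scenarios so that results are comparable between vendors.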
The benchmarks are backed by industry leaders including Alibaba, Arm, Baidu, Google, Intel, and NVIDIA, which helps keep the tests transparent and objective.