Palo Alto Networks develops firewall using NVIDIA DPU

Palo Alto Networks has introduced the first virtual next-generation firewall (NGFW) to be accelerated by a DPU, aimed at boosting the network defence of data centres.

NVIDIA’s BlueField data processing unit (DPU) accelerates packet filtering and forwarding by offloading traffic from the host processor to dedicated hardware that sits apart from the server CPU. This lets the solution be deployed on servers without compromising network performance.

Network flows that were previously impossible or impractical to inspect can now be handled by screening only the relevant parts of each flow on the host firewall and offloading the rest to the DPU.
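That inspect-then-offload pattern can be pictured with a short, hypothetical sketch: the host firewall examines only the first packets of a flow, and once a verdict is reached the flow is handed to a hardware fast path on the DPU. The names used here (Flow, INSPECT_PACKETS, offload_table) are illustrative assumptions, not part of any Palo Alto Networks or NVIDIA API.

```python
# Hypothetical sketch of the inspect-then-offload pattern described above.
# These names are illustrative only; they do not come from the Palo Alto
# Networks VM-Series or NVIDIA DOCA interfaces.
from dataclasses import dataclass

INSPECT_PACKETS = 8  # assume the first few packets are enough to classify a flow

@dataclass
class Flow:
    five_tuple: tuple           # (src_ip, dst_ip, src_port, dst_port, proto)
    packets_seen: int = 0
    verdict: str | None = None  # "allow"/"deny" once classified, None before that

offload_table: dict[tuple, str] = {}  # flows the DPU forwards without host involvement

def handle_packet(flow: Flow, payload: bytes) -> str:
    """Return where the packet is processed: 'host' or 'dpu'."""
    if flow.five_tuple in offload_table:
        return "dpu"                      # hardware fast path, no host-CPU cycles
    flow.packets_seen += 1
    # ... full NGFW inspection of this packet would happen on the host here ...
    if flow.packets_seen >= INSPECT_PACKETS:
        flow.verdict = "allow"            # classification finished in this example
        offload_table[flow.five_tuple] = flow.verdict  # push the rule down to the DPU
    return "host"

if __name__ == "__main__":
    f = Flow(("10.0.0.1", "10.0.0.2", 44321, 443, "tcp"))
    decisions = [handle_packet(f, b"...") for _ in range(12)]
    print(decisions)  # the first 8 packets hit the host, the rest take the DPU fast path
```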

Palo Alto Networks' VM-Series NGFW adopts zero trust network security principles. The DPU acts as an intelligent network filter to parse, classify and steer traffic flows with zero CPU overhead, which enables the NGFW to support close to 100 Gb/s throughput for typical use cases. This is a 5x performance boost versus running the VM-Series firewall on a CPU alone, and up to 150 percent capex savings compared to legacy hardware.

“As enterprises and telcos build cloud-like data centres, they need the agility and automation of the cloud without compromising performance. Together with NVIDIA, we are turbocharging our VM-Series virtual ML-powered NGFWs,” said Muninder Singh Sambi, Senior Vice President of Products at Palo Alto Networks.

Offloading security in collaborative research applications

In another development, Monash University, Australian Research Data Commons (ARDC) and NVIDIA have come together to explore the role DPUs play in offloading security in collaborative research applications.

Monash hosts 10 NVIDIA BlueField-2 DPUs in its private Research Cloud, which is part of the ARDC Nectar Research Cloud, funded under the National Collaborative Research Infrastructure Strategy.

The partners are looking at offloading micro-segmentation onto DPUs, removing the burden of added security processing from CPUs, GPUs and other security appliances.

“Micro-segmenting per-research application would ultimately enable specific datasets to be controlled tightly (more appropriately firewalled) and actively and deeply monitored, as the data traverses a researcher’s computer, edge devices, safe havens, storage, clouds, and HPC,” said Steve Quenette, Deputy Director of the Monash eResearch Centre and lead of this project.

“By offloading technology and processes to achieve security, the shadow-cost of security is minimised, while increasing the transparency and controls of each organisation’s SOC. It is a win-win to all parties involved,” he added.
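Micro-segmentation of this kind can be pictured as a per-application allow-list enforced at the network edge rather than at a central appliance. The sketch below is a hypothetical illustration only; the policy format, application names and addresses are assumptions made for this example, not the project's actual configuration.

```python
# Hypothetical illustration of per-research-application micro-segmentation as an
# allow-list evaluated at the network edge (standing in for a DPU). All names and
# addresses here are assumptions for the sketch.
import ipaddress

# Each research application gets its own tightly scoped policy: which peers may
# reach it, on which ports, and for which dataset.
POLICIES = {
    "genomics-pipeline": {
        "allowed_peers": {"10.10.1.0/24"},   # that project's compute nodes only
        "allowed_ports": {22, 443},
        "dataset": "restricted-genomes-v2",
    },
    "climate-model": {
        "allowed_peers": {"10.20.5.0/24"},
        "allowed_ports": {443},
        "dataset": "public-climate-obs",
    },
}

def permitted(app: str, src_ip: str, dst_port: int) -> bool:
    """Decide whether traffic destined for `app` is allowed by its segment policy."""
    policy = POLICIES.get(app)
    if policy is None:
        return False  # default deny: unknown applications get nothing
    addr = ipaddress.ip_address(src_ip)
    in_segment = any(addr in ipaddress.ip_network(net) for net in policy["allowed_peers"])
    return in_segment and dst_port in policy["allowed_ports"]

if __name__ == "__main__":
    print(permitted("genomics-pipeline", "10.10.1.7", 443))  # True: inside the segment
    print(permitted("genomics-pipeline", "192.0.2.9", 443))  # False: outside the segment
```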
