Shifting the DPU paradigm

In October 2020, NVIDIA introduced the DOCA-supported data processing unit (DPU), a data-centre-infrastructure-on-a-chip architecture that enables breakthrough networking, storage and security performance.

Australia’s Monash University is one of the early adopters testing the DPU. Entelechy Asia interviews Dr Steve Quenette, Deputy Director of the Monash eResearch Centre at Monash University, to get his insights on the DPU paradigm shift.

We understand that NVIDIA is partnering with Monash University and Australian Research Data Commons (ARDC) to explore offloading security onto DPUs. Can you share with us the objectives and achievements since the announcement last year?

Quenette: Our objectives are applicable to all market sectors and range from installing and operating DPUs in a test environment, to exploring and communicating our experiences of primitive DPU functions such as encryption in research applications, to exploring and communicating our experiences in using numerous third-party cybersecurity tools on DPUs. We are well past the test phase and are deep into the latter two objectives. We are also exploring our own bespoke offload work using Morpheus, and HPC-oriented offloads for ConnectX/RoCE-based fabrics.

Through ARDC, we are ensuring these learnings inform not only the future digital research infrastructure landscape in Australia but also similar research infrastructures across the globe.

We ultimately want to help NVIDIA make DPUs extremely accessible.

Why do we need the microsegmentation-per-research application concept in today’s technological landscape?

Quenette: The risk tolerance of data wholly owned and considered private to an organisation is completely different to the risk tolerance of data that underpins a research collaboration between organisations. We have hundreds to thousands of research use cases in flight at any one time. We want technology that can cater to all of those use cases, which means we have to treat each research application as if it were an organisation. That is, even if one research application is compromised, from the attacker’s viewpoint they are no closer to compromising any of the other research applications or the organisation itself. This is really important, as research applications, especially those that involve people in society, exist both inside and outside of the corporate boundary.
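
To make the microsegmentation-per-research-application idea concrete, here is a minimal sketch (in Python, with purely illustrative names, not an actual Monash or NVIDIA DOCA API) of the default-deny posture Quenette describes: each research application forms its own trust boundary, and traffic between workloads is permitted only within a single application.

```python
# Illustrative sketch of microsegmentation per research application.
# Each research application is treated as its own "organisation":
# traffic is denied by default unless both endpoints belong to the
# same application. Names and addresses are hypothetical.

from dataclasses import dataclass


@dataclass(frozen=True)
class Workload:
    host: str
    application: str  # the research application (segment) this workload serves


def is_allowed(src: Workload, dst: Workload) -> bool:
    """Default-deny: permit traffic only within one research application."""
    return src.application == dst.application


# Two collaborations sharing the same underlying infrastructure.
genomics_vm = Workload(host="10.0.1.10", application="genomics-study")
genomics_db = Workload(host="10.0.1.20", application="genomics-study")
imaging_vm = Workload(host="10.0.1.30", application="imaging-trial")

assert is_allowed(genomics_vm, genomics_db)      # same collaboration: allowed
assert not is_allowed(genomics_vm, imaging_vm)   # different collaborations: denied
```

In the paradigm discussed here, a policy of this shape would be enforced on the DPU rather than in the host or the application, so compromising one research application yields no path into the others.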

A single BlueField-2 DPU can deliver the same data centre services that could consume up to 125 CPU cores.

The paradigm shift proposed in the partnership focuses on bringing security down to the level of operating systems and applications. Does that mean that today’s systems are not delivering enough security to today’s organisations?

Quenette: The questions you are asking are: “Do we want our application writers to become cybersecurity experts?” and “Is the application inside or outside of the corporate boundary, or both?” For applications solely within corporate boundaries, the world is rather sorted. For those bespoke applications that provide some sort of hero task for the organisation, a halo is needed to protect that hero. Such a halo should be invisible and not weigh the hero down. The hard and soft technology to create such halos is only now coming into existence.

If you don’t have hero projects, that’s fine, but I imagine many will question your innovation environment. This is really about organisations being ambidextrous, with a more appropriate paradigm for the innovation side. Universities are a great living laboratory in which to experiment with and tune the paradigm.

How do NVIDIA technologies help in realising the envisioned paradigm shift?

Quenette: The DPU is where all the halo work, that is, the cybersecurity processing necessary to protect data spanning many owners and infrastructures across many providers, can occur in a manner that is invisible to the application and/or the application developer.

How will the paradigm shift affect individuals, given that the pandemic forced organisations to shift to work-from-anywhere (WFA) setups?

Quenette: I am not an expert on WFA. My sense is the scale of WFA is comparable to the number of research collaborations. However, in the research case, the dataset being protected is arguably unique per collaboration, whereas in the WFA case all the interaction is with one central software system. I think you will find that monolithic corporate boundaries will begin to transition to a regime of finer pieces, each with their own risk tolerance (and value to the organisation), and fewer permitted users, regardless of where they are in the world.