Hitachi Vantara and physical AI

We had the opportunity to interview Jason Hardy, Hitachi Vantara’s CTO for AI. The conversation started by looking at VSP One and disaggregation, and ended up talking about AI agent-infused robots: physical AI. Read on to see how we got there.

Blocks and Files: Hitachi Vantara’s VSP One storage line is a classic controllers-integrated-with-storage-shelves architecture, and this contrasts with the disaggregated compute and storage nodes exemplified by VAST Data’s DASE (Disaggregated Shared Everything) architecture. That approach has been taken up by HPE with its Alletra MP 10000 line, by Pure with its FlashBlade//EXA, and by NetApp with its AI-focussed ONTAP architecture revision. Is this storage array technology direction of interest to Hitachi Vantara?

Jason Hardy: We are definitely thinking about those platforms – what VAST has created, and even what Pure has done a bit of now.

What does that actually mean, and how can things like VSP One Block benefit from it? What does NVMe over fabrics look like for something like that? So it’s about utilising what we’re really good at and then innovating forward from there.

It’s something that, when we look at the entire market, what’s trending, and what demand AI is going to push with all these highly parallelised workloads, is how can we create something of value using our foundation, while really delivering on what customers need, especially at scale.

Blocks and Files: So it’s conceivable to think of a midrange VSP One Block piece of hardware at the moment being separated out into controller nodes and storage nodes?

Jason Hardy: I’d actually take it one step further. What VSP One Block is really good at is providing that block capability. But a lot of these workloads, especially from an AI perspective, require a file system – scale-out file system functionality. So from my standpoint, VSP One Block plays a big piece of that, but it’s just one piece, because of object, and how object is becoming a primary platform, an active contributor to the AI ecosystem.

So how do we bring … that in? And then also looking at how, while block plays a core role, we still need file system functionality and things like that. On top of that, it turns out most GPU servers don’t have Fibre Channel ports, so obviously NVMe over TCP is an option there. But really, when you start looking at multiple compute requirements, you have the need to share data, and having just one affinity to a server in the traditional sense doesn’t really benefit the pool of resources that customers are purchasing and growing into to support a lot of their AI demands. So from our perspective, block plays a big piece of it in the midrange, but there’s a lot more to it.
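For readers who haven’t seen it, attaching a Linux GPU server to a block array over NVMe/TCP looks roughly like the sketch below, here driven from Python via the standard nvme-cli tool. The target address, port, and NQN are placeholders for illustration, not VSP One specifics.

```python
# Rough sketch: attaching a Linux host to an NVMe/TCP block target.
# Uses the standard nvme-cli tool; the address and NQN are placeholders.
import subprocess

TRADDR = "192.0.2.10"                    # storage controller IP (illustrative)
NQN = "nqn.2024-01.com.example:block1"   # subsystem NQN (illustrative)

# Discover the subsystems the target advertises over TCP
subprocess.run(["nvme", "discover", "-t", "tcp", "-a", TRADDR, "-s", "4420"], check=True)

# Connect; the namespace then shows up as a normal /dev/nvmeXnY block device
subprocess.run(["nvme", "connect", "-t", "tcp", "-a", TRADDR, "-s", "4420", "-n", NQN], check=True)

# Confirm the new namespace is attached
subprocess.run(["nvme", "list"], check=True)
```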

Blocks and Files: Hitachi Vantara supports GPUDirect with VSP One, and that’s used in its broader Hitachi iQ AI infrastructure platform. Will Hitachi extend this protocol support?

Jason Hardy: S3 over RDMA, or GPUDirect for S3, is another thing. … That’s something that is a very high priority for us as we continue to work with Nvidia from a partnership perspective. We’re on our third generation of VSP One Object, and we were the first to release Iceberg tables in an on-prem object store. No one else has that functionality in the object store.
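To make the Iceberg point concrete: with tables exposed through an object store’s catalog, an analytics or AI pipeline can query them directly over S3. Below is a minimal, hypothetical sketch using the open source PyIceberg client; the catalog URI, endpoint, credentials, and table name are all invented for illustration and are not Hitachi-specific.

```python
# Hypothetical sketch: reading an Iceberg table held in an S3-compatible
# object store via a REST catalog, using the open source PyIceberg client.
# Every URI, credential, and table name here is illustrative.
from pyiceberg.catalog import load_catalog

catalog = load_catalog(
    "default",
    **{
        "uri": "http://catalog.example.internal:8181",          # REST catalog (assumed)
        "s3.endpoint": "http://objectstore.example.internal:9000",
        "s3.access-key-id": "EXAMPLE_KEY",
        "s3.secret-access-key": "EXAMPLE_SECRET",
    },
)

table = catalog.load_table("telemetry.sensor_readings")  # hypothetical table

# The filter is pushed down so only matching data files are read from the store
arrow_table = table.scan(row_filter="status == 'damaged'").to_arrow()
print(arrow_table.num_rows)
```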

Blocks and Files: Do you see Hitachi Vantara’s main interest in AI being inference rather than training?

Jason Hardy: For us it’s all aspects of it, and that’s because it’s important. Obviously training was a big push when generative AI came out, and that was creating the large language models. AI still requires a lot of LLMs, but it’s going to require SLMs as another piece of it.

Blocks and Files: Small language models.

Jason Hardy: Exactly. And then we move into the physical AI realm, which is the next wave after agentic AI. We’re heavily focused on physical AI because that is what Hitachi Limited is a big part of. That was what our latest announcement was about: how we’re building out and creating physical AI capabilities to support our own internal industrial business units, as well as what our customers are asking for.

Physical AI is really going to transform a lot of the manufacturing space. It’s more than just self-driving cars.

Blocks and Files: How do you see it transforming things there?

Jason Hardy: If we pick out the manufacturing robot process, a lot of that is based on basic computer vision capabilities: hey, I can sort something and understand if this is a type A widget, this is a type B, or this is a damaged widget and this is an undamaged widget, and then work inside of that. What physical AI is going to do is blend in the agentic capabilities – basically giving these systems agency in the physical space – so that it’s more than just having a picker. It’s having humanoid-type platforms.
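The “basic computer vision” stage Hardy describes is, in essence, a small image classifier. As a hedged illustration only, here is what a three-way widget sorter might look like in Python with torchvision; the class labels and checkpoint path are invented for the example.

```python
# Illustrative sketch of a three-way widget classifier (type A / type B /
# damaged), of the kind used on sorting lines. Labels and checkpoint path
# are hypothetical.
import torch
from torchvision import models, transforms
from PIL import Image

CLASSES = ["widget_type_a", "widget_type_b", "damaged_widget"]  # hypothetical

# Standard ImageNet-style preprocessing for a ResNet backbone
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = models.resnet18(weights=None)
model.fc = torch.nn.Linear(model.fc.in_features, len(CLASSES))
# Assumes a fine-tuned checkpoint exists; the path is illustrative only.
model.load_state_dict(torch.load("widget_classifier.pt", map_location="cpu"))
model.eval()

def classify(path: str) -> str:
    """Return the predicted widget class for one camera frame."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        logits = model(x)
    return CLASSES[int(logits.argmax(dim=1))]

print(classify("frame_0001.jpg"))  # e.g. "damaged_widget"
```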

Blocks and Files: I see. It’s like this: current industrial robots are great at sticking a screw in a car chassis, doing that 10,000 times a day. But they can’t put a nut on the end of it.

Jason Hardy: Correct. And what happens now is we’re blending a lot of this together so that they are more aware of their surroundings and of what they operate inside. You can give them more human-like dexterity. But most importantly, you can train them in virtual space with photorealistic surroundings. That’s very important, because the robot brain you train there can then be put into a physical robot and have the ability to understand its surroundings and its environment, handle very complex tasks, and adapt as it needs to work through those tasks.

Blocks and Files: Okay. So let’s say I’ve got a car production line at the moment with 20 robots doing various things. If you wanted to change over to a new model, or to a variation on the existing model, you’re probably going to have to do a software upgrade to all the computers behind it, whereas what you’re saying is …

Jason Hardy: The robotics will have to change. An articulating arm, like you said, can put a screw in. That’s a robot that does one thing, and it’s been designed and built to do that one thing. Now, how do we take that further? There will be a transformation in the robotics that are incorporated as well. Much as manufacturing processes are retrofitted or improved as they go through normal maintenance, or through expansion to increase capacity, there will also be investment in new robotics that come in – humanoid-style robotics that have a bit more mobility than just a picker or an articulating arm driving a single screw.

Blocks and Files: I could think of, perhaps, a PC motherboard. An assembly line for it is staffed by human workers: a bare slab of a board goes in at one end and out of the back comes the motherboard. People have picked up components from trays, oriented them, oriented the motherboard, put them in place, and soldered them. You have a line of these people gradually adding more and more components to the motherboard. I could envisage a smart robot doing all those things.

Jason Hardy: Exactly – at lightning speed, or at an accelerated pace. And 24 hours a day, of course.

And they don’t get tired.

Blocks and Files: And is this real? Have you got prototypes of this kind of thing?

Jason Hardy: We’re in the middle now of doing a lot on our manufacturing process to get to that point. So we’ve been designing assembly lines for a very long time, for our own purposes, for other people and for our customers. So more to come.

Blocks and Files: If I think of an average assembly line at the moment, with humans doing the staged things I’ve talked about, what would be the advantages of replacing that with a robotic system? What kind of things would excite manufacturers?

Jason Hardy: Higher capacity, higher output, more products per hour and a higher quality output. … The yield rate will be higher and you will have fewer failures through physical installation problems. You are now mitigating a lot of those human-created problems. And that’s just one piece of it, but it is improved efficiency. It is that 24 x 7 cycle.

What you’re also walking into now is the co-working of humanoids and humans saying, for example: “Hey, I’m new to the building. Where’s lunch at?” Or: “Hey, help me mail something.”

You can interface with a robot that understands you – the employee or the person – and its surroundings, and it can guide you to a location or help you with a task. So it’s a lot about taking the autonomy that agentic AI is creating now into physical AI, into the physical space, wrapping that inside a package – like a robot – and then having that fulfil a task for you: being an assistant, a helper, or a piece of your manufacturing line.

Blocks and Files: No other storage company I talk to has been going on about this topic, but then no other storage company I talk to is a major part of the Hitachi group.

Jason Hardy: Exactly. And we have a very different perspective on this because our business is all of this. It is everything from manufacturing a raw product all the way out to the IT systems and the digital transformation that these raw product manufacturers have to go through. Our own processes around how we manufacture trains or energy components are going through this transformation. So we are literally customer zero, where we eat our own dog food, or drink our own champagne.

That’s why we’re investing so much into it, because we ourselves see a lot of value from this, and our customers will obviously benefit from that as well.