[lustre-discuss] Multiple IB interfaces

Alastair Basden a.g.basden at durham.ac.uk
Tue Mar 9 07:18:35 PST 2021


We are installing some new Lustre servers with 2 InfiniBand cards, 1 
attached to each CPU socket.  Storage is NVMe, again with some drives 
attached to each socket.

We want to ensure that data to/from each drive uses the appropriate IB 
card, and doesn't have to travel over the inter-CPU link.  Writes are 
fairly easy, I think: we just assign each OST the appropriate IP address. 
However, data being read may well go out the other NIC, if I understand 
correctly.
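For context, the sort of configuration we had in mind puts each HCA on its 
own LNet network (the interface names ib0/ib1 and network names 
o2ib0/o2ib1 below are just our guesses; we haven't tested this):

```shell
# Sketch only -- interface and network names are assumptions.
#
# Static form, in /etc/modprobe.d/lustre.conf:
#   options lnet networks="o2ib0(ib0),o2ib1(ib1)"
#
# Dynamic equivalent with lnetctl:
lnetctl lnet configure
lnetctl net add --net o2ib0 --if ib0   # HCA attached to socket 0
lnetctl net add --net o2ib1 --if ib1   # HCA attached to socket 1
lnetctl net show                       # check the resulting NIDs
```

But it isn't clear to us whether that alone keeps read traffic on the NIC 
local to the socket the drive hangs off.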

What setup do we need for this?

Probably not bonding, I think, since that presumably won't tie NIC 
interfaces to CPUs.  But I also see a note in the Lustre manual:

"""If the server has multiple interfaces on the same subnet, the Linux 
kernel will send all traffic using the first configured interface. This is 
a limitation of Linux, not Lustre. In this case, network interface bonding 
should be used. For more information about network interface bonding, see 
Chapter 7, Setting Up Network Interface Bonding."""

(Plus, I have no idea whether bonding is even supported on InfiniBand.)

