Channel: Mellanox Interconnect Community: Message List

ibping fails between Mellanox MT26428 and QLogic QDR on CentOS 6.5


Hi All,

I have a mix of Mellanox MT26428 and QLogic IBA7322 adapters, all connected to a QLogic 12300 switch.

Everything used to work fine with CentOS 6.4. After the upgrade to CentOS 6.5, ibping between the Mellanox and QLogic adapters stopped working, while it still works fine on the QLogic-QLogic and Mellanox-Mellanox paths.
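Since this broke with the OS upgrade, one thing worth recording is which pieces of the IB stack actually changed between 6.4 and 6.5. A quick sketch (the package and module names below are the usual RHEL6-family ones and may differ on a given install):

```shell
# List IB userspace package and kernel module versions on this host;
# run on both a CentOS 6.4 and a CentOS 6.5 machine and diff the output.
# Package/module names are illustrative for RHEL6-family installs.
for pkg in infiniband-diags libibmad libibumad rdma; do
  rpm -q "$pkg" 2>/dev/null || echo "$pkg: not installed/not queried"
done
modinfo ib_qib mlx4_ib 2>/dev/null | grep -i '^version' || echo "module version info unavailable"
uname -r
```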

Below are the details for the two hosts. Any hint would be much appreciated.

Kind Regards,

  Daniele.

 

HOST1:

[root@hpc-200-06-13-a ~]# lspci | grep Mellanox
02:00.0 InfiniBand: Mellanox Technologies MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE] (rev b0)
[root@hpc-200-06-13-a ~]# ibnodes
Ca      : 0x003048fffff49b1c ports 1 "hpc-200-06-13-b HCA-1"
Ca      : 0x0011750000708278 ports 1 "hpc-200-06-09 HCA-1"
Ca      : 0x00117500007080ca ports 1 "hpc-200-06-11 HCA-1"
Ca      : 0x0011750000702988 ports 1 "hpc-200-06-08 HCA-1"
Ca      : 0x00117500007028c4 ports 1 "hpc-200-06-07 HCA-1"
Ca      : 0x0011750000705360 ports 1 "hpc-200-06-06 HCA-2"
Ca      : 0x0011750000702966 ports 1 "hpc-200-06-05 HCA-2"
Ca      : 0x0011750000706e52 ports 1 "hpc-200-06-04 HCA-1"
Ca      : 0x0011750000702dfc ports 1 "hpc-200-06-03 HCA-1"
Ca      : 0x00117500007041a6 ports 1 "hpc-200-06-02 HCA-1"
Ca      : 0x003048fffff499c4 ports 1 "hpc-200-06-13-a HCA-1"
Switch  : 0x00066a00e3005938 ports 36 "QLogic 12300 GUID=0x00066a00e3005938" enhanced port 0 lid 1 lmc 0
[root@hpc-200-06-13-a ~]# ibstat
CA 'mlx4_0'
        CA type: MT26428
        Number of ports: 1
        Firmware version: 2.7.200
        Hardware version: b0
        Node GUID: 0x003048fffff499c4
        System image GUID: 0x003048fffff499c7
        Port 1:
                State: Active
                Physical state: LinkUp
                Rate: 40
                Base lid: 12
                LMC: 0
                SM lid: 1
                Capability mask: 0x02510868
                Port GUID: 0x003048fffff499c5
                Link layer: InfiniBand

 

HOST2:

(only qib0 is connected to the switch)

[root@hpc-200-06-05 ~]# lspci | grep -i infi
01:00.0 InfiniBand: Mellanox Technologies MT27600 [Connect-IB]
03:00.0 InfiniBand: QLogic Corp. IBA7322 QDR InfiniBand HCA (rev 02)
[root@hpc-200-06-05 ~]# ibstat
CA 'mlx5_0'
        CA type: MT4113
        Number of ports: 2
        Firmware version: 10.10.1000
        Hardware version: 0
        Node GUID: 0xf4521403001b4b40
        System image GUID: 0xf4521403001b4b40
        Port 1:
                State: Active
                Physical state: LinkUp
                Rate: 56
                Base lid: 1
                LMC: 0
                SM lid: 1
                Capability mask: 0x0651484a
                Port GUID: 0xf4521403001b4b40
                Link layer: InfiniBand
        Port 2:
                State: Initializing
                Physical state: LinkUp
                Rate: 56
                Base lid: 65535
                LMC: 0
                SM lid: 0
                Capability mask: 0x06514848
                Port GUID: 0xf4521403001b4b48
                Link layer: InfiniBand
CA 'qib0'
        CA type: InfiniPath_QLE7340
        Number of ports: 1
        Firmware version:
        Hardware version: 2
        Node GUID: 0x0011750000702966
        System image GUID: 0x0011750000702966
        Port 1:
                State: Active
                Physical state: LinkUp
                Rate: 40
                Base lid: 4
                LMC: 0
                SM lid: 1
                Capability mask: 0x07610868
                Port GUID: 0x0011750000702966
                Link layer: InfiniBand

[root@hpc-200-06-05 ~]# ibnodes -C qib0
Ca      : 0x003048fffff49b1c ports 1 "hpc-200-06-13-b HCA-1"
Ca      : 0x003048fffff499c4 ports 1 "hpc-200-06-13-a HCA-1"
Ca      : 0x0011750000708278 ports 1 "hpc-200-06-09 HCA-1"
Ca      : 0x00117500007080ca ports 1 "hpc-200-06-11 HCA-1"
Ca      : 0x0011750000706e52 ports 1 "hpc-200-06-04 HCA-1"
Ca      : 0x0011750000702988 ports 1 "hpc-200-06-08 HCA-1"
Ca      : 0x00117500007028c4 ports 1 "hpc-200-06-07 HCA-1"
Ca      : 0x0011750000705360 ports 1 "hpc-200-06-06 HCA-2"
Ca      : 0x0011750000702dfc ports 1 "hpc-200-06-03 HCA-1"
Ca      : 0x00117500007041a6 ports 1 "hpc-200-06-02 HCA-1"
Ca      : 0x0011750000702966 ports 1 "hpc-200-06-05 HCA-2"
Switch  : 0x00066a00e3005938 ports 36 "QLogic 12300 GUID=0x00066a00e3005938" enhanced port 0 lid 1 lmc 0

 

Starting the server:

[root@hpc-200-06-05 ~]# ibping -v -d -S -G 0x0011750000702966
ibdebug: [31145] ibping_serv: starting to serve...

 

 

Launching the ping:

[root@hpc-200-06-13-a ~]# ibping -v -d -G 0x0011750000702966
ibwarn: [12545] sa_rpc_call: attr 0x35 mod 0x0 route Lid 1
ibwarn: [12545] mad_rpc_rmpp: rmpp (nil) data 0x7ffff73b0aa0
ibwarn: [12545] mad_rpc_rmpp: data offs 56 sz 200
rmpp mad data
0000 0000 0000 0000 fe80 0000 0000 0000
0011 7500 0070 2966 fe80 0000 0000 0000
0030 48ff fff4 99c5 0004 000c 0000 0000
0080 ffff 0000 8487 8e00 0000 0000 0000
0000 0000 0000 0000 0000 0000 0000 0000
0000 0000 0000 0000 0000 0000 0000 0000
0000 0000 0000 0000 0000 0000 0000 0000
0000 0000 0000 0000 0000 0000 0000 0000
0000 0000 0000 0000 0000 0000 0000 0000
0000 0000 0000 0000 0000 0000 0000 0000
0000 0000 0000 0000 0000 0000 0000 0000
0000 0000 0000 0000 0000 0000 0000 0000
0000 0000 0000 0000
ibdebug: [12545] ibping: Ping..
ibwarn: [12545] ib_vendor_call_via: route Lid 4 data 0x7ffff73b0e30
ibwarn: [12545] ib_vendor_call_via: class 0x132 method 0x1 attr 0x0 mod 0x0 datasz 216 off 40 res_ex 1
ibwarn: [12545] mad_rpc_rmpp: rmpp (nil) data 0x7ffff73b0e30
ibwarn: [12545] mad_rpc_rmpp: MAD completed with error status 0xc; dport (Lid 4)
ibdebug: [12545] main: ibping to Lid 4 failed
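For what it's worth, the "error status 0xc" can be read against the common MAD status field layout, where bits 2-4 carry the invalid-field code (this is just my reading of the IBTA status table, not a confirmed diagnosis):

```shell
# Extract the invalid-field code (bits 2-4) from the MAD status 0xc seen above.
status=0x0c
code=$(( (status >> 2) & 0x7 ))
echo "invalid-field code: $code"   # prints: invalid-field code: 3
```

As I read the status table, code 3 is "method/attribute combination not supported", which would suggest the QLogic side is rejecting ibping's vendor class (0x132 in the trace above) rather than a fabric routing problem. If so, comparing the infiniband-diags versions on the two hosts, and trying a ping by LID (e.g. `ibping 4`) to rule out SA GUID resolution, might be reasonable next steps.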

