[lustre-discuss] old Lustre 2.8.0 panic'ing continuously

Andreas Dilger adilger at whamcloud.com
Fri Mar 13 02:47:23 PDT 2020

One thing to check, if you are not seeing any benefit from running e2fsck, is to
make sure you are using the latest e2fsprogs, 1.45.2.wc1.
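(A quick way to compare version strings like these is `sort -V`. The sketch below is illustrative only; the `e2fsck -V` query is shown as a comment because its exact output format is an assumption, and the example version value is made up.)

```shell
# Sketch: check whether the installed e2fsprogs is at least 1.45.2.wc1.
# ver_ge VER REQ -> succeeds if VER >= REQ in `sort -V` ordering.
ver_ge() {
    [ "$(printf '%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# The installed version could be queried with something like this
# (output format is an assumption; adjust the field as needed):
#   installed=$(e2fsck -V 2>&1 | awk 'NR==1 {print $2}')
installed="1.42.13.wc6"   # example value, for illustration only

if ver_ge "$installed" "1.45.2.wc1"; then
    echo "e2fsprogs $installed is new enough"
else
    echo "please upgrade: $installed is older than 1.45.2.wc1"
fi
```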

You could also try upgrading the server to Lustre 2.10.8.

Based on the kernel version, it looks like you are running RHEL6.7, which should
still work with 2.10 (the previous LTS branch), and 2.10 has a lot more fixes than 2.8.0.

Cheers, Andreas

On Mar 5, 2020, at 00:48, Torsten Harenberg <harenberg at physik.uni-wuppertal.de> wrote:

Dear all,

I know it's daring to ask for help with such an old system.

We still run a CentOS 6 based Lustre 2.8.0 system.

It's out of warranty and about to be replaced. The approval process for
the new grant took over a year, and we're currently preparing an EU-wide
tender; all of that has taken much more time than we expected.

The problem is:

One OSS server keeps running into a kernel panic. The panic seems to be
tied to one of the OSS mount points: if we mount that server's LUNs (all
data is on a 3PAR SAN) on a different server, that server starts
panic'ing, too.

We always run file system checks after such a panic, but these show only
minor issues of the kind you would expect after a crash, such as:

[QUOTA WARNING] Usage inconsistent for ID 2901:actual (757747712, 217)
!= expected (664182784, 215)
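If it helps to see which quota IDs e2fsck keeps flagging across runs, the IDs can be extracted from a saved fsck log. This is only a sketch: the sample log created below is illustrative, and in practice you would substitute the captured e2fsck output.

```shell
# Sketch: pull the numeric quota IDs out of e2fsck "[QUOTA WARNING]" lines.
# A sample fsck.log is created here for illustration; in practice,
# point the pipeline at the real captured e2fsck output instead.
cat > fsck.log <<'EOF'
[QUOTA WARNING] Usage inconsistent for ID 2901:actual (757747712, 217) != expected (664182784, 215)
[QUOTA WARNING] Usage inconsistent for ID 512:actual (1024, 1) != expected (2048, 2)
EOF

# List each affected ID once, numerically sorted.
grep 'QUOTA WARNING' fsck.log | sed -n 's/.*ID \([0-9][0-9]*\):.*/\1/p' | sort -un
```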

We would love to avoid an upgrade to CentOS 7 on these old machines,
but the crashes are happening very frequently now; yesterday it
panicked after only 30 minutes.

Now we're running out of ideas.

If anyone has an idea how we could identify the source of the problem,
we would really appreciate it.

Kind regards


Dr. Torsten Harenberg     harenberg at physik.uni-wuppertal.de
Bergische Universitaet
Fakultät 4 - Physik       Tel.: +49 (0)202 439-3521
Gaussstr. 20              Fax : +49 (0)202 439-2811
42097 Wuppertal

lustre-discuss mailing list
lustre-discuss at lists.lustre.org

Andreas Dilger
Principal Lustre Architect

