Central DTU HPC Cluster (LSF 10)


The central DTU HPC cluster is a general-purpose compute resource available to DTU staff and students.

The central cluster consists of several different node types:

24 x Lenovo ThinkSystem SD530

Each node is configured with:

  • 2x Intel Xeon Gold 6126 (12 cores, 2.60 GHz)
  • 192 GB memory (12 nodes), 384 GB (11 nodes), or 768 GB (1 node)
  • FDR InfiniBand
  • 10 Gbit Ethernet
  • 480 GB SSD

In total:

  • 576 Cores

4 x Lenovo ThinkSystem SD530

Each node is configured with:

  • 2x Intel Xeon Gold 6142 (16 cores, 2.60 GHz)
  • 384 GB memory
  • FDR InfiniBand
  • 10 Gbit Ethernet
  • 480 GB SSD

In total:

  • 128 Cores

1 x HPE DL385 Gen 10

This node is configured with:

  • 2x AMD EPYC 7551 (32 cores)
  • 512 GB memory
  • FDR InfiniBand
  • 480 GB SSD
  • Please use the queue name ‘epyc’ in your job script to run on this node

In total:

  • 64 Cores
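A minimal LSF job script for targeting this node could look like the sketch below. The job name, core count, memory request, and wall-clock limit are illustrative placeholders; only the ‘epyc’ queue name comes from the note above.

```shell
#!/bin/sh
### Submit to the 'epyc' queue to land on the AMD EPYC node
#BSUB -q epyc
### Job name (illustrative)
#BSUB -J epyc_test
### Number of cores (illustrative)
#BSUB -n 8
### Memory required per core (illustrative)
#BSUB -R "rusage[mem=4GB]"
### Wall-clock limit, hh:mm (illustrative)
#BSUB -W 01:00
### Output and error files (%J expands to the job ID)
#BSUB -o epyc_%J.out
#BSUB -e epyc_%J.err

# Replace with your own program
echo "Running on $(hostname)"
```

Submit the script with `bsub < jobscript.sh` and monitor it with `bjobs`.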

38 x Huawei XH620 V3

Each node is configured with:

  • 2x Intel Xeon E5-2650 v4 (12 cores, 2.20 GHz)
  • 256 GB memory
  • FDR InfiniBand
  • 480 GB SSD

In total:

  • 912 Cores

2 x Huawei XH620 V3

Each node is configured with:

  • 2x Intel Xeon E5-2650 v4 (12 cores, 2.20 GHz)
  • 512 GB memory
  • FDR InfiniBand
  • 480 GB SSD

In total:

  • 48 Cores

7 x Huawei XH620 V3

Each node is configured with:

  • 2x Intel Xeon E5-2660 v3 (10 cores, 2.60 GHz)
  • 128 GB memory
  • FDR InfiniBand
  • 1 TB SATA disk

In total:

  • 140 Cores

7 x IBM NeXtScale nx360 M4 nodes

Each node is configured with:

  • 2x Intel Xeon E5-2680 v2 (10 cores, 2.80 GHz, 25 MB L3 cache)
  • 128 GB memory
  • Scientific Linux 7.3
  • QDR InfiniBand interconnect
  • 500 GB internal SATA (7200 rpm) disk for OS and applications

In total:

  • 140 Cores