Melbourne Bioinformatics (formerly VLSCI) hardware

IBM iDataplex x86 system - Barcoo

  • Peak performance - compute nodes currently delivering 20 teraFLOPS, with each Xeon Phi card contributing a nominal 1 teraFLOPS
  • 1120 Intel Sandy Bridge compute cores running at 2.7 GHz.
  • 67 nodes with 256 GB RAM and 16 cores per node.
  • 3 nodes with 512 GB RAM and 16 cores per node.
  • 20 Xeon Phi 5110P cards installed across 10 nodes.
  • Connected to a high-speed, low-latency Mellanox FDR14 InfiniBand switch for inter-process communication.
  • The system runs Red Hat Enterprise Linux 6 (RHEL 6), a Linux distribution.

Lenovo NeXtScale x86 system - Snowy

  • Peak performance - compute nodes currently delivering 30 teraFLOPS
  • 992 Intel Xeon E5-2698 v3 cores running at 2.3 GHz
  • 2 nodes with 512 GB RAM and 32 cores per node
  • 29 nodes with 128 GB RAM and 32 cores per node
  • Connected to a high-speed, low-latency Mellanox FDR14 InfiniBand switch for inter-process communication.
  • The system runs Red Hat Enterprise Linux 6 (RHEL 6), a Linux distribution.
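The quoted teraFLOPS figures describe what the compute nodes currently deliver, not the theoretical hardware ceiling. As a rough cross-check, theoretical peak can be estimated as cores × clock × FLOPs per cycle. The sketch below assumes 8 double-precision FLOPs per cycle for Sandy Bridge (AVX) and 16 for the Haswell-generation E5-2698 v3 (AVX2 with FMA); those per-cycle figures are assumptions, not stated in the specifications above.

```python
def peak_tflops(cores: int, clock_ghz: float, flops_per_cycle: int) -> float:
    """Theoretical CPU peak in teraFLOPS: cores * GHz * FLOPs-per-cycle / 1000."""
    return cores * clock_ghz * flops_per_cycle / 1000.0

# Barcoo: 1120 Sandy Bridge cores at 2.7 GHz (assumed 8 DP FLOPs/cycle, AVX)
barcoo_cpu = peak_tflops(1120, 2.7, 8)   # 24.192 TFLOPS theoretical

# Snowy: 992 Haswell cores at 2.3 GHz (assumed 16 DP FLOPs/cycle, AVX2 + FMA)
snowy_cpu = peak_tflops(992, 2.3, 16)    # 36.5056 TFLOPS theoretical
```

The measured 20 and 30 teraFLOPS figures sit below these theoretical ceilings, as expected for sustained workloads; on Barcoo, the 20 Xeon Phi cards add roughly a further 20 teraFLOPS of nominal accelerator peak on top of the CPU figure.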

Computing resources on these supercomputers can be applied for under the Melbourne Bioinformatics Resource Allocation Scheme.

Storage infrastructure (shared by all systems):

  • 700 TB GPFS Parallel Data Store
  • 1 PB HSM tape system, made available through GPFS