HPC Home

UTS researchers can access High Performance Computing (HPC) via the eResearch HPCC (High Performance Computing Cluster).

The goals are:

  • provide a shared resource across the UTS research community
  • provide a training and development site for larger HPC projects destined for NCI (the National Computational Infrastructure).

The main page for eResearch is at https://eresearch.uts.edu.au

Access to the Cluster

The eResearch team will need to give you access. Email eResearch-IT@uts.edu.au to introduce yourself and describe your requirements. Once you have access, read the HPC Getting Started pages.

Cluster Hardware

The HPC consists of:

  • Thirteen compute nodes, one login node and one head node.
  • Each node has 56 cores, for a total of just over 700 cores.
  • Most nodes have 256 GB of RAM; some have 512 GB for applications that require more memory. Total distributed memory is about 7.7 TB.
  • Some nodes contain dual NVIDIA Tesla V100 GPUs.
  • Most nodes have at least 3 TB, and some 6 TB, of fast locally attached disk.
  • 700 TB of Isilon storage shared with other eResearch infrastructure.
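
To confirm which GPUs are visible from a GPU node, a small CUDA device query can help. This is a minimal sketch rather than site-specific documentation: the filename device_query.cu is hypothetical, and it assumes the CUDA toolkit (nvcc) is installed on the GPU nodes.

    // device_query.cu - list the GPUs visible to the CUDA runtime.
    // Assumes nvcc is available on the node; compile with:
    //   nvcc device_query.cu -o device_query
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaError_t err = cudaGetDeviceCount(&count);
        if (err != cudaSuccess) {
            std::printf("CUDA error: %s\n", cudaGetErrorString(err));
            return 1;
        }
        std::printf("Visible GPUs: %d\n", count);
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;                  // filled in by the runtime
            cudaGetDeviceProperties(&prop, i);
            std::printf("GPU %d: %s, %.1f GB memory\n", i, prop.name,
                        prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
        }
        return 0;
    }

On a node with dual Tesla V100s this should report two devices, each with roughly 16 or 32 GB of memory depending on the V100 variant.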

Acknowledging Use of the HPC

We would appreciate the following text, or similar, being used as an acknowledgement:

“Computational facilities were provided by the UTS eResearch High Performance Computer Cluster.”