
HPC and Scalable Programming

This three-day DISCnet training course covers High-Performance Computing and Scalable Programming. The module introduces high-performance computing (HPC), high-throughput computing (HTC), parallel computing, and cloud computing.

Objectives:

Using access to the SCIAMA supercomputer, students will work with real-world applications and simulation codes to familiarise themselves with the techniques involved.


Each day is split into morning lectures, which introduce the concepts of HPC and parallel computing, and an afternoon workshop giving students hands-on training in how to use SCIAMA (the ICG supercomputer) and how to begin applying their knowledge. The morning session consists of two lectures and the afternoon session of tutorials; students can work in teams during the practicals.

Course Structure:

Day 1: Hardware

Topics covered include: basics of multi-core and multi-processor machines; an overview of different architectures (cloud computing, clusters, supercomputers, GPU farms); the importance of networks for data-intensive tasks; basics of shared and distributed memory, multithreading and multiprocessing; basics of login, modules, and working with schedulers (e.g. job submission).

 

In the afternoon there will be a hands-on workshop with SCIAMA as a specific example of HPC hardware; we will cover login and submitting/running code (e.g. CosmoMC or Gadget2).
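As a flavour of what job submission involves, below is a minimal sketch of a batch script for a SLURM-based scheduler (an assumption; if SCIAMA runs a different scheduler the directive syntax differs, but the workflow is the same). Batch scripts are usually written in bash, but sbatch accepts any interpreter named in the shebang, so the same idea can be shown in Python: the #SBATCH lines are directives to the scheduler and plain comments to Python.

    #!/usr/bin/env python3
    #SBATCH --job-name=hello-sciama   # illustrative job name
    #SBATCH --ntasks=1                # a single task on one core
    #SBATCH --time=00:05:00           # five-minute wall-clock limit
    #SBATCH --output=hello-%j.out     # %j expands to the job ID

    # Everything below runs on a compute node once the scheduler starts the job.
    import socket
    print("Hello from", socket.gethostname())

The script would be submitted with sbatch and the queue inspected with squeue; in practice you would also load the relevant environment modules before running real codes such as those above.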


Day 2: Data Processing

Topics covered include: basics of code development (e.g. parallel decomposition, multithreading); the difference between “embarrassingly parallel” processing and shared-memory parallelism; basic programming environments (MPI, OpenMP).
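To make the “embarrassingly parallel” idea concrete: the work is split into chunks that need no communication or shared state, so a simple process pool suffices. A minimal Python sketch (the course itself targets C/C++ and Fortran, but the decomposition pattern is the same):

    from multiprocessing import Pool

    def integrate_chunk(bounds):
        # Trapezoidal rule for x**2 on one sub-interval; each chunk is
        # fully independent of the others.
        a, b = bounds
        n = 10_000
        h = (b - a) / n
        f = lambda x: x * x
        return h * (sum(f(a + i * h) for i in range(1, n)) + (f(a) + f(b)) / 2)

    if __name__ == "__main__":
        # Parallel decomposition: split [0, 4] into independent sub-intervals.
        chunks = [(i, i + 1) for i in range(4)]
        with Pool(processes=4) as pool:
            partials = pool.map(integrate_chunk, chunks)
        print(sum(partials))  # ~ 64/3, the integral of x**2 over [0, 4]

A shared-memory (OpenMP-style) version would instead have threads updating a common accumulator, which is where synchronisation and race conditions enter the picture.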

 

In the afternoon we will have a tutorial workshop on MPI/OpenMP on SCIAMA, followed by an early-evening lab on parallelising your own code (assistance provided) or tackling a brief challenge.
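The tutorial will use the MPI and OpenMP interfaces directly, most naturally from C or Fortran given the prerequisites; purely as a compact illustration of the message-passing model, here is the classic MPI “hello world” written against the mpi4py bindings (assuming an MPI installation and mpi4py are available):

    from mpi4py import MPI

    comm = MPI.COMM_WORLD      # communicator spanning all launched processes
    rank = comm.Get_rank()     # this process's ID within the communicator
    size = comm.Get_size()     # total number of processes

    print(f"Hello from rank {rank} of {size}")

    # Point-to-point message passing: rank 0 sends, rank 1 receives.
    if rank == 0 and size > 1:
        comm.send({"payload": 42}, dest=1, tag=0)
    elif rank == 1:
        data = comm.recv(source=0, tag=0)
        print("Rank 1 received:", data)

Run with something like mpirun -np 4 python hello_mpi.py; each rank executes the same script and branches on its rank, which is exactly the pattern the C version follows.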


Day 3: Data Storage

Topics covered include: an introduction to storage systems (NFS, GFS) and databases (e.g. SQL); Hadoop and Apache Spark.

 

In the afternoon there will be a hands-on workshop with Hadoop and Apache Spark.
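As a taste of the programming model ahead of the workshop, here is a minimal word count in PySpark (a sketch: the input path is hypothetical, and the Spark setup on SCIAMA should be taken from the workshop instructions):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("wordcount").getOrCreate()
    sc = spark.sparkContext

    # Each transformation below is lazy; collect() triggers the
    # distributed computation across the cluster.
    counts = (sc.textFile("input.txt")                 # hypothetical input file
                .flatMap(lambda line: line.split())    # lines -> words
                .map(lambda word: (word, 1))           # word -> (word, 1)
                .reduceByKey(lambda a, b: a + b))      # sum counts per word

    for word, n in counts.collect():
        print(word, n)

    spark.stop()

The same shape (a map followed by a reduce by key) is the MapReduce pattern that Hadoop popularised; Spark keeps intermediate data in memory, which is why it typically outperforms classic Hadoop MapReduce for iterative work.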

Prerequisites:

Laptop with an SSH client (e.g. PuTTY on Windows); remote desktop software (X2Go) or an X server is optional.


Basic programming skills (C/C++ or Fortran); no prior knowledge of HPC techniques (e.g. MPI) is required.
