Slot 2

Distributed memory programming and algorithms
Johannes Langguth, Simula Research Laboratory, Norway

Abstract

Distributed memory computers provide bandwidth, processing, and memory scaling beyond what can be achieved with coherent shared memory. Keeping communication costs low is essential to using them effectively, since processing speeds continue to outpace communication rates.

Two important models for programming distributed memory are message passing and RMA (Remote Memory Access). RMA comes in many forms and benefits from global address space communication, which is generally supported by modern network hardware. RMA is employed in PGAS (Partitioned Global Address Space) models, which add global pointers and, optionally, remote procedure calls. These two capabilities play an important role in reducing communication costs, especially for fine-grained and irregular communication patterns.

The lectures will cover message passing and PGAS programming via two libraries, MPI and UPC++, respectively. The goal is to build a solid grounding in distributed memory programming and in the performance tradeoffs of efficient implementation. Algorithmic studies will be presented. The lectures will also discuss hybrid hierarchical models, which compose distributed memory programming with programming at the node level, e.g. multithreading. The emphasis will be on keeping communication costs low, as opposed to optimizing computational performance, which is a separate topic of study.

Bio

Johannes Langguth is a research scientist at Simula Research Laboratory, Oslo, Norway. He received his PhD in computer science from the University of Bergen, Norway, in 2011, and master's degrees in computer science and economics from the University of Bonn, Germany. After a postdoctoral appointment at ENS Lyon, France, he joined Simula in 2012. His research focuses on the design of discrete algorithms for irregular problems on parallel heterogeneous architectures such as multi-core CPUs and GPUs, and on their applications in scientific computing, graph analytics, machine learning, computational social science, and high-performance codes for cardiac electrophysiology.
