Programming Languages and Compilers for Machine Learning Seminar

Machine learning poses interesting new challenges for programming languages research, both in making machine learning applications execute more efficiently and in designing new programming paradigms tailored specifically to machine learning tasks. In this seminar, we are reading recent papers on programming systems for machine learning, mostly deep learning. The papers cover tensor DSLs, differentiable programming, and the verification of neural networks.

It is beneficial to have some background in compilers (e.g., by having passed the compilers core course). You do not need a background in machine learning; we will cover the necessary foundations in the seminar.

People

Sebastian Hack

Organization

Language English
Participants 12 / 12 (seats taken / maximum seats)
Waiting list 0 (please attend the Preparatory Meeting)
Preparatory Meeting Thursday, 18.04.19, 16:00 s.t., E1.3 room 401
Weekly Meeting Thursdays, 16:00 s.t., E1.3 room 401
Prerequisites Preferably, you have taken part in the compiler construction course.

Registration

Please use our seminar system. Note that you still have to register for the seminar in the LSF by May 6th to obtain a certificate.

Modus Operandi

Each participant will be assigned a paper. We will hold weekly meetings during the semester in which one of the assigned papers is discussed. The discussion is moderated by the student to whom the paper was assigned, who is responsible for giving a short summary of the paper and for structuring the ensuing discussion.

Weekly Summaries

Every week, each student has to write a plain-text summary (max. 500 words) of the week's paper. The summary should include open questions and must be submitted to Tina Jung three days before the corresponding meeting (by 23:59).

The submitted files must follow the naming scheme:

<two-digit-paper-number>_<matriculation-number>.txt
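For example, a summary of paper 3 written by a student with the (hypothetical) matriculation number 2565432 would be submitted as 03_2565432.txt.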

The summaries of all participants will be made available and can be used by the moderator to structure the discussion in the following meeting.

Each participant may drop two summaries without giving a reason. If you drop a summary, please send a short mail saying so.

Final Talks

At the end of the semester, each participant will give a 30-minute presentation (25 min talk + 5 min questions) on his or her paper.

Dates

Sessions

Date Moderator Paper
02 May Tajbeed Chowdhury Vasilache et al. Tensor Comprehensions
09 May Valentin Seimetz Kim et al. A Code Generator for High-Performance Tensor Contractions on GPUs
16 May Matthis Kruse Kjolstad et al. The Tensor Algebra Compiler
23 May Leon Thiele Baydin et al. Automatic Differentiation in Machine Learning: a Survey
30 May - Public holiday
06 June Bohdan Liesnikov Wang et al. Demystifying Differentiable Programming: Shift/Reset the Penultimate Backpropagator
13 June Daniel Spaniol Michael Innes. Don’t unroll adjoint: Differentiating SSA-Form Programs
20 June - Public holiday
27 June Kallistos Weis Singh et al. An abstract domain for certifying neural networks
04 July Lukas Kirschner Gopinath et al. Symbolic Execution for Deep Neural Networks
11 July Stefan Oswald Sun et al. Concolic Testing for Deep Neural Networks
18 July Tina Jung How to give a good presentation

Final Talks

Date Time Speaker
23 July 11:00 Leon Thiele
23 July 11:30 Bohdan Liesnikov
23 July 12:00 Break
23 July 13:00 Daniel Spaniol
23 July 13:30 Tajbeed Chowdhury
20 August 11:30 Valentin Seimetz
20 August 12:00 Matthis Kruse
20 August 12:30 Kallistos Weis
20 August 13:00 Break
20 August 14:00 Stefan Oswald
20 August 14:30 Lukas Kirschner

Papers

All papers are available from the university network (see how to connect to the university network from home). A publicly available version is linked below where one exists.

Tensor DSLs

  1. Abadi et al. TensorFlow
  2. Vasilache et al. Tensor Comprehensions
  3. Kim et al. A Code Generator for High-Performance Tensor Contractions on GPUs
  4. Kjolstad et al. The Tensor Algebra Compiler
  5. Kjolstad et al. Sparse Tensor Algebra Optimization with Workspaces
  6. Lattner et al. MLIR

Differentiable Programming

  1. Baydin et al. Automatic Differentiation in Machine Learning: a Survey.
  2. Wang et al. Backpropagation with Continuation Callbacks: Foundations for Efficient and Expressive Differentiable Programming
  3. Wang et al. Demystifying Differentiable Programming: Shift/Reset the Penultimate Backpropagator
  4. Michael Innes. Don’t unroll adjoint: Differentiating SSA-Form Programs
  5. Li et al. Differentiable Programming for Image Processing and Deep Learning in Halide

Verification of Neural Networks

  1. Singh et al. An abstract domain for certifying neural networks
  2. Gopinath et al. Symbolic Execution for Deep Neural Networks.
  3. Sun et al. Concolic Testing for Deep Neural Networks.

Background Material