Curry On London!
July 15-16th, 2019


Julia: A Compiler for the Future
Simon Danisch
Nextjournal


Abstract

Compiler research poses many challenges. We want our languages to be faster and more flexible, and to catch more errors, all while remaining fully dynamic. In addition, Machine Learning researchers have come to realize the need for sophisticated compiler support, raising completely new challenges. This is not exclusive to ML: the scientific computing community, with its need to run complex algorithms at top speed, has been pushing compiler design for a long time and is not slowing down. I will discuss how the combination of Python and C++ was highly successful in these areas, combining great performance with outstanding usability, and why this combination is fundamentally unfit for the challenges of the future. Google's bold move to rewrite TensorFlow (Python, C++) in Swift is just one symptom; others include ML frameworks implementing their own IRs and compilers in their quest to offer both flexibility and performance (TensorFlow's MLIR, Google's XLA, PyTorch's Glow). After establishing where Python fails, I will explain how the Julia Language combines the best of Python and C++, in both compiler technology and language design.
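As a small illustration of that last claim (my own sketch, not an excerpt from the talk): in Julia, a function written without any type annotations, in a Python-like style, is specialized by the JIT compiler for each concrete argument type it is called with, so the same dynamic source code runs as fast machine code.

```julia
# A generic function with no type annotations -- Python-style flexibility.
function mysum(xs)
    s = zero(eltype(xs))   # accumulator matching the element type
    for x in xs
        s += x
    end
    return s
end

# Each call with a new concrete element type triggers compilation of a
# type-specialized native-code method -- C++-style performance.
mysum([1, 2, 3])          # specialized for Vector{Int}
mysum([1.0, 2.0, 3.0])    # specialized for Vector{Float64}
```

Inspecting `@code_native mysum([1, 2, 3])` in the REPL shows the tight integer loop the compiler emits for the `Int` specialization.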

Bio

While studying Cognitive Science, Simon developed a great interest in Machine Learning and Computer Vision. During his one-year stay at the Volkswagen Research lab in San Francisco, he worked on computer vision in C++. Looking for better alternatives to a cumbersome language like C++ or a slow language like Python got him interested in language design. This quickly led him to pick up Julia, where he supported work by the Julia Lab at MIT and authored a number of open source libraries for plotting, GPU acceleration, and Machine Learning. Today, Simon is a researcher at Nextjournal, where he is responsible for making Julia easily accessible.