Machine Learning with Julia

Thursday, April 6th, 7:00pm

Machine Learning, Visualization, Distributed and High-Performance Computing in Julia


Free

Machine Learning with #Julialang

Julia was developed at MIT in 2012 to take advantage of recent developments in compiler technology and computing. It is a high-level, just-in-time compiled language designed for high-performance computing: it can run as fast as statically compiled languages such as C and Fortran while remaining a very high-level, dynamic language. It supports modern language features such as map-reduce style parallelism and distributed computation, coroutines, automatic code generation, metaprogramming, a dynamic type system, and multiple dispatch. It aims to solve the two-language problem by offering a user-friendly, interactive, dynamic language whose code is compiled to native machine code on the fly for optimal performance. Most of Julia's math, statistics, scientific-computing, and machine-learning libraries and packages are written in Julia itself, which makes the entire workflow easy to explore, modify, and maintain.
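
For a flavor of those features, here is a minimal, illustrative sketch (not part of the talk material) of multiple dispatch and map-reduce style distributed parallelism; the worker count and the Monte Carlo estimate of π are arbitrary choices for the example:

```julia
using Distributed
addprocs(4)                               # start 4 local worker processes

# Multiple dispatch: the method is selected from the runtime argument types.
describe(x::Integer)       = "an integer"
describe(x::AbstractFloat) = "a floating-point number"
println(describe(3), ", ", describe(3.0))

# Map-reduce style parallelism: Monte Carlo estimate of π across the workers.
n = 10_000_000
hits = @distributed (+) for i in 1:n
    rand()^2 + rand()^2 <= 1 ? 1 : 0
end
println("π ≈ ", 4 * hits / n)
```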

In this talk, I will discuss my recent explorations of using Julia to implement machine learning algorithms, exploiting various Julia features, most importantly parallelism, distributed computation, and metaprogramming. The talk covers the following areas: parallel weight updates across layers via Direct Feedback Alignment (DFA) for deep learning architectures, population ensembling and evolution of machine learning models, and performance tracking and visualization.
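
As a rough illustration of the DFA idea (my own minimal sketch, not the speaker's implementation): each hidden layer receives the output error through its own fixed random feedback matrix, so the per-layer weight updates are independent of one another and can be computed in parallel, for example with Julia threads. The network sizes, sigmoid activation, and learning rate below are arbitrary assumptions:

```julia
using LinearAlgebra

σ(x)  = 1 / (1 + exp(-x))          # sigmoid activation (arbitrary choice)
dσ(x) = σ(x) * (1 - σ(x))          # its derivative

# One DFA-style update for a toy network with two hidden layers.
# B1, B2 are *fixed* random feedback matrices projecting the output error
# back to each hidden layer, replacing the transposed weights of backprop.
function dfa_update!(W1, W2, W3, B1, B2, x, y; η = 0.01)
    a1 = W1 * x;  h1 = σ.(a1)       # forward pass
    a2 = W2 * h1; h2 = σ.(a2)
    e  = W3 * h2 - y                # output error
    # Each layer's error signal depends only on e, not on the layers
    # above it, so the three updates are independent across layers.
    Threads.@threads for layer in 1:3
        if layer == 1
            W1 .-= η .* ((B1 * e) .* dσ.(a1)) * x'
        elseif layer == 2
            W2 .-= η .* ((B2 * e) .* dσ.(a2)) * h1'
        else
            W3 .-= η .* e * h2'
        end
    end
    return nothing
end

# Toy usage with arbitrary sizes.
nin, nh, nout = 8, 16, 4
W1, W2, W3 = 0.1randn(nh, nin), 0.1randn(nh, nh), 0.1randn(nout, nh)
B1, B2     = randn(nh, nout), randn(nh, nout)
dfa_update!(W1, W2, W3, B1, B2, randn(nin), rand(nout))
```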

Paulito P. Palmes is a Research Scientist at IBM Ireland’s Dublin Research Lab (DRL), with research interests in data mining, deep learning, cognitive computing, and biomedical engineering.
