About this Talk
What do deep learning and functional programming have in common?
In this talk we'll explore the basic ideas behind deep learning, and deep learning frameworks like TensorFlow. We'll see that underpinning it all are concepts familiar to functional programmers. We'll then implement a toy deep learning system in Scala, and speculate a bit on the future of deep learning frameworks and the rise of "differentiable programming".
Participants should understand what a derivative and a matrix are, though no recent experience with calculus or linear algebra is necessary.
Attendees will learn:

* composition of derivatives;
* at a high level how neural networks are constructed;
* the role of derivatives in training neural networks;
* how composition of derivatives enables complex neural network structures.
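To give a flavour of the last point, here is a minimal sketch (my own illustration, not code from the talk) of how differentiable functions compose in Scala. The `Diff` type pairs a function with its derivative, and `andThen` implements the chain rule, so composed functions carry their derivatives with them:

```scala
// A scalar differentiable function: the function itself and its derivative.
final case class Diff(f: Double => Double, df: Double => Double) {
  // Chain rule: (g . f)'(x) = g'(f(x)) * f'(x)
  def andThen(g: Diff): Diff =
    Diff(x => g.f(f(x)), x => g.df(f(x)) * df(x))
}

object ChainRule {
  val square = Diff(x => x * x, x => 2 * x) // d/dx x^2 = 2x
  val double = Diff(x => 2 * x, _ => 2.0)   // d/dx 2x  = 2

  def main(args: Array[String]): Unit = {
    val composed = square.andThen(double) // h(x) = 2x^2, h'(x) = 4x
    println(composed.f(3.0))  // 18.0
    println(composed.df(3.0)) // 12.0
  }
}
```

Real frameworks generalise this idea from scalars to matrices and from forward-mode to reverse-mode differentiation, but the compositional structure is the same.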
Noel is a consultant at Underscore, where he helps companies succeed with Scala. Prior to Underscore he undertook a PhD in Machine Learning.