The Matrix Calculus You Need For Deep Learning

Terence Parr, Jeremy Howard, explained.ai, Jul 04, 2018
Commentary by Stephen Downes

The mathematics of neural networks is daunting for most people (including me). This paper is an attempt to give readers a grounding in the basics. "This material is for those who are already familiar with the basics of neural networks, and wish to deepen their understanding of the underlying math." You'll have to read this strategically. Start at the bottom, in the section titled 'Notation'. Also review the Wikipedia article on index notation. Then go back, skim the introduction, and begin in-depth reading at the section 'Review: Scalar derivative rules'. Plan to spend the day. Related: Irving Wladawsky-Berger, Getting on the AI Learning Curve: A Pragmatic, Incremental Approach.
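For a taste of where the paper is headed (my gloss, not a quotation from it): the scalar chain rule from first-year calculus generalizes to vector functions, with each derivative replaced by a Jacobian matrix and ordinary multiplication replaced by matrix multiplication:

\frac{dy}{dx} = \frac{dy}{du}\,\frac{du}{dx}
\quad\longrightarrow\quad
\frac{\partial \mathbf{y}}{\partial \mathbf{x}} = \frac{\partial \mathbf{y}}{\partial \mathbf{u}}\,\frac{\partial \mathbf{u}}{\partial \mathbf{x}}

Backpropagation is, in essence, the repeated application of this rule through the layers of a network, which is why the paper spends so much time on Jacobians.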
