📄️ Backpropagation
We define backpropagation as reverse-mode automatic differentiation; essentially, it is the chain rule applied to neural networks.
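As a minimal sketch of the idea (the function $y = \sin(wx)$ is a made-up example, not from the notes): the forward pass records intermediate values, and the reverse pass applies the chain rule from the output back toward the inputs.

```python
import math

# Forward pass for y = sin(w * x), keeping the intermediate value a.
w, x = 0.5, 2.0
a = w * x          # intermediate node
y = math.sin(a)

# Reverse pass: propagate dy/dy = 1 backwards through the chain rule.
dy_dy = 1.0
dy_da = dy_dy * math.cos(a)   # d sin(a)/da = cos(a)
dy_dw = dy_da * x             # da/dw = x
dy_dx = dy_da * w             # da/dx = w
```

Each backward step reuses the gradient already computed for the node above it, which is exactly what makes reverse mode efficient for functions with many inputs and one output (like a loss).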
📄️ CNN and Image Classification
When learning from images, larger images lead to more parameters and more computation. But we don't actually need to look at the entire image at once; we want the incoming weights to focus on local patterns of the input image. To extract such features, we use the higher-level convolution operation.
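A small sketch of this locality, assuming a plain "valid" 2D convolution (implemented as cross-correlation, as is standard in deep learning; the image and kernel values are made up):

```python
def conv2d(image, kernel):
    """Valid cross-correlation: slide the kernel over the image, no padding."""
    H, W = len(image), len(image[0])
    kH, kW = len(kernel), len(kernel[0])
    out = []
    for i in range(H - kH + 1):
        row = []
        for j in range(W - kW + 1):
            # Each output value depends only on a local kH x kW patch.
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kH) for dj in range(kW))
            row.append(s)
        out.append(row)
    return out

img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
k = [[1, 0],
     [0, -1]]
print(conv2d(img, k))  # → [[-4, -4], [-4, -4]]
```

The same small kernel is reused at every position, so the number of parameters depends on the kernel size, not the image size.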
📄️ Intro to NN
Introduction to Neural Networks
📄️ Language Model
Learning a good distribution $p(s)$ over sentences. This problem is called language modeling.
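One standard way to make $p(s)$ tractable, assuming a sentence $s = (w_1, \dots, w_T)$ is generated left to right, is the chain rule of probability:

```latex
p(s) = p(w_1, \dots, w_T) = \prod_{t=1}^{T} p(w_t \mid w_1, \dots, w_{t-1})
```

A language model then only needs to predict the next word given the words so far.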
📄️ Multi-Layer Perceptrons
We define layers as groups of units; each unit in the graph is a neuron node. The input layer is the first layer, the output layer is the last, and the hidden layers are the layers in between.
📄️ Optimization
How do we train models using gradients? And how do we address the issues of gradient descent (e.g. choosing the learning rate)? One approach is to differentiate again: this gives the Hessian matrix, which is symmetric.
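A one-dimensional sketch of why second derivatives help, using the made-up objective $f(w) = (w - 3)^2$ (in 1D the "Hessian" is just the second derivative):

```python
# f(w) = (w - 3)^2, with gradient 2(w - 3) and constant second derivative 2.
f_grad = lambda w: 2.0 * (w - 3.0)
f_hess = lambda w: 2.0

# Plain gradient descent: progress depends on the learning rate.
w = 0.0
for _ in range(100):
    w -= 0.1 * f_grad(w)

# Newton's method: rescale the step by the inverse Hessian.
w_newton = 0.0
w_newton -= f_grad(w_newton) / f_hess(w_newton)  # one step lands at the minimum
```

For a quadratic, the Newton step jumps straight to the minimum with no learning rate to tune, while gradient descent approaches it geometrically.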
📄️ CSC413 Neural Networks and Deep Learning
Instructors: Jimmy Ba, Bo Wang