Content
Foundations of neural networks, including the perceptron, multi-layer perceptrons, activation functions, loss functions, error backpropagation, and optimization and regularization. Common optimization techniques such as SGD, momentum, and RMSProp; common regularization techniques such as weight decay, dropout, and Lipschitz constraints. Presentation of popular architectures, such as convolutional neural networks, autoencoders, and recurrent neural networks. Introduction to the PyTorch deep learning framework.
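As a taste of the first topic, here is a minimal, self-contained sketch of the classic perceptron learning rule (illustrative only; the data, learning rate, and epoch count are choices made for this example, not taken from the course materials). It trains a single perceptron with a step activation to compute the logical AND function:

```python
# Illustrative perceptron sketch: learns the AND function with the
# classic perceptron update rule w <- w + lr * (target - y) * x.

def step(z):
    """Heaviside step activation: 1 if z >= 0, else 0."""
    return 1 if z >= 0 else 0

def train_perceptron(samples, lr=0.1, epochs=20):
    """Train weights w and bias b with the perceptron learning rule."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = step(w[0] * x[0] + w[1] * x[1] + b)
            err = target - y
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# AND truth table: output is 1 only when both inputs are 1.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
predictions = [step(w[0] * x[0] + w[1] * x[1] + b) for x, _ in data]
# predictions == [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron rule converges to a separating hyperplane; multi-layer perceptrons and backpropagation, covered next in the course, remove this linearity restriction.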