External Lecturer:
Sebastian Goldt
Course Type:
PhD Course
Academic Year:
2020-2021
Period:
January - March
Duration:
36 h
Description:
12x2h lectures + 4x3h labs = 36 hours (6 weeks: 25/01 - 05/03)
- Lab 0 (optional): fundamental programming tools and best practices
- Introduction to learning (in high dimensions): goals of learning; classification vs regression; training vs validation vs testing; linear regression and kernels; fully-connected feedforward networks and their representational power; can neural networks break the curse of dimensionality?
- Lab 1: neural networks from scratch
- Computer Vision: analysing spatial correlations using convolutions; the importance of datasets (CIFAR-10/100, ImageNet); the basic training algorithm: mini-batch SGD; modern architectures (AlexNet, GoogLeNet, ResNet, DenseNet); acceleration techniques (Nesterov momentum, Adam); dropout and batch normalisation.
- Lab 2: computer vision with PyTorch
- Machine Learning for the sciences: solving quantum many-body problems with neural networks (case study); guest lectures (TBC)
- Robustness in Deep Learning: adversarial examples and defences
- Unsupervised learning: GANs and normalising flows; semi-supervised learning.
- Recurrent neural networks: Hopfield networks (joint with guest lectures, TBC); vanishing and exploding gradients in recurrent nets; LSTMs
- Lab 3: Generative models for images
- Graph Neural Networks: introduction to GNNs and one application in science, e.g. protein-protein interactions.
- Introduction to reinforcement learning
- Lab 4: Reinforcement learning
- Outlook: From the practice of deep learning to its science; surprises in high dimensions (failure of statistical learning theory bounds), the generalisation puzzle; open problems
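As a taste of the "neural networks from scratch" lab and of mini-batch SGD, here is a minimal NumPy sketch: a one-hidden-layer regression network trained by hand-written backpropagation on a toy problem. All names, hyperparameters, and the toy dataset are illustrative choices, not the actual lab code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-d regression problem: y = sin(x) + noise.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X) + 0.1 * rng.normal(size=X.shape)

# One hidden tanh layer, scalar output, random initial weights.
W1 = rng.normal(scale=0.5, size=(1, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.5, size=(32, 1)); b2 = np.zeros(1)

lr, batch = 0.05, 32
for epoch in range(500):
    perm = rng.permutation(len(X))            # reshuffle every epoch
    for start in range(0, len(X), batch):
        idx = perm[start:start + batch]
        xb, yb = X[idx], y[idx]

        # Forward pass.
        h = np.tanh(xb @ W1 + b1)
        pred = h @ W2 + b2
        g_pred = 2 * (pred - yb) / len(xb)    # d(MSE)/d(pred)

        # Backward pass: the chain rule, written out by hand.
        gW2, gb2 = h.T @ g_pred, g_pred.sum(axis=0)
        g_h = (g_pred @ W2.T) * (1 - h ** 2)  # tanh derivative
        gW1, gb1 = xb.T @ g_h, g_h.sum(axis=0)

        # SGD step on the mini-batch gradient.
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2

# Training loss on the full dataset.
pred = np.tanh(X @ W1 + b1) @ W2 + b2
loss = float(np.mean((pred - y) ** 2))
print(f"final MSE: {loss:.4f}")
```

The same loop is what PyTorch automates in Lab 2: autograd replaces the hand-written backward pass, and an optimiser object replaces the explicit update lines.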
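To see why convolutions capture spatial correlations, one can compute a small "valid" convolution by hand (deep-learning libraries actually compute cross-correlation, as below). The image and the Sobel-like edge filter are made-up examples for illustration.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-d cross-correlation: slide the kernel over the image."""
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A 6x6 image with a vertical edge: dark left half, bright right half.
img = np.zeros((6, 6))
img[:, 3:] = 1.0

# A hand-crafted vertical-edge filter (in a CNN, such filters are learned).
edge = np.array([[1., 0., -1.],
                 [1., 0., -1.],
                 [1., 0., -1.]])

resp = conv2d(img, edge)
# The response is nonzero only where the 3x3 window straddles the edge.
print(resp)
```

The same small set of weights is reused at every spatial position: this weight sharing is what makes convolutional layers so much cheaper than fully-connected ones on images.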
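The vanishing/exploding-gradient problem in recurrent nets can be seen in the simplest possible case: a scalar linear RNN h_t = w * h_{t-1}, where backpropagating through T steps multiplies the gradient by w at every step, so it scales as |w|**T. A toy sketch:

```python
def backprop_norm(w, T):
    """Magnitude of dL/dh_0 after backpropagating T steps through
    the scalar linear RNN h_t = w * h_{t-1}: one factor of w per step."""
    g = 1.0
    for _ in range(T):
        g *= w
    return abs(g)

T = 50
print(backprop_norm(0.9, T))  # |w| < 1: the gradient vanishes
print(backprop_norm(1.1, T))  # |w| > 1: the gradient explodes
```

Gating architectures such as the LSTM were designed precisely to keep an additive path through time so that gradients need not shrink or blow up geometrically.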
Please note that this course belongs to the Data Science Excellence Department programme. MAMA PhD students can take up to 33% of their credits (i.e. 50 hours) from this programme.
Research Group:
Location:
A-128