70010 Deep Learning

Overview:

Note that this course will be held online as a combination of pre-recorded lectures, weekly Q&A sessions, tutorials on Teams, and individual coding projects.

Q&As and tutorials have been timetabled for 2 hours per week:

Tue 10:00-11:00 Q&A session
Tue 11:00-12:00 Tutorial

However, not all timetabled slots will be used every week, so please check the timetable below for more information. All notes, tutorials and coursework (including coursework hand-out/hand-in dates) can be found on Materials and CATe. Revision notes on essential machine learning, based on Tom Eccles' original notes, can be found here.

Questions can be discussed on the course's Piazza page (70010/460 Deep Learning [Spring 2021]).

Timetable

Week 1 (starting 11th January)
No lectures, no tutorials.

Week 2 (starting 18th January)
Pre-recorded lectures (Kainz): 01 Logistics; 02 The curse of dimensionality; 03 Convolutions; 04 Convolutional Neural Networks
Papers to read and notes: N00 Admin Notes; N01 Convolution Notes; N01a Convolution slides
19/01 10:00, Zoom (recording): Q&A with Kainz, Vlontzos
19/01 11:00, MS Teams: Tutorial T01 signals and T02 padding and strides, with Budd, Di Palo & team
Coursework preparation practical (Pace): Introduction to PyTorch I (Jupyter notebook I); Introduction to PyTorch II (Jupyter notebook II)

Week 3 (starting 25th January)
Pre-recorded lectures (Kainz): 05 Equivariance and Invariance; 06 LeNet; 07 AlexNet; 08 VGG
Papers to read and notes: LeNet; AlexNet; VGG; N02 Equivariance and Invariance; N03 LeNet; N04 AlexNet; N05 VGG
26/01 10:00, MS Teams (recording): Q&A with Kainz, Vlontzos
26/01 11:00, MS Teams: Tutorial T03 CNNs, with Srinivasan & team
Coursework Task 1 (assessed, Vlontzos), deadline: 05 Feb 2021, 19:00
Quiz: test your knowledge here

Week 4 (starting 1st February)
Pre-recorded lectures (Kainz): 09 Network in Network and Inception; 10 BatchNorm; 11 ResNet, DenseNet and beyond; 12 Activation functions; 13 Loss functions; 14 Data Augmentation and case study
Papers to read and notes: Inception; ResNet; BatchNorm; N06 Inception; N07 BatchNorm; N08 ResNet; N09 Activation functions; N10 Loss functions; N11 Augmentation
02/02 10:00, MS Teams: Q&A with Kainz, Reynaud
02/02 11:00, MS Teams: Tutorial T04 Covariate shift and T05 Batch-norm, with Reynaud, Pace & team
Coursework Task 1 (assessed, Vlontzos), deadline: 05 Feb 2021, 19:00

Week 5 (starting 8th February)
Pre-recorded lectures (Li): 15 Generative models; 16 VAEs; 17 GANs; 17a GANs advanced
Papers to read: VAEs; GANs; N15 intro slides; N16 VAEs slides; N17 GANs slides; N17a slides
09/02 10:00, MS Teams (recording): Q&A with Li, Schmidtke
09/02 11:00, MS Teams: Tutorial T06 VAEs, GANs, with Schmidtke & team
Coursework Task 2 (assessed, Coppock, Spies), deadline: 23 Feb 2021, 19:00
Quiz: test your knowledge here

Week 6 (starting 15th February)
Pre-recorded lectures (Li): 18 RNN basics; 19 RNN applications; 20 Attention & Transformer basics; 21 Transformer applications & advanced
Papers to read: N18 RNNs slides; N19 slides; N20 Transformer & Attention slides; N21 slides
16/02 10:00, MS Teams (recording): Q&A with Li, Spies
16/02 11:00, MS Teams: Tutorial T07 attention and T08 recurrent networks, with Spies, Barmpas, Stacey & team
Coursework Task 2 (assessed, Coppock, Stacey), deadline: 23 Feb 2021, 19:00

Week 7 (starting 22nd February)
23/02 10:00: Lecture 22 GNNs, GCNs (Bronstein)
Papers to read: geometric deep learning
23/02 11:00, MS Teams: Q&A with Bronstein, Li
23/02 11:00, MS Teams: Tutorial T09 GCNs, with Barmpas & team
Coursework Task 3 (assessed, Stacey), deadline: 09 Mar 2021, 19:00
Quiz: test your knowledge here

Week 8 (starting 1st March)
02/03 10:00, MS Teams: Q&A, live panel discussion about the hype, with Li, Kainz
02/03 11:00, MS Teams: Revision tutorial with Li, Kainz & team
Coursework Task 3 (assessed, Stacey), deadline: 09 Mar 2021, 19:00

Week 9 (starting 8th March)
No lecture or tutorial.

Exam

Examinable material from lectures 01-14 is highlighted here with exclamation marks. Note that lectures 15-21 are also examinable, even though they carry no exclamation marks. The exam counts towards 50% of your final mark.

Coursework

There will be three practical coursework tasks; all of them are assessed. The assessment results count towards 50% of the final mark. Tasks must be implemented individually and submitted via CATe. The resulting Jupyter notebook files and model weights need to be submitted as a zip archive.
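
As a minimal sketch of what such a submission could look like (the model, file names, and archive name below are placeholders, not the official ones from the coursework), weights can be saved with PyTorch and bundled with the notebook like this:

    import zipfile
    import torch
    import torch.nn as nn

    # Placeholder model; in the coursework this would be your trained network.
    model = nn.Linear(10, 2)

    # Save only the learned parameters (the state dict), not the whole object.
    torch.save(model.state_dict(), "model_weights.pt")

    # Bundle the completed notebook and the weights into one zip archive.
    # File names here are illustrative; follow the names given in the task.
    with zipfile.ZipFile("submission.zip", "w") as archive:
        archive.write("task1.ipynb")       # the completed Jupyter notebook
        archive.write("model_weights.pt")  # the saved model weights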

We recommend using Google Colab or Paperspace.com with GPU support for testing.
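
To check that a GPU runtime is actually available before training (for example after enabling a GPU in Colab's runtime settings), a generic PyTorch snippet such as the following can be used; it is not part of the course material:

    import torch

    # Use the GPU if one is available, otherwise fall back to the CPU.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    print(f"Using device: {device}")

    # Tensors and models must be moved to the chosen device explicitly.
    x = torch.randn(4, 3, 32, 32, device=device)
    print(x.device)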

The tasks are embedded in Jupyter notebooks, which also contain the task descriptions.

Reading

Book: Dive into Deep Learning