Unofficial pages of Deep Learning 1 at UniZg-FER
About the course
Deep learning is a branch of machine learning that is particularly suited to solving nonlinear recognition problems in the field of artificial intelligence. Deep learning maps input data into complex representations through a composition of learned nonlinear transformations. These methods are applied in challenging tasks where the dimensionality of the data is extremely large: computer vision, natural language processing, and speech understanding. This course introduces the most important discriminative approaches, with special emphasis on practical implementations.
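To make the composition of learned nonlinear transformations concrete, here is a minimal numpy sketch of a two-layer model; all names and dimensions are illustrative and not part of the course material:

    import numpy as np

    def relu(s):
        # elementwise nonlinearity between the two learned linear maps
        return np.maximum(0.0, s)

    def forward(x, params):
        # composition of learned nonlinear transformations:
        # h = relu(W1 x + b1) is the learned representation,
        # the softmax of W2 h + b2 turns it into class probabilities
        W1, b1, W2, b2 = params
        h = relu(W1 @ x + b1)
        s = W2 @ h + b2
        e = np.exp(s - s.max())       # numerically stable softmax
        return e / e.sum()

    rng = np.random.default_rng(0)
    D, H, C = 4, 8, 3                 # input dim, hidden dim, classes
    params = (0.1 * rng.normal(size=(H, D)), np.zeros(H),
              0.1 * rng.normal(size=(C, H)), np.zeros(C))
    print(forward(rng.normal(size=D), params))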
Lectures
- 0. Introduction and basics (pdf)
  - motivation for deep learning (§1.1)
  - about the course
  - basics of machine learning (§5.1 - §5.5)
  - towards deep learning (§5.11)
- 1. Fully connected discriminative models (§6) (pdf)
  - deep feedforward models (§6.1)
  - loss functions (§6.2)
  - activation functions (§6.2 - §6.3)
  - universal approximation theorem (§6.4)
  - learning by backpropagation (§6.5)
- 2. Convolutional models for computer vision (§9) (forward, backward)
  - convolutional layers (§9.1 - §9.2, §9.5)
  - pooling layers (§9.3)
  - efficient implementation (§9.8)
  - classification architecture (§9.4)
  - gradient calculation (-)
- 5. Recurrent models for natural language (pdf)
  - sequence modeling (§10.1, §10.2)
  - applications in natural language understanding (-)
- 6. Advanced recurrent models (pdf)
  - deep and bidirectional models (§10.3, §10.5)
  - gated recurrent cells (§10.10)
  - sequence-to-sequence translation (§10.4)
  - learning with attention (-)
- 7. Metric embeddings (pdf)
  - contrastive and triplet loss (see the sketch after this list)
  - implementation details and average precision
  - applications: stereoscopic correspondence, self-supervised learning
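As a taste of the triplet loss from lecture 7, here is a minimal PyTorch sketch; the margin, batch size, and embedding dimension are illustrative assumptions (PyTorch also ships this as torch.nn.TripletMarginLoss):

    import torch
    import torch.nn.functional as F

    def triplet_loss(anchor, positive, negative, margin=0.2):
        # hinge on the gap between matching and non-matching distances:
        # loss = mean(max(0, d(a, p) - d(a, n) + margin))
        d_pos = F.pairwise_distance(anchor, positive)  # distances to matching samples
        d_neg = F.pairwise_distance(anchor, negative)  # distances to non-matching samples
        return F.relu(d_pos - d_neg + margin).mean()

    # illustrative embeddings: a batch of 16 samples in a 64-dimensional metric space
    a, p, n = (torch.randn(16, 64) for _ in range(3))
    print(triplet_loss(a, p, n).item())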
Laboratory exercises
- Logistic regression, gradient descent, Python, numpy: instructions (see the sketch after this list).
- PyTorch, fully connected models, MNIST: instructions.
- Convolutional models, MNIST, CIFAR: instructions.
- Recurrent models: instructions.
- Metric embeddings: instructions.
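In the spirit of the first exercise, a minimal numpy sketch of binary logistic regression trained by full-batch gradient descent; the toy data and hyperparameters are illustrative, and the actual assignment is defined by the linked instructions:

    import numpy as np

    def train_logreg(X, y, lr=0.1, epochs=1000):
        # binary logistic regression trained by full-batch gradient descent
        N, D = X.shape
        w, b = np.zeros(D), 0.0
        for _ in range(epochs):
            p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid probabilities
            g = p - y                               # gradient of cross-entropy w.r.t. scores
            w -= lr * (X.T @ g) / N                 # gradient step on the weights
            b -= lr * g.mean()                      # gradient step on the bias
        return w, b

    # toy data: class 1 is shifted by +2 along both input dimensions
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, 100)
    X = rng.normal(size=(100, 2)) + 2.0 * y[:, None]
    w, b = train_logreg(X, y.astype(float))
    print("train accuracy:", (((X @ w + b) > 0) == (y == 1)).mean())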
Literature
- Ian Goodfellow, Yoshua Bengio and Aaron Courville. Deep Learning. MIT Press. (html)
- Aston Zhang, Zachary C. Lipton, Mu Li, Alexander J. Smola. Dive into Deep Learning. (html)
- Michael Nielsen. Neural Networks and Deep Learning. Determination Press. (html)
Exams
Successful completion of the lab exercises is a prerequisite for taking the exam. Please send us an e-mail as soon as you complete any of the lab exercises in order to take the lab assessment. Our exams typically consist of 12 short theoretical questions (you need to choose the correct answer; 30% of the points) and 5-6 problems (70% of the points). To get a feeling for what the problems look like, please have a look at our previous exams listed below.
- Exercises with solutions by Marin Kačan, 2024/25 (pdf)
- Mid-term exam 2023/24 (pdf)
- Final exam 2016/17 (txt)
Student projects
- A minimal framework for reverse-mode automatic differentiation in Python (symbol-to-symbol). Bruno Gavranović. Seminar, slides, code.
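The project above generates gradient expressions symbol-to-symbol; for orientation only, here is a much smaller symbol-to-number sketch of reverse-mode automatic differentiation, unrelated to the project's actual code:

    class Var:
        """A scalar that records how it was computed, enabling reverse-mode AD."""
        def __init__(self, value, parents=()):
            self.value = value
            self.parents = parents  # pairs of (parent Var, local partial derivative)
            self.grad = 0.0

        def __add__(self, other):
            return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

        def __mul__(self, other):
            return Var(self.value * other.value,
                       ((self, other.value), (other, self.value)))

        def backward(self):
            # topologically order the computation graph, then push
            # gradients from the output back towards the inputs
            order, seen = [], set()
            def visit(node):
                if node not in seen:
                    seen.add(node)
                    for parent, _ in node.parents:
                        visit(parent)
                    order.append(node)
            visit(self)
            self.grad = 1.0
            for node in reversed(order):
                for parent, local in node.parents:
                    parent.grad += local * node.grad

    x, y = Var(2.0), Var(3.0)
    z = x * y + x            # dz/dx = y + 1 = 4, dz/dy = x = 2
    z.backward()
    print(x.grad, y.grad)    # 4.0 2.0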
Interesting links
- Yann LeCun, Yoshua Bengio, Geoffrey Hinton. Deep learning. (pdf)
- Pedro Domingos. A Few Useful Things to Know about Machine Learning. CACM 2012. (pdf)
- Awesome Deep Vision. A curated list of deep learning resources for computer vision. (html)
- TensorFlow, an open-source deep learning framework. (html)
- A neural network playground. (html)
- C. Olah. Neural networks and data representations. (html)
- Convolutional Neural Networks for Visual Recognition. Stanford CS programme. (html)
- Deep Learning for Natural Language Processing. Stanford CS programme. (html)
- Deep Learning Courses. (html)
- Terence Tao. Linear algebra. (html)
- Randal Barnes. Matrix Calculus. (html)
- Eduardo Sontag. VC Dimension of Neural Networks. NATO ASI Series F. (pdf)