Deep learning
Course teacher
Marko Subašić, PhD, Associate Professor
Associate teachers
-
ECTS credits
5
Number of hours: Lectures + Seminars + Exercises
30 + 0 + 6
Course objectives
Acquire the basics of fully connected, convolutional, recurrent and generative deep models and master their application in practice.
Through numerous successful applications, these techniques have demonstrated the ability to learn from and understand input data (e.g. scene understanding in image analysis). Such models and learning algorithms are, to a greater or lesser extent, inspired by biological neural networks and their functioning, although they ultimately deviate significantly from them. Nevertheless, any successful deep learning model offers a potential explanation of how biological neural networks function, learn, and interpret input information.
Enrolment requirements and/or entry competences required for the course
-
Learning outcomes at the level of the programme to which the course contributes
- Apply theoretical knowledge of the fundamentals of the six core disciplines and their relationship within cognitive science.
- Critically evaluate cognitive science findings and synthesize information to be employed in a collaborative professional environment.
- Participate in data-driven innovation projects and apply appropriate data science tools.
- Apply AI tools in concrete tasks and practical contexts.
Course content (syllabus)
- Neural networks introduction
- Artificial neural network basic concepts and building blocks
- Artificial neural network basic architectures
- Artificial neural network basic learning algorithms
- Deep neural network strengths and weaknesses. Applications.
- Deep neural network specifics: activation functions, regularization, momentum, adaptive learning
- Deep convolutional networks: layers, architectures, visualization, fine-tuning, applications, implementation (see the first sketch after this list)
- Fully convolutional networks: architectures, applications
- Interpretability of neural network results
- Deep recurrent neural networks: RNN, bidirectional RNN, deep RNN, long short-term memory, sequence modelling, applications (see the second sketch after this list)
- Deep generative models: stacked RBMs, convolutional autoencoders, variational autoencoders, adversarial models
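To give a flavour of the convolutional and optimization topics listed above, here is a minimal sketch, assuming the PyTorch framework used in the optional literature; the architecture, layer sizes, and hyperparameters are illustrative choices, not part of the course material.

```python
# Illustrative PyTorch sketch: convolutional and fully connected layers,
# ReLU activations, weight-decay regularization, and SGD with momentum.
# All shapes and hyperparameters are arbitrary choices for illustration.
import torch
import torch.nn as nn

class SmallConvNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # convolutional layer
            nn.ReLU(),                                    # activation function
            nn.MaxPool2d(2),                              # spatial downsampling
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)  # fully connected head

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

model = SmallConvNet()
# Momentum and weight decay correspond to the "momentum" and "regularization"
# items in the syllabus above.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01,
                            momentum=0.9, weight_decay=1e-4)
criterion = nn.CrossEntropyLoss()

# One training step on a random batch (a stand-in for real 28x28 images).
images = torch.randn(8, 1, 28, 28)
labels = torch.randint(0, 10, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()   # backpropagation
optimizer.step()  # gradient-based parameter update
```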
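Similarly, a minimal sketch of the recurrent topics, again assuming PyTorch; the bidirectional two-layer LSTM and all dimensions are illustrative choices only.

```python
# Illustrative PyTorch sketch: an LSTM that maps a sequence of feature
# vectors to a single prediction (many-to-one sequence modelling).
import torch
import torch.nn as nn

class SequenceClassifier(nn.Module):
    def __init__(self, input_size=8, hidden_size=32, num_classes=2):
        super().__init__()
        # bidirectional=True gives the bidirectional RNN variant from the syllabus
        self.lstm = nn.LSTM(input_size, hidden_size, num_layers=2,
                            batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, x):                 # x: (batch, time, features)
        outputs, _ = self.lstm(x)         # outputs: (batch, time, 2 * hidden)
        return self.head(outputs[:, -1])  # prediction from the last time step

model = SequenceClassifier()
logits = model(torch.randn(4, 20, 8))  # batch of 4 sequences, 20 steps each
print(logits.shape)                    # torch.Size([4, 2])
```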
Student responsibilities
Class attendance. Engagement in classroom activities.
Required literature
-
Optional literature
- Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning. MIT Press, 2017.
- Eli Stevens, Luca Antiga, and Thomas Viehmann. Deep Learning with PyTorch. Manning Publications, 2020.