Deep Learning Seminar, Summer 2019/20

In recent years, deep neural networks have been used to solve complex machine-learning problems and have achieved state-of-the-art results in many areas. The field of deep learning is developing rapidly, with new methods and techniques emerging steadily.

The goal of the seminar is to follow the newest advancements in the deep learning field. The course takes the form of a reading group – each lecture, one of the students presents a paper. The paper is announced in advance, so all participants can read it beforehand and take part in the discussion.

If you want to receive announcements about the chosen papers, sign up for our mailing list.


SIS code: NPFL117
Semester: winter + summer
E-credits: 3
Examination: 0/2 C
Guarantor: Milan Straka

Timespace Coordinates

The Deep Learning Seminar takes place on Monday at 14:00 in S11. We will first meet on Monday Feb 24.


To pass the course, you need to present a research paper and attend the presentations regularly.


To add your name to a paper in the table below, edit the source code on GitHub and send a PR.

Date Who Topic Paper(s)
24 Feb Milan Straka CNNs, AutoML Mingxing Tan, Quoc V. Le: EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks
Mingxing Tan, Ruoming Pang, Quoc V. Le: EfficientDet: Scalable and Efficient Object Detection
02 Mar Jana Rezabkova Networks with External Memory Adam Santoro, Sergey Bartunov, Matthew Botvinick, Daan Wierstra, Timothy Lillicrap: One-shot Learning with Memory-Augmented Neural Networks
09 Mar Jonáš Kulhánek DL training, Symbolic DL, SRN Preetum Nakkiran, Gal Kaplun, Yamini Bansal, Tristan Yang, Boaz Barak, Ilya Sutskever: Deep Double Descent: Where Bigger Models and More Data Hurt
Guillaume Lample, François Charton: Deep Learning for Symbolic Mathematics
Vincent Sitzmann, Michael Zollhöfer, Gordon Wetzstein: Scene Representation Networks: Continuous 3D-Structure-Aware Neural Scene Representations
16 Mar Ondřej Měkota Transformer, Chatbot David R. So, Chen Liang, Quoc V. Le: The Evolved Transformer (blog)
Daniel Adiwardana, Minh-Thang Luong, David R. So, Jamie Hall, Noah Fiedel, Romal Thoppilan, Zi Yang, Apoorv Kulshreshtha, Gaurav Nemade, Yifeng Lu, Quoc V. Le: Towards a Human-like Open-Domain Chatbot (blog)
23 Mar Tomáš Kremel AutoML Hieu Pham, Melody Y. Guan, Barret Zoph, Quoc V. Le, Jeff Dean: Efficient Neural Architecture Search via Parameter Sharing
Hanxiao Liu, Karen Simonyan, Yiming Yang: DARTS: Differentiable Architecture Search
30 Mar Martin Vastl Transformers, BERT Anna Rogers, Olga Kovaleva, Anna Rumshisky: A Primer in BERTology: What we know about how BERT works
06 Apr Štěpán Procházka Conditional Random Fields (papers TBA)
13 Apr No DL Seminar Easter Monday
20 Apr Kačka Macková Q&A Kelvin Guu, Kenton Lee, Zora Tung, Panupong Pasupat, Ming-Wei Chang: REALM: Retrieval-Augmented Language Model Pre-Training
27 Apr Jan Waltl TBA
04 May Ladislav Malecek TBA
11 May Marek Dobransky GAN Ian J. Goodfellow, Jean Pouget-Abadie, Mehdi Mirza, Bing Xu, David Warde-Farley, Sherjil Ozair, Aaron Courville, Yoshua Bengio: Generative Adversarial Networks
Martin Arjovsky, Soumith Chintala, Léon Bottou: Wasserstein GAN
Ishaan Gulrajani, Faruk Ahmed, Martin Arjovsky, Vincent Dumoulin, Aaron Courville: Improved Training of Wasserstein GANs
18 May
22 Sep Sourabrata Mukherjee Style Transfer Shrimai Prabhumoye, Yulia Tsvetkov, Ruslan Salakhutdinov, Alan W Black: Style Transfer Through Back-Translation

You can choose any paper you find interesting, but if you would like some inspiration, have a look at the following list. The papers are grouped; each group is expected to be presented in one seminar.

Natural Language Processing

Generative Modeling

Neural Architecture Search (AutoML)

Networks with External Memory