Глубинное обучение (курс лекций)/2019

This is an introductory course on deep learning models and their application to various problems of image and text analysis.

'''Instructors''': [[Участник:Kropotov|Dmitry Kropotov]], [[Участник:Victor Kitov|Victor Kitov]], Nadezhda Chirkova, Oleg Ivanov and Evgeny Nizhibitsky.

The timetable in Autumn 2019: Mondays, lectures begin at 10-30, seminars begin at 12-15, room 526b.

For questions: [https://t.me/joinchat/DEBCqg81GIyo4YNkdQtqzw course chat in Telegram]
== News ==
'''09 Sep:''' Today's lecture is cancelled. The seminar will start as usual at 12-15.
 
'''06 Sep:''' The first theoretical assignment has been uploaded to anytask. Deadline: '''15 Sep'''. Please note: this is a strict deadline; no delay is possible.
 
'''01 Oct:''' The second practical assignment has been uploaded to anytask. Deadline: '''15 Oct'''.
 
== Exam ==
The exam is scheduled for 13 Jan in room 523. For students from group 517 the exam starts at 11-00, for the others at 14-00.
 
[https://drive.google.com/file/d/1SPfezUuIVTVKmrZaQ9-uMQC7_C1IrTpr/view?usp=sharing Exam questions]
== Rules and grades ==
We have 7 home assignments during the course. For each assignment a student may get up to 10 points, plus possible bonus points. For some assignments a student is allowed to submit the completed work up to one week after the deadline, with a grade reduction of 0.5 points per day. All assignments are prepared in English.

Also, each student may give a short 10-minute talk in English on some recent DL paper. For this talk a student may get up to 5 points.
 
The total grade for the course is calculated as follows: Round-up(0.3*<Exam_grade> + 0.7*<Semester_grade>), where <Semester_grade> = min(10, (<Assignments_total_grade> + <Talk_grade>) / 7) and <Exam_grade> is the grade for the final exam (up to 10 points). An illustrative calculation is sketched after the table below.
{| class="standard"
!Final grade !! Total grade !! Necessary conditions
|-
| 5 || >=8 || all practical assignments are done, exam grade >= 6 and an oral talk is given
|-
| 4 || >=6 || 6 practical assignments are done, exam grade >= 4
|-
| 3 || >=4 || 3 practical assignments are done, exam grade >= 4
|-
|}
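The grading rule above can be checked with a short calculation. Here is a minimal sketch in Python; the function and variable names are illustrative (this is not an official course script), and the necessary conditions from the table still have to be satisfied on top of the numeric threshold.

<pre>
import math

def course_grades(assignment_points, talk_points, exam_points):
    """Sketch of the rule above: the semester grade is capped at 10, the total grade is rounded up."""
    semester = min(10, (sum(assignment_points) + talk_points) / 7)
    total = math.ceil(0.3 * exam_points + 0.7 * semester)
    return semester, total

# Example: 7 assignments with 8 points each, a 4-point talk and a 7-point exam.
semester, total = course_grades([8] * 7, talk_points=4, exam_points=7)
print(round(semester, 2), total)  # prints: 8.57 9
</pre>

In this example the total grade 9 clears the >=8 threshold, so the final grade 5 additionally requires that all assignments are done, the exam grade is at least 6 and the talk is given.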
== Practical assignments ==
Practical assignments are provided on the course page at ''anytask.org''. Invite code: IXLOwZU
== Lectures ==
{| class="standard"
!Date !! No. !! Topic !! Materials
|-
| 02&nbsp;Sep.&nbsp;2019 || align="center"|1 || Introduction. Fully-connected networks. ||
|-
| 16&nbsp;Sep.&nbsp;2019 || align="center"|2 || Optimization and regularization for neural networks. Convolutional neural networks. || [https://drive.google.com/file/d/1d2Pn4Lb15rQ1p-z1SeOzuh9kmvSV4d3I/view?usp=sharing Presentation (pptx)]<br> [https://drive.google.com/file/d/1sKdXAc1iSSNy4ZrRaPs2UXuWHZE29Bnc/view?usp=sharing Presentation (pdf)]<br> [https://arxiv.org/pdf/1412.6980.pdf A paper about ADAM]<br> [http://jmlr.org/papers/volume15/srivastava14a/srivastava14a.pdf A paper about DropOut]<br> [https://arxiv.org/pdf/1502.03167.pdf A paper about BatchNorm]
|-
| 23&nbsp;Sep.&nbsp;2019 || align="center"|3 || Semantic image segmentation || [https://yadi.sk/i/qaR_c-9fE0G0Nw Presentation (pdf)]<br>[https://portrait.nizhib.ai/ Portrait Demo] ([https://github.com/nizhib/portrait-demo source])
|-
| 30&nbsp;Sep.&nbsp;2019 || align="center"|4 || Object detection on images || [https://yadi.sk/i/WN0IV9TKBkvh5A Presentation (pdf)]<br>[https://yadi.sk/i/WN0IV9TKBkvh5A DS Bowl 2018 report (pdf)]
|-
| 07&nbsp;Oct.&nbsp;2019 || align="center"|5 || Image style transfer || Presentations [https://yadi.sk/i/1wCKBk6nt3IeLA 1], [https://yadi.sk/i/hl9i5Y1fuPr4Qg 2], [https://yadi.sk/i/eDHSVCRPNTxLpA 3], [https://yadi.sk/i/v-Oc7h8eTgIN-A 4], [https://yadi.sk/i/xwGRhEXa3cl9uA 5], [https://yadi.sk/i/rXAgWiDTLzgnsQ 6].
|-
| 14&nbsp;Oct.&nbsp;2019 || align="center"|6 || Recurrent neural networks || [https://drive.google.com/file/d/1KvSzzctOjRhYwJH_9LJJeZhMp4USTcDV/view?usp=sharing Presentation (pdf)]
|-
| 21&nbsp;Oct.&nbsp;2019 || align="center"|7 || Attention in recurrent neural networks || [https://arxiv.org/abs/1502.03044 A paper about attention model for image captioning]<br> [https://arxiv.org/abs/1706.03762 A paper about Transformer]<br> [https://arxiv.org/abs/1810.04805 A paper about BERT]
|-
| 28&nbsp;Oct.&nbsp;2019 || align="center"|8 || Variational autoencoder || [https://arxiv.org/abs/1312.6114 A paper about VAE]
|-
| 11&nbsp;Nov.&nbsp;2019 || align="center"|9 || Generative adversarial networks || [https://yadi.sk/i/RSpcVQ897ooy2A Presentation]
|-
| 18&nbsp;Nov.&nbsp;2019 || align="center"|10 || Reinforcement learning. Q-learning, DQN. || [https://drive.google.com/file/d/1Z4W_-0IaMNpZnhnMkqcDVM_EA79GFJo-/view RL book], chapter 6<br> [https://www.cs.toronto.edu/~vmnih/docs/dqn.pdf A paper about DQN]
|-
| 25&nbsp;Nov.&nbsp;2019 || align="center"|11 || Policy gradient in reinforcement learning || [https://drive.google.com/file/d/1Z4W_-0IaMNpZnhnMkqcDVM_EA79GFJo-/view RL book], chapter 13<br> [https://arxiv.org/abs/1602.01783 A paper about A3C]
|-
| 02&nbsp;Dec.&nbsp;2019 || align="center"|12 || Implicit reparameterization trick. Gumbel-Softmax approach for discrete reparameterization. || [https://arxiv.org/abs/1805.08498 A paper about IRT]<br> [https://arxiv.org/abs/1611.01144 A paper about Gumbel-Softmax]
|-
| 09&nbsp;Dec.&nbsp;2019 || align="center"|13 || Students' talks ||
|-
|}
== Seminars ==
{| class="standard"
!Date !! No. !! Topic !! Need laptops !! Materials
|-
| 2&nbsp;Sep.&nbsp;2019 || align="center"|1 || Matrix calculus, automatic differentiation. || No || [https://drive.google.com/file/d/1Yu790uIPyxp9JIyysxfJDor_LJQu83gQ/view?usp=sharing Synopsis]<br> [https://people.maths.ox.ac.uk/gilesm/files/NA-08-01.pdf pdf]
|-
| 9&nbsp;Sep.&nbsp;2019 || align="center"|2 || Introduction to Pytorch || Yes || [https://github.com/nadiinchi/dl_labs/blob/master/lab_pytorch.ipynb ipynb]
|-
| 16&nbsp;Sep.&nbsp;2019 || align="center"|3 || Convolutional neural networks on Pytorch || Yes || [https://github.com/nadiinchi/dl_labs/blob/master/lab_cnn_english.ipynb ipynb]
|-
| 23&nbsp;Sep.&nbsp;2019 || align="center"|4 || Semantic image segmentation || No ||
|-
| 30&nbsp;Sep.&nbsp;2019 || align="center"|5 || Face recognition || No ||
|-
| 07&nbsp;Oct.&nbsp;2019 || align="center"|6 || Image style transfer || No ||
|-
| 14&nbsp;Oct.&nbsp;2019 || align="center"|7 || DropOut for recurrent neural networks || Yes || [https://github.com/nadiinchi/dl_labs/blob/master/lab_rnn_english.ipynb ipynb]
|-
| 21&nbsp;Oct.&nbsp;2019 || align="center"|8 || Models with attention || Yes ||
|-
| 28&nbsp;Oct.&nbsp;2019 || align="center"|9 || Variational autoencoders, adversarial attacks || Yes ||
|-
| 11&nbsp;Nov.&nbsp;2019 || align="center"|10 || Generative adversarial networks || Yes ||
|-
| 18&nbsp;Nov.&nbsp;2019 || align="center"|11 || Bandits || No || [https://drive.google.com/file/d/1Z4W_-0IaMNpZnhnMkqcDVM_EA79GFJo-/view RL book], chapter 2<br> [https://web.stanford.edu/~bvr/pubs/TS_Tutorial.pdf Thompson Sampling Tutorial]
|-
| 25&nbsp;Nov.&nbsp;2019 || align="center"|12 || Actor-critic approach in RL || No ||
|-
| 02&nbsp;Dec.&nbsp;2019 || align="center"|13 || Semi-supervised discrete VAE || Yes ||
|-
| 09&nbsp;Dec.&nbsp;2019 || align="center"|14 || Students' talks || No ||
|-
|}
