Deep Learning for NLP
NUS SoC, 2018/2019, Semester I
CS 6101 - Exploration of Computer Science Research
This course is taken almost verbatim from CS 224N Deep Learning for Natural Language Processing – Richard Socher’s course at Stanford. We are following their course’s formulation and selection of papers, with the permission of Socher.
This is a section of the CS 6101 Exploration of Computer Science Research at NUS. CS 6101 is a 4 modular credit pass/fail module for new incoming graduate programme students to obtain background in an area with an instructor’s support. It is designed as a “lab rotation” to familiarize students with the methods and ways of research in a particular research area.
Our section will be conducted as a group seminar, with class participants nominating themselves and presenting the materials and leading the discussion. It is not a lecture-oriented course and not as in-depth as Socher’s original course at Stanford, and hence is not a replacement, but rather a class to spur local interest in Deep Learning for NLP.
This course is offered twice, for Session I (Weeks 3-7) and Session II (Weeks 8-13), although logically it is a single course whose second half builds on the first. Nevertheless, the material is introductory and should be understandable given some prior study.
A discussion group is on Slack. Students and guests, please log in when you are free. If you have a @comp.nus.edu.sg, @u.nus.edu, @nus.edu.sg, @a-star.edu.sg, @dsi.a-star.edu.sg or @i2r.a-star.edu.sg email address, you can create your Slack account for the group discussion without needing an invite.
For interested public participants, please send Min an email at firstname.lastname@example.org if you need an invite to the Slack group. The Slack group is being reused from previous semesters. Once you are in the Slack group, you can consider yourself registered.
- What are the pre-requisites? There are no formal prerequisites for the course. As with many machine learning courses, it is useful to have a basic understanding of linear algebra, probability, and statistics. If you do not have the requisite understanding, taking open online courses on these subjects concurrently with or before this course is advisable.
- Is the course chargeable? No, the course is not chargeable. It is free (as in no-fee). NUS allows us to teach this course for free, as it is not “taught”, per se. Students in the class take charge of the lectures, and complete a project, while the teaching staff facilitates the experience.
- Can I get course credit for taking this? Yes, if you are a first-year School of Computing doctoral student. In this case, you need to formally enroll in the course as CS6101, and you will receive one half of the 4-MC pass/fail credit that you would receive for the course, which is a lab rotation course. Even though the lab rotation is only for half the semester, such students are encouraged and welcome to complete the entire course.
No, for everyone else. By this we mean that no credits, certificates, or other formal documentation for completing the course will be given to any other participants, inclusive of external registrants and NUS students (both internal and external to the School of Computing). Such participants get the experience of learning deep learning together in a formal study group, and of developing camaraderie and a network with fellow students and the teaching staff.
- What are the requirements for completing the course? Each student must achieve 2 objectives to be deemed to have completed the course:
- Work with peers to assist in teaching two lecture sessions of the course: one lecture by co-lecturing the subject from new slides that you have prepared as a team, and another lecture by moderating the Slack channel and adding materials for discussion.
- Complete a deep learning project. For the project, you need only use a deep learning framework to execute a problem against a data set. You may choose to replicate previous work reported in scientific papers or data science challenges. Or, more challengingly, you may decide to use data from your own context.
- How do external participants take this course? You may come to NUS to participate in the lectures in person alongside our local participants. You are also welcome to participate online through Google Hangouts. We typically have a synchronous broadcast to Google Hangouts that is streamed and archived to YouTube.
During the session where you’re responsible for co-lecturing, you will be expected to come to the class in person.
As an external participant, you are obligated to complete the course to the best of your ability. We do not encourage students who are not committed to completing the course to enrol.
Meeting Venue and Time
18:00-20:00, likely Thursdays for Session I (Weeks 3-7). Venue TBA
18:00-20:00, likely Thursdays for Session II (Weeks 8-13). Venue TBA
Please eat before or during the course (we don't mind – it is like a brown-bag seminar series), and if you drive, you will be able to leave after 7:30pm without paying carpark charges.
Welcome. If you are an external visitor and would like to join us, please email Kan Min-Yen to be added to the class roll. Guests from industry, schools and other far-reaching places in SG are welcome, pending space and time logistic limitations. The more, the merrier.
External guests will be listed here in due course once the course has started. Please refer to our Slack after you have been invited for the most up-to-date information.
NUS (Postgraduate): Session I (Weeks 3-7): TBA
NUS (Postgraduate): Session II (Weeks 8-13): TBA
NUS (Undergraduate): TBA
Week of 13, 20 Aug
|Topic|Deliverables|
|---|---|
|Introduction to NLP and Deep Learning, Word Vectors 1, & Word Vectors 2| |
|Neural Networks, Backpropagation| |
|Recurrent Neural Networks and Language Models| |
|Vanishing Gradients, Fancy RNNs|Preliminary project titles and team members due on Slack|
|Machine Translation, Seq2Seq and Attention| |
|Advanced Attention|Preliminary abstracts due|
|Transformer Networks and CNNs| |
|Tree Recursive Neural Networks and Constituency Parsing| |
|Advanced Architectures and Memory Networks| |
|Reinforcement Learning for NLP| |
|Semi-supervised Learning for NLP|Participation on the evening of the 13th STePS|