School of Computing

Deep Learning for NLP

NUS SoC, 2016/2017, Semester I
CS 6101 - Exploration of Computer Science Research

Last updated: Wednesday, July 5, 2017 05:38:47 PM SGT - This course is over. You may want to find the more recent iteration of this course.
Wednesday, August 31 - Added current list of guests, updated Session II as Monday
Friday, August 19 - Roster of students, venue and timings updated.
Thursday, July 28 - Initial page

This course is taken almost verbatim from CS 224D Deep Learning for Natural Language Processing -- Richard Socher's course at Stanford. We are following their course's formulation and selection of papers, with the permission of Socher.

This is a section of the CS 6101 Exploration of Computer Science Research at NUS. CS 6101 is a 4 modular credit pass/fail module for new incoming graduate programme students to obtain background in an area with an instructor's support. It is designed as a "lab rotation" to familiarize students with the methods and ways of research in a particular research area.

Our section will be conducted as a group seminar, with class participants nominating themselves and presenting the materials and leading the discussion. It is not a lecture-oriented course and not as in-depth as Socher's original course at Stanford, and hence is not a replacement, but rather a class to spur local interest in Deep Learning for NLP. Unlike the original course, there are no projects, assignments or homework, although you would certainly get more out of the topic if you did them.

This "course" is offered twice, for Session I (Weeks 3-7) and Session II (Weeks 8-13). Logically the two sessions form a single course, with the second building on the first; nevertheless, the material is introductory and should be understandable given some prior study.

A discussion group will be on Slack. Students and guests, please log in when you are free. If you have an eligible email address, you can create your Slack account for the group discussion without needing an invite.

Meeting Venue and Time

18:00-20:00, Tuesdays for Session I (Weeks 3-7). Venue is SR2 (COM1 #02-04).

18:00-20:00, Mondays for Session II (Weeks 8-13). Venue is SR10 (COM1 #02-10).

For directions to NUS School of Computing (SoC) and COM1: please read the directions here to park in CP13 and/or take the bus to SoC, and use the floorplan to find SR2 and SR10.

Please feel free to eat before or during class (we don't mind -- it runs like a brown-bag seminar series). If you park, you will be able to leave after 7:30pm without paying carpark charges.


Welcome. If you are an external visitor and would like to join us, please email Kan Min-Yen to be added to the class roll. Guests from industry, schools and other far-reaching places in SG are welcome, pending space and time logistics. The more, the merrier.

External guests will be listed here in due course once the course has started. Please refer to our Slack after you have been invited for the most up-to-date information.

NUS (Postgraduate): Session I (Weeks 3-7): Cai Shaofeng, Feng Piaopiao, Lim Xiang Hui Nicholas, Wang Kailong, Wang Taining

NUS (Postgraduate): Session II (Weeks 8-13): Tang Yixuan, Wang Yan, Yang Lin, Yang Yueji

NUS (Undergraduate): Edward Elson, Yap Jia Qing

WING: Muthu Kumar Chandrasekaran, Tao Chen, Xiangnan He, Min-Yen Kan, Manpreet Kaur, Lei Wenqiang, Animesh Prasad, Su Xuan, Kazunari Sugiyama, Chencan Xu

Guests: Ashutosh Gaur, Hitoshi Iwasaki, Phu Mon Htut, Nicolas Lim, Minh Nguyen, Shubham Goyal, Lee Yi Jie Joel, Edwin Tam, Samdish Suri, Kristin Nguyen, Umamaheswari Vasanthakumar, Lonce Wyse


Session I
Week 3 (Week of 22 Aug)
Tue, 23 Aug, 18:00-20:00
Intro to NLP and Deep Learning
Presenters: Min
Week 4 (Week of 29 Aug)
Tue, 30 Aug, 18:00-20:00
Simple Word Vector representations: word2vec, GloVe
Presenters/Questioners: Wang Taining, Phu Mon Htut, Manpreet Kaur, Kristin Nguyen, Cheng Yong, Yi Chiao Cheng
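The core idea behind word-vector methods like word2vec and GloVe is that each word becomes a dense vector whose geometry encodes similarity, usually measured by cosine similarity. A toy sketch (the vectors below are hand-picked for illustration, not trained):

```python
import math

# Hypothetical 3-dimensional word vectors; real embeddings are
# learned from corpora and typically have 50-300 dimensions.
vectors = {
    "king":  [0.8, 0.3, 0.1],
    "queen": [0.7, 0.4, 0.1],
    "apple": [0.1, 0.1, 0.9],
}

def cosine(u, v):
    # cosine similarity: dot product normalized by vector lengths
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Related words should score higher than unrelated ones.
print(cosine(vectors["king"], vectors["queen"]) >
      cosine(vectors["king"], vectors["apple"]))  # prints True
```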
Week 5 (Week of 5 Sep)
Tue, 6 Sep, 18:00-20:00
Advanced word vector representations: language models, softmax, single layer networks
Presenters/Questioners: Lim Xiang Hui Nicholas, Wang Taining, Cai Shaofeng, Feng Shaoyu
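One building block from this session, the softmax, turns a vector of arbitrary scores into a probability distribution; language models use it over the vocabulary. A minimal sketch (the input scores are arbitrary example values):

```python
import math

def softmax(scores):
    # subtract the max score for numerical stability before exponentiating
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Higher scores get higher probabilities, and the outputs sum to 1.
probs = softmax([2.0, 1.0, 0.1])
print(probs)
```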
Week 6 (Week of 12 Sep)
Tue, 13 Sep, 18:00-20:00
Neural Networks and backpropagation -- for named entity recognition
Presenters/Questioners: Joel Lee, Feng Piaopiao, Wang Kailong, Lim Xiang Hui Nicholas, Feng Shaoyu
Recess Week (Week of 19 Sep)
Tue, 20 Sep, 18:00-20:00
Neural Networks and Back-Propagation (continued), and practical tips: gradient checks, overfitting, regularization, activation functions, details
Presenters/Questioners: Joel Lee, Cheng Yong
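Of the practical tips listed for this session, the gradient check is the most mechanical: compare an analytic gradient against a centered finite difference. A minimal sketch on a toy function (the function and evaluation point are made up for illustration):

```python
def f(x):
    # toy objective: f(x, y) = x^2 + 3y
    return x[0] ** 2 + 3.0 * x[1]

def analytic_grad(x):
    # hand-derived gradient: df/dx = 2x, df/dy = 3
    return [2.0 * x[0], 3.0]

def numerical_grad(f, x, h=1e-5):
    # centered finite difference: (f(x + h) - f(x - h)) / 2h per coordinate
    grad = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2 * h))
    return grad

x = [1.5, -2.0]
# The two gradients should agree to within finite-difference error.
print(all(abs(a - b) < 1e-6
          for a, b in zip(analytic_grad(x), numerical_grad(f, x))))  # prints True
```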
Week 7 (Week of 26 Sep)
Tue, 27 Sep, 18:00-20:00
Introduction to TensorFlow
Presenters: Tan Liling, Kristin Nguyen, Rafael E. Banchs, Chan Yihao, Luis Fernando D'Haro, Wang Kailong, Feng Piaopiao
Session II
Week 8 (Week of 3 Oct)
Mon, 3 Oct, 18:00-20:00
Recurrent neural networks -- for language modeling and other tasks
Presenters/Questioners: Connie Kou, Muthu Kumar Chandrasekaran, Rafael E. Banchs, Yang Yueji
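The vanilla recurrent network behind this session applies the same cell at every time step: h_t = tanh(W_hh h_{t-1} + W_xh x_t + b). A pure-Python sketch of one step (all weights are small hand-picked numbers, not trained):

```python
import math

def rnn_step(h_prev, x, W_hh, W_xh, b):
    # one step of a vanilla RNN cell: h_t = tanh(W_hh h_{t-1} + W_xh x_t + b)
    pre = [sum(W_hh[i][j] * h_prev[j] for j in range(len(h_prev)))
           + sum(W_xh[i][k] * x[k] for k in range(len(x)))
           + b[i]
           for i in range(len(b))]
    return [math.tanh(p) for p in pre]

# Illustrative 2-unit cell reading a 2-dimensional input.
W_hh = [[0.1, 0.0], [0.0, 0.1]]
W_xh = [[0.5, -0.3], [0.2, 0.4]]
b = [0.0, 0.1]
h = rnn_step([0.0, 0.0], [1.0, 2.0], W_hh, W_xh, b)
# tanh keeps every hidden unit in (-1, 1)
print(all(-1.0 < v < 1.0 for v in h))  # prints True
```

Unrolling this step over a sequence, and backpropagating through the unrolled graph, is exactly where the vanishing-gradient issues motivating GRUs and LSTMs (next session) arise.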
Week 9 (Week of 10 Oct)
Mon, 10 Oct, 18:00-20:00
GRUs and LSTMs -- for machine translation
Presenters: Connie Kou, Umamaheswari Vasanthakumar, Muthu Kumar Chandrasekaran, Phu Mon Htut
Week 10 (Week of 17 Oct)
Mon, 17 Oct, 18:00-20:00
Recursive neural networks -- for parsing
Presenters/Questioners: Shankar Satish, Umamaheswari Vasanthakumar, Manpreet Kaur
Week 11 (Week of 24 Oct)
Mon, 24 Oct, 18:00-20:00
Recursive neural networks -- for different tasks (e.g. sentiment analysis)
Presenters/Questioners: Shankar Satish, Tang Yixuan, Yap Jia Qing, Edwin Tam
Week 12 (Week of 31 Oct)
Mon, 31 Oct, 18:00-20:00
Convolutional neural networks -- for sentence classification
Presenters/Questioners: Yap Jia Qing, Wang Yan
Week 13 (Week of 7 Nov)
Mon, 7 Nov, 18:00-20:00
The future of Deep Learning for NLP: Dynamic Memory Networks
Presenters/Questioners: Luis Fernando D'Haro, Shankar Satish, Tang Yixuan, Yang Yueji

Other Links