Office: COM3-02-52
Tel: 6601 5201

https://ml.comp.nus.edu.sg

Kenji KAWAGUCHI

NUS Presidential Young Professor

  • Postdoctoral Fellow, Harvard University
  • Ph.D. in Computer Science, MIT
  • S.M. in EECS, MIT

Kenji Kawaguchi is a Presidential Young Professor in the Department of Computer Science. He received his Ph.D. in Computer Science from the Massachusetts Institute of Technology (MIT) and then joined Harvard University as a postdoctoral fellow. He was also one of 77 invited participants from around the world in the Isaac Newton Institute for Mathematical Sciences programme on "Mathematics of Deep Learning" at the University of Cambridge. His research interests include deep learning and artificial intelligence (AI) more broadly. His research group aims to create a positive feedback loop between theory and practice in deep learning research through collaborations with researchers on both the theory and practice sides.

RESEARCH INTERESTS

  • Deep learning in theory and practice

  • Machine learning

  • Neural networks + X

SELECTED PUBLICATIONS

  • Seanie Lee, Minki Kang, Juho Lee, Sung Ju Hwang and Kenji Kawaguchi. Self-Distillation for Further Pre-training of Transformers. In International Conference on Learning Representations (ICLR), 2023.
  • Samuel Lavoie, Christos Tsirigotis, Max Schwarzer, Ankit Vani, Michael Noukhovitch, Kenji Kawaguchi and Aaron Courville. Simplicial Embeddings in Self-Supervised Learning and Downstream Classification. In International Conference on Learning Representations (ICLR), 2023. [notable-top-25%]
  • Tianbo Li, Min Lin, Zheyuan Hu, Kunhao Zheng, Giovanni Vignale, Kenji Kawaguchi, A. H. Castro Neto, Kostya S. Novoselov and Shuicheng Yan. D4FT: A Deep Learning Approach to Kohn-Sham Density Functional Theory. In International Conference on Learning Representations (ICLR), 2023. [notable-top-25%]
  • Dong Bok Lee, Seanie Lee, Kenji Kawaguchi, Yunji Kim, Jihwan Bang, Jung-Woo Ha and Sung Ju Hwang. Self-Supervised Set Representation Learning for Unsupervised Meta-Learning. In International Conference on Learning Representations (ICLR), 2023.
  • Kenji Kawaguchi, Zhun Deng, Kyle Luh, Jiaoyang Huang. Robustness Implies Generalization via Data-Dependent Generalization Bounds. International Conference on Machine Learning (ICML), 2022. [Selected for ICML long presentation (2% accept rate)]
  • Aviv Navon, Aviv Shamsian, Idan Achituve, Haggai Maron, Kenji Kawaguchi, Gal Chechik, Ethan Fetaya. Multi-Task Learning as a Bargaining Game. International Conference on Machine Learning (ICML), 2022.
  • Linjun Zhang*, Zhun Deng*, Kenji Kawaguchi, James Zou. When and How Mixup Improves Calibration. International Conference on Machine Learning (ICML), 2022.
  • Juncheng Liu, Bryan Hooi, Kenji Kawaguchi and Xiaokui Xiao. MGNNI: Multiscale Graph Neural Networks with Implicit Layers. Advances in Neural Information Processing Systems (NeurIPS), 2022.
  • Riashat Islam, Hongyu Zang, Anirudh Goyal, Alex Lamb, Kenji Kawaguchi, Xin Li, Romain Laroche, Yoshua Bengio and Remi Tachet des Combes. Discrete Compositional Representations as an Abstraction for Goal Conditioned Reinforcement Learning. Advances in Neural Information Processing Systems (NeurIPS), 2022.
  • Seanie Lee*, Bruno Andreis*, Kenji Kawaguchi, Juho Lee and Sung Ju Hwang. Set-based Meta-Interpolation for Few-Task Meta-Learning. Advances in Neural Information Processing Systems (NeurIPS), 2022.
  • Zheyuan Hu, Ameya Jagtap, George Em Karniadakis and Kenji Kawaguchi. When Do Extended Physics-Informed Neural Networks (XPINNs) Improve Generalization? SIAM Journal on Scientific Computing, 44 (5), pp. A3158-A3182, 2022.
  • Kenji Kawaguchi. On the Theory of Implicit Deep Learning: Global Convergence with Implicit Layers. In International Conference on Learning Representations (ICLR), 2021. [Selected for ICLR Spotlight (5% accept rate)]
  • Linjun Zhang*, Zhun Deng*, Kenji Kawaguchi*, Amirata Ghorbani and James Zou. How Does Mixup Help With Robustness and Generalization? In International Conference on Learning Representations (ICLR), 2021. [Selected for ICLR Spotlight (5% accept rate)]
  • Keyulu Xu*, Mozhi Zhang, Stefanie Jegelka and Kenji Kawaguchi*. Optimization of Graph Neural Networks: Implicit Acceleration by Skip Connections and More Depth. International Conference on Machine Learning (ICML), 2021.
  • Vikas Verma, Minh-Thang Luong, Kenji Kawaguchi, Hieu Pham and Quoc V Le. Towards Domain-Agnostic Contrastive Learning. International Conference on Machine Learning (ICML), 2021.
  • Dianbo Liu*, Alex Lamb*, Kenji Kawaguchi, Anirudh Goyal, Chen Sun, Michael Curtis Mozer and Yoshua Bengio. Discrete-Valued Neural Communication. Advances in Neural Information Processing Systems (NeurIPS), 2021.
  • Ferran Alet*, Dylan Doblar*, Allan Zhou, Joshua B. Tenenbaum, Kenji Kawaguchi and Chelsea Finn. Noether Networks: meta-learning useful conserved quantities. Advances in Neural Information Processing Systems (NeurIPS), 2021.
  • Zhun Deng, Linjun Zhang, Kailas Vodrahalli, Kenji Kawaguchi and James Zou. Adversarial Training Helps Transfer Learning via Better Representations. Advances in Neural Information Processing Systems (NeurIPS), 2021.
  • Clement Gehring, Kenji Kawaguchi, Jiaoyang Huang, and Leslie Pack Kaelbling. Understanding End-to-End Model-Based Reinforcement Learning Methods as Implicit Parameterization. Advances in Neural Information Processing Systems (NeurIPS), 2021.
  • Ferran Alet, Maria Bauza Villalonga, Kenji Kawaguchi, Nurullah Giray Kuru, Tomas Lozano-Perez and Leslie Pack Kaelbling. Tailoring: encoding inductive biases by optimizing unsupervised objectives at prediction time. Advances in Neural Information Processing Systems (NeurIPS), 2021.
  • Juncheng Liu, Kenji Kawaguchi, Bryan Hooi, Yiwei Wang and Xiaokui Xiao. EIGNN: Efficient Infinite-Depth Graph Neural Networks. Advances in Neural Information Processing Systems (NeurIPS), 2021.
  • Ameya D. Jagtap, Kenji Kawaguchi and George E. Karniadakis. Adaptive Activation Functions Accelerate Convergence in Deep and Physics-informed Neural Networks. Journal of Computational Physics, 404, 109136, 2020.
  • Ameya D. Jagtap*, Kenji Kawaguchi* and George E. Karniadakis. Locally adaptive activation functions with slope recovery for deep and physics-informed neural networks. Proceedings of the Royal Society A, 476, 20200334, 2020.
  • Kenji Kawaguchi. Deep Learning without Poor Local Minima. Advances in Neural Information Processing Systems (NeurIPS), 2016. [Selected for NeurIPS oral presentation (2% accept rate)]
  • Kenji Kawaguchi, Leslie Pack Kaelbling and Tomas Lozano-Perez. Bayesian Optimization with Exponential Convergence. Advances in Neural Information Processing Systems (NeurIPS), 2015.

MODULES TAUGHT

CS5339 Theory and Algorithms for Machine Learning
CS6216 Advanced Topics in Machine Learning