COM2-03-32
6601 5201

https://ml.comp.nus.edu.sg

Kenji KAWAGUCHI

NUS Presidential Young Professor

  • Postdoctoral Fellow, Harvard University
  • Ph.D. in Computer Science, MIT
  • S.M. in EECS, MIT

Kenji Kawaguchi is a Presidential Young Professor in the Department of Computer Science. He is also an invited participant in the Isaac Newton Institute for Mathematical Sciences programme on "Mathematics of Deep Learning" at the University of Cambridge, one of 77 invited participants from around the world. He received his Ph.D. in Computer Science and S.M. in Electrical Engineering and Computer Science from the Massachusetts Institute of Technology (MIT), and then joined Harvard University as a postdoctoral fellow. His research interests include deep learning and artificial intelligence (AI) more broadly. His research group aims to create a positive feedback loop between theory and practice in deep learning research by collaborating with researchers on both the theoretical and applied sides.

RESEARCH INTERESTS

  • Deep learning in theory and practice
  • Machine learning
  • Neural networks + X

SELECTED PUBLICATIONS

  • Kenji Kawaguchi, Kyle Luh, Jiaoyang Huang and Zhun Deng. An Improved Analysis of Algorithmic Robustness. International Conference on Machine Learning (ICML), 2022. [Selected for ICML long presentation (2% accept rate)]
  • Aviv Navon, Aviv Shamsian, Idan Achituve, Haggai Maron, Kenji Kawaguchi, Gal Chechik and Ethan Fetaya. Multi-Task Learning as a Bargaining Game. International Conference on Machine Learning (ICML), 2022.
  • Linjun Zhang*, Zhun Deng*, Kenji Kawaguchi and James Zou. When and How Mixup Improves Calibration. International Conference on Machine Learning (ICML), 2022.
  • Kenji Kawaguchi. On the Theory of Implicit Deep Learning: Global Convergence with Implicit Layers. International Conference on Learning Representations (ICLR), 2021. [Selected for ICLR Spotlight (5% accept rate)]
  • Linjun Zhang*, Zhun Deng*, Kenji Kawaguchi*, Amirata Ghorbani and James Zou. How Does Mixup Help With Robustness and Generalization? International Conference on Learning Representations (ICLR), 2021. [Selected for ICLR Spotlight (5% accept rate)]
  • Keyulu Xu*, Mozhi Zhang, Stefanie Jegelka and Kenji Kawaguchi*. Optimization of Graph Neural Networks: Implicit Acceleration by Skip Connections and More Depth. International Conference on Machine Learning (ICML), 2021.
  • Vikas Verma, Minh-Thang Luong, Kenji Kawaguchi, Hieu Pham and Quoc V Le. Towards Domain-Agnostic Contrastive Learning. International Conference on Machine Learning (ICML), 2021.
  • Dianbo Liu*, Alex Lamb*, Kenji Kawaguchi, Anirudh Goyal, Chen Sun, Michael Curtis Mozer and Yoshua Bengio. Discrete-Valued Neural Communication. Advances in Neural Information Processing Systems (NeurIPS), 2021.
  • Ferran Alet*, Dylan Doblar*, Allan Zhou, Joshua B. Tenenbaum, Kenji Kawaguchi and Chelsea Finn. Noether Networks: meta-learning useful conserved quantities. Advances in Neural Information Processing Systems (NeurIPS), 2021.
  • Zhun Deng, Linjun Zhang, Kailas Vodrahalli, Kenji Kawaguchi and James Zou. Adversarial Training Helps Transfer Learning via Better Representations. Advances in Neural Information Processing Systems (NeurIPS), 2021.
  • Clement Gehring, Kenji Kawaguchi, Jiaoyang Huang, and Leslie Pack Kaelbling. Understanding End-to-End Model-Based Reinforcement Learning Methods as Implicit Parameterization. Advances in Neural Information Processing Systems (NeurIPS), 2021.
  • Ferran Alet, Maria Bauza Villalonga, Kenji Kawaguchi, Nurullah Giray Kuru, Tomas Lozano-Perez and Leslie Pack Kaelbling. Tailoring: encoding inductive biases by optimizing unsupervised objectives at prediction time. Advances in Neural Information Processing Systems (NeurIPS), 2021.
  • Juncheng Liu, Kenji Kawaguchi, Bryan Hooi, Yiwei Wang and Xiaokui Xiao. EIGNN: Efficient Infinite-Depth Graph Neural Networks. Advances in Neural Information Processing Systems (NeurIPS), 2021.
  • Ameya D. Jagtap, Kenji Kawaguchi and George E. Karniadakis. Adaptive Activation Functions Accelerate Convergence in Deep and Physics-informed Neural Networks. Journal of Computational Physics, 404, 109136, 2020.
  • Ameya D. Jagtap*, Kenji Kawaguchi* and George E. Karniadakis. Locally adaptive activation functions with slope recovery for deep and physics-informed neural networks. Proceedings of the Royal Society A, 476, 20200334, 2020.
  • Kenji Kawaguchi. Deep Learning without Poor Local Minima. Advances in Neural Information Processing Systems (NeurIPS), 2016. [Selected for NeurIPS oral presentation (2% accept rate)]
  • Kenji Kawaguchi, Leslie Pack Kaelbling and Tomas Lozano-Perez. Bayesian Optimization with Exponential Convergence. Advances in Neural Information Processing Systems (NeurIPS), 2015.

MODULES TAUGHT

CS5339
Theory and Algorithms for Machine Learning