NUS Presidential Young Professor
Department of Mathematics, National University of Singapore
Institute for Data Science, National University of Singapore

  • Ph.D. (Information Engineering, University of Cambridge, 2014)
  • B.Eng. (Electrical Engineering, University of Melbourne, 2010)
  • B.Sci. (Computer Science, University of Melbourne, 2010)

Jonathan is an assistant professor jointly in the Department of Computer Science and the Department of Mathematics at the National University of Singapore. His research interests lie in the areas of information theory, machine learning, and high-dimensional statistics. In 2010, he received the B.Eng. degree in electrical engineering and the B.Sci. degree in computer science from the University of Melbourne, Australia. From October 2011 to August 2014, he was a Ph.D. student in the Signal Processing and Communications Group at the University of Cambridge, United Kingdom, and from September 2014 to September 2017, he was a post-doctoral researcher with the Laboratory for Information and Inference Systems at the École Polytechnique Fédérale de Lausanne (EPFL), Switzerland. He received the Singapore National Research Foundation (NRF) Fellowship and the 'EPFL Fellows' postdoctoral fellowship co-funded by Marie Curie.


Artificial Intelligence
Algorithms & Theory


  • Machine Learning

  • Information Theory

  • High-Dimensional Statistics

  • Bayesian Optimization

  • Group Testing


Information-theoretic limits of statistical inference and learning problems

The field of information theory was introduced to understand the fundamental limits of data compression and transmission, and has shaped the design of practical communication systems for decades. This project pursues the emerging perspective that information theory is not only a theory of communication, but a far-reaching theory of data benefiting diverse inference and learning problems.

Modern methods for high-dimensional estimation and learning

Extensive research has led to a variety of powerful techniques for high-dimensional learning, with the prevailing approaches assuming low-dimensional structure such as sparsity and low-rankness. This project pursues a paradigm shift towards data-driven techniques, including the replacement of explicit modeling assumptions by implicit generative models based on deep neural networks.
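As a toy illustration of the classical sparsity-based paradigm mentioned above (not code from this project), the sketch below recovers a sparse vector from far fewer linear measurements than its dimension using ISTA, a standard iterative soft-thresholding method for the lasso. The dimensions, regularization weight, and iteration count are arbitrary choices for the example.

```python
import numpy as np

# Recover a k-sparse vector x_true in dimension p from n < p noiseless
# linear measurements y = A x_true, via ISTA (proximal gradient for the lasso).
rng = np.random.default_rng(0)
n, p, k = 80, 200, 5                       # measurements, dimension, sparsity
A = rng.standard_normal((n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[rng.choice(p, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true

lam = 0.01                                 # lasso regularization weight
step = 1.0 / np.linalg.norm(A, 2) ** 2     # step size from the spectral norm
x = np.zeros(p)
for _ in range(500):
    g = x - step * A.T @ (A @ x - y)                           # gradient step
    x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)   # soft-threshold

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(rel_err)  # small relative error despite n < p
```

The data-driven direction described in the project would replace the explicit sparsity assumption here with, for example, a learned generative model constraining the estimate.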

Theory and algorithms for group testing

Group testing is a classical sparse estimation problem that seeks to identify a small number of "defective" items by testing pooled groups of items rather than each item individually, with applications including database systems, communication protocols, and COVID-19 testing. This project seeks to push recent advances further towards settings that better account for crucial practical phenomena, including noisy test outcomes and prior information.
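The pooled-testing idea can be seen in a small sketch (illustrative only, not the project's algorithms): noiseless non-adaptive group testing with a random Bernoulli test design and the simple COMP decoder, which declares an item non-defective whenever it appears in a negative test.

```python
import numpy as np

# n items, k defectives, T pooled tests; each item joins each test
# independently with probability ln(2)/k (a standard design choice).
rng = np.random.default_rng(1)
n, k, T = 100, 3, 40
defective = set(rng.choice(n, k, replace=False))

X = rng.random((T, n)) < np.log(2) / k             # T x n test membership matrix
y = np.array([bool(defective & set(np.flatnonzero(X[t]))) for t in range(T)])

# COMP decoder: any item appearing in a negative test cannot be defective.
estimate = set(range(n))
for t in range(T):
    if not y[t]:
        estimate -= set(np.flatnonzero(X[t]))

print(sorted(estimate))  # a (usually small) superset of the defective set
```

COMP never misses a defective item, so with far fewer than n tests the output is a superset of the true defective set, typically with only a few false positives; noisy outcomes and prior information, as studied in the project, require more refined designs and decoders.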

Theory and algorithms for Bayesian optimization

Bayesian optimization has emerged as a versatile tool for optimizing black-box functions, with particular success in automatically tuning machine learning algorithms (e.g., in the famous AlphaGo program). This project seeks to advance state-of-the-art theory and algorithms, with an emphasis on practical variations that remain less well understood, including adversarial corruptions and high dimensionality.
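A minimal sketch of the core loop (illustrative only; the test function, RBF kernel, and hyperparameters are arbitrary choices, not taken from the project): fit a Gaussian process posterior to the observations so far, then query the point maximizing an upper confidence bound (GP-UCB), trading off exploitation (high mean) and exploration (high uncertainty).

```python
import numpy as np

def rbf(a, b, ls=0.2):
    """RBF kernel matrix between 1-D point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

f = lambda x: np.sin(3 * x) + 0.5 * np.cos(7 * x)  # "black-box" function
grid = np.linspace(0.0, 1.0, 200)                  # candidate query points
X, y = [0.5], [f(0.5)]                             # initial observation
noise, beta = 1e-4, 2.0                            # jitter, UCB exploration weight

for _ in range(15):
    Xa, ya = np.array(X), np.array(y)
    K = rbf(Xa, Xa) + noise * np.eye(len(Xa))
    Ks = rbf(grid, Xa)
    mu = Ks @ np.linalg.solve(K, ya)                        # posterior mean
    var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    ucb = mu + beta * np.sqrt(np.maximum(var, 0.0))         # upper confidence bound
    x_next = grid[int(np.argmax(ucb))]                      # next query point
    X.append(x_next)
    y.append(f(x_next))

print(max(y))  # best value found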




  • Matthew Aldridge, Oliver Johnson, and Jonathan Scarlett, "Group testing: An information theory perspective," Foundations and Trends in Communications and Information Theory, Volume 15, Issue 3-4, pp. 196-392, Dec. 2019.
  • Jonathan Scarlett, "Tight regret bounds for Bayesian optimization in one dimension," International Conference on Machine Learning (ICML), 2018.
  • Ilija Bogunovic, Jonathan Scarlett, Stefanie Jegelka, and Volkan Cevher, "Adversarially robust optimization with Gaussian processes," Conference on Neural Information Processing Systems (NeurIPS), 2018.
  • Jonathan Scarlett, Ilija Bogunovic, and Volkan Cevher, "Lower bounds on regret for noisy Gaussian process bandit optimization," Conference on Learning Theory (COLT), Amsterdam, 2017.
  • Jonathan Scarlett, "Noisy adaptive group testing: Bounds and algorithms," IEEE Transactions on Information Theory, Volume 65, Issue 6, pp. 3646-3661, June 2019.


  • Singapore National Research Foundation (NRF) Fellowship

  • NUS Early Career Research Award


Advanced Topics in Machine Learning