Algorithms & Theory

At the heart of all computer programs lie the mathematical foundations of computer science – theory and algorithms.

We study these theoretical concepts to understand the fundamental capabilities and limitations of the computational tools we work with, and we use these insights to develop better software and algorithms.

What We Do

• Study the theory behind various algorithms and explore different algorithm design paradigms.
• Design and evaluate algorithms for solving real-world problems.

Our Research Projects

A Hybrid Approach to Automatic Programming

Prateek SAXENA

The project introduces an innovative approach that combines traditional program analysis, neural machine translation, and human guidance to enhance accuracy and generalization in automated programming tasks, thereby making coding accessible to non-experts.


Safety and Reliability in Black-Box Optimization

Jonathan SCARLETT

This project seeks to enhance safety, reliability, and robustness in black-box optimization, exploring new function structures and addressing the limitations of existing methods. This includes extending decision-making frameworks to grey-box settings and multi-agent learning, using a methodology that blends theoretical analysis and algorithm development.

  • Optimisation

Domination Properties, Ramsey-Type Theorems and Reverse Mathematics

Frank Christian STEPHAN

This project explores the proof-theoretic strength of tree generalizations of Ramsey's Theorem, investigating links to induction strength and nonstandard models. It addresses recursion theory and growth behavior, constructs co-r.e. trees, and poses hypotheses to advance understanding in mathematical logic.
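
For background, the classical infinite Ramsey Theorem, of which the project studies tree generalizations (the precise variants are not spelled out in this summary), states:

```latex
% Infinite Ramsey Theorem (RT^n_k), stated for background only:
% every k-colouring of the n-element subsets of N admits an infinite homogeneous set.
\forall n, k \ge 1 \;\; \forall c : [\mathbb{N}]^{n} \to \{1, \dots, k\} \;\;
\exists H \subseteq \mathbb{N} \text{ infinite} \;\; \text{such that } c \text{ is constant on } [H]^{n}.
```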


Scalable Methodologies for Risk Assessment and Resilience of Critical Infrastructure During Pandemics

Divesh AGGARWAL

The COVID-19 pandemic has significantly affected civic infrastructure, particularly public transportation. This project develops scalable methodologies for risk assessment and resilience, improving the adaptability and robustness of transportation services in the post-COVID world.


Non-Malleability and Randomness - Foundations of Leakage and Tamper Resilient Cryptography

Divesh AGGARWAL

This project aims to strengthen the security of cryptographic devices against active adversaries and combined leakage-and-tampering attacks by constructing non-malleable objects such as codes and extractors. It explores the interplay between randomness, leakage, and tamper-resilient cryptography, aiming to improve on the current state of the art.
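
For background, a standard formulation of non-malleable codes from the literature (not specific to this project): a coding scheme (Enc, Dec) is non-malleable against a family F of tampering functions if tampering with a codeword yields a decoded message that is either the original message or essentially unrelated to it.

```latex
% Tampering experiment for a message m and a tampering function f in F:
\mathrm{Tamper}^{f}_{m} \;:=\; \bigl\{\, c \leftarrow \mathrm{Enc}(m);\;\; \tilde{c} := f(c);\;\; \text{output } \mathrm{Dec}(\tilde{c}) \,\bigr\}
% Non-malleability: for every f in F there is a simulator distribution D_f, not depending on m, with
\mathrm{Tamper}^{f}_{m} \;\approx\; \bigl\{\, d \leftarrow D_{f};\;\; \text{output } m \text{ if } d = \mathsf{same}^{*}, \text{ else } d \,\bigr\} \quad \text{for all } m.
```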


Fault-tolerant Graph Structures: Efficient Constructions and Optimality

Diptarka CHAKRABORTY

This project aims to improve solutions to graph problems in the presence of network failures by developing fault-tolerant constructions and optimized graph structures. Through novel approaches, it contributes to algorithmic understanding, graph theory, and real-life applications.
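
As a toy illustration of the single-failure setting (our own simplified example, not the project's constructions), the sketch below recomputes a shortest path after one edge fault using BFS; fault-tolerant structures aim to answer such queries without rerunning a search from scratch.

```python
from collections import deque

def bfs_shortest_path(adj, s, t, blocked=frozenset()):
    """Shortest s-t path by BFS, ignoring edges in `blocked` (a set of (u, v) pairs)."""
    parent = {s: None}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            path = [t]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return path[::-1]
        for v in adj[u]:
            if v not in parent and (u, v) not in blocked and (v, u) not in blocked:
                parent[v] = u
                queue.append(v)
    return None  # t is unreachable once the blocked edge is removed

# Toy undirected graph given as an adjacency list.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
print(bfs_shortest_path(adj, 0, 3))                    # e.g. [0, 1, 3]
print(bfs_shortest_path(adj, 0, 3, blocked={(1, 3)}))  # replacement path [0, 2, 3]
```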


You See the Whole Picture When the Puzzle Pieces Match

YU Haifeng


Massively Parallel Aggregation in Dynamic Networks

YU Haifeng


Rank Aggregation: Fairness and Computational Challenges

Diptarka CHAKRABORTY

  • Combinatorial Algorithms, Optimisation

Handling Massive Data under the Edit Metric: Clustering, Finding Median and Computational Hardness

Diptarka CHAKRABORTY

  • Combinatorial Algorithms, Optimisation

Computational Hardness Assumptions and the Foundations of Cryptography

Prashant Nalini VASUDEVAN

This program seeks to broaden and diversify the foundations of cryptography by identifying new plausible computational hardness assumptions that can be used to construct cryptosystems. Our current approach is to study and construct "fine-grained" cryptographic primitives based on the conjectured hardness of various well-studied algorithmic problems.

  • Complexity Theory, Cryptography
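
For illustration only (the summary above does not name specific problems), one well-studied problem whose conjectured hardness is used in fine-grained complexity and related cryptographic constructions is Orthogonal Vectors: given two sets of n Boolean vectors, decide whether some pair is orthogonal. The brute-force check below runs in roughly n^2 * d time, and the Orthogonal Vectors conjecture asserts that no algorithm is substantially faster in the worst case.

```python
from itertools import product

def has_orthogonal_pair(A, B):
    """Brute-force Orthogonal Vectors: is there a in A, b in B with <a, b> = 0?

    A and B are lists of equal-length 0/1 tuples; the runtime is O(n^2 * d).
    """
    return any(all(x * y == 0 for x, y in zip(a, b)) for a, b in product(A, B))

A = [(1, 0, 1), (0, 1, 1)]
B = [(1, 1, 0), (0, 1, 0)]
print(has_orthogonal_pair(A, B))  # True: (1, 0, 1) and (0, 1, 0) are orthogonal
```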

Algorithmic Solutions for Fair Resource Allocation

Warut SUKSOMPONG


Computational Methods for Tournament Selection

Warut SUKSOMPONG


Algorithmic fairness in machine learning

Reza SHOKRI


Auditing data privacy (in machine learning)

Reza SHOKRI


Theoretical foundations of data privacy in machine learning

Reza SHOKRI


Robustness and security in machine learning

Reza SHOKRI


Model explanations and interpretable machine learning

Reza SHOKRI


Theory and algorithms for Bayesian optimization

Jonathan SCARLETT

Bayesian optimization has emerged as a versatile tool for optimizing black-box functions, with particular success in automating machine learning algorithms (e.g., in the famous AlphaGo program). This project seeks to advance the state-of-the-art theory and algorithms, with an emphasis on practical variations that remain lesser-understood, including adversarial corruptions and high dimensionality.

  • Learning Theory, Optimisation
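
A minimal sketch of a generic Bayesian-optimization loop (a Gaussian-process surrogate with an upper-confidence-bound acquisition rule over a candidate grid; this is a standard textbook recipe, not the project's algorithms), assuming NumPy and scikit-learn are available:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def black_box(x):
    """Unknown objective; here a toy 1-D function standing in for an expensive experiment."""
    return -(x - 0.3) ** 2 + 0.05 * np.sin(20 * x)

rng = np.random.default_rng(0)
candidates = np.linspace(0, 1, 200).reshape(-1, 1)

# Start from a few random evaluations.
X = rng.uniform(0, 1, size=(3, 1))
y = black_box(X).ravel()

for _ in range(10):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.1), alpha=1e-6).fit(X, y)
    mean, std = gp.predict(candidates, return_std=True)
    ucb = mean + 2.0 * std                      # upper confidence bound acquisition
    x_next = candidates[np.argmax(ucb)].reshape(1, 1)
    X = np.vstack([X, x_next])
    y = np.append(y, black_box(x_next).ravel())

print("best x found:", X[np.argmax(y)], "value:", y.max())
```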

Modern methods for high-dimensional estimation and learning

Jonathan SCARLETT

Extensive research has led to a variety of powerful techniques for high-dimensional learning, with the prevailing approaches assuming low-dimensional structure such as sparsity and low-rankness. This project pursues a paradigm shift towards data-driven techniques, including the replacement of explicit modeling assumptions by implicit generative models based on deep neural networks.

  • Learning Theory
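
As a small illustration of the classical sparsity-based setting that the project moves beyond (our own toy example using scikit-learn's Lasso), sparse linear regression recovers a high-dimensional signal from relatively few measurements:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, d, k = 60, 200, 5                  # 60 measurements, 200 dimensions, 5 nonzeros

beta = np.zeros(d)
beta[rng.choice(d, size=k, replace=False)] = rng.normal(0, 3, size=k)  # sparse ground truth
A = rng.normal(size=(n, d))
y = A @ beta + 0.01 * rng.normal(size=n)

estimate = Lasso(alpha=0.1).fit(A, y).coef_   # l1-regularised least squares
support = np.flatnonzero(np.abs(estimate) > 0.1)
print("true support:    ", np.flatnonzero(beta))
print("recovered support:", support)
```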

Theory and algorithms for group testing

Jonathan SCARLETT

Group testing is a classical sparse estimation problem that seeks to identify "defective" items by testing groups of items in pools, with applications including database systems, communication protocols, and COVID-19 testing. This project seeks to push recent advances further towards settings that better account for crucial practical phenomena, including noisy outcomes and prior information.

  • Combinatorial Algorithms, Information Theory & Coding, Learning Theory
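
A toy noiseless example of the pooled-testing idea, using the simple COMP decoding rule (any item that appears in a negative test is declared non-defective); this is a standard baseline, not the project's methods:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, t = 100, 2, 30                     # 100 items, 2 defectives, 30 pooled tests

defectives = set(rng.choice(n, size=k, replace=False))
pools = rng.random((t, n)) < 0.05        # each item joins each pool independently
outcomes = [bool(set(np.flatnonzero(pool)) & defectives) for pool in pools]

# COMP: an item remains a candidate defective unless it appears in some negative pool.
cleared = set()
for pool, positive in zip(pools, outcomes):
    if not positive:
        cleared |= set(np.flatnonzero(pool))
candidates = set(range(n)) - cleared

print("true defectives:", sorted(defectives))
print("COMP candidates:", sorted(candidates))  # a superset of the defectives in the noiseless case
```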

Information-theoretic limits of statistical inference and learning problems

Jonathan SCARLETT

The field of information theory was introduced to understand the fundamental limits of data compression and transmission, and has shaped the design of practical communication systems for decades. This project pursues the emerging perspective that information theory is not only a theory of communication, but a far-reaching theory of data benefiting diverse inference and learning problems.

  • Information Theory & Coding, Learning Theory
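
One canonical tool in this line of work (a standard result, stated here for background) is Fano's inequality, which lower-bounds the error probability of any estimator of a uniformly distributed quantity V over M hypotheses given observations Y:

```latex
\mathbb{P}\bigl[\hat{V} \neq V\bigr] \;\ge\; 1 - \frac{I(V;Y) + \log 2}{\log M},
```

where I(V;Y) denotes the mutual information between the hypothesis and the observations.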

Robustness considerations in machine learning

Jonathan SCARLETT

Robustness requirements pose many of the most important unsolved challenges in modern machine learning, arising from sources of uncertainty such as mismatched models, corrupted data, and adversaries. This project seeks to better understand some of the most practically pertinent sources of uncertainty and develop new algorithms that are robust in the face of this uncertainty.

  • Learning Theory

Our Research Groups

Deep Learning Lab

Kenji KAWAGUCHI

Our lab aims to establish a positive feedback loop between theory and practice, accelerating the development of practical deep learning methods and contributing to the understanding of intelligence.

  • Optimisation

Information Theory and Statistical Learning Group

Jonathan SCARLETT

Our group performs research at the intersection of information theory, machine learning, and high-dimensional statistics, with ongoing areas of interest including information-theoretic limits of learning, adaptive decision-making under uncertainty, scalable algorithms for large-scale inference and learning, and robustness considerations in machine learning.

  • Information Theory & Coding, Learning Theory, Optimisation

STeAdS Virtual Group

Ganesh NEELAKANTA IYER

Software Engineering and Technological Advancements for Society: a virtual group that applies software engineering practices and technological advancements (cloud computing, artificial intelligence, including EdgeAI and ML) to benefit various aspects of society (healthcare, education, art & culture). The group is looking for students to collaborate on different projects; see ganeshniyer.github.io for details.

  • Algorithmic Game Theory

Data Privacy and Trustworthy Machine Learning Lab

Reza SHOKRI