
CSE-IITD @ ICLR 2025!🎉


CSE-IITD had a couple of research papers to showcase at the 13th International Conference on Learning Representations (ICLR) 2025, held at the Singapore EXPO.

The first one is from a team – Mridul Gupta, Samyak Jain, Vansh Ramani, Hariprasad Kodamana and Prof. Sayan Ranu – from the Department. Their paper, “Bonsai: Gradient-Free Graph Condensation for Node Classification”, addresses a fundamental scalability challenge in graph neural network (GNN) training. Traditional graph condensation methods suffer from a critical flaw: they require training a full GNN on the original dataset to extract gradients, defeating the very purpose of condensation. The team’s approach takes a different path entirely.

Bonsai introduces a novel gradient-free methodology that constructs condensed graphs by selecting representative computation trees – the structures that capture how GNNs propagate information during message passing. Using the Weisfeiler-Lehman kernel to measure structural similarity between computation trees, the team developed a greedy selection algorithm that identifies exemplar trees with high representative power and diversity. This approach, grounded in submodular optimization theory, generates compact graphs that maintain the learning capacity of the original dataset while being completely model-agnostic.
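As a rough illustration of the greedy step (not the authors' implementation), a submodular facility-location-style objective can be maximized greedily over a precomputed pairwise similarity matrix. In this hypothetical sketch, `sim[i, j]` stands in for the Weisfeiler-Lehman kernel value between computation trees `i` and `j`, and all names are assumptions for the example:

```python
import numpy as np

def greedy_exemplar_selection(sim, k):
    """Greedily pick k exemplar trees maximizing a facility-location
    objective: sum over all trees of their best similarity to any
    selected exemplar. This objective is monotone submodular, so the
    greedy choice enjoys the classic (1 - 1/e) approximation guarantee."""
    n = sim.shape[0]
    selected = []
    # coverage[j] = best similarity of tree j to any exemplar picked so far
    coverage = np.zeros(n)
    for _ in range(k):
        # marginal gain of each candidate i: how much total coverage
        # improves if tree i is added to the selected set
        gains = np.maximum(sim, coverage).sum(axis=1) - coverage.sum()
        gains[selected] = -np.inf        # never re-pick an exemplar
        best = int(np.argmax(gains))
        selected.append(best)
        coverage = np.maximum(coverage, sim[best])
    return selected
```

Because gains shrink as coverage grows, the first picks are highly representative trees and later picks favor diversity, which matches the representative-power-plus-diversity intuition described above.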

Their conference experience!

Presenting at ICLR gave the team exposure to world-leading researchers in machine learning and graph neural networks. The work sparked in-depth discussions with faculty and researchers from ETH Zurich, Google DeepMind, Microsoft Research, and others, particularly around Bonsai’s theoretical guarantees and computational complexity. Informal interactions with pioneers like Prof. Michael Bronstein, who personally visited the Bonsai poster, were especially meaningful for the team.


The second one is special, as its lead author is our undergraduate student Poojan Shah (cs1221594). Poojan has been working under the guidance of Prof. Ragesh Jaiswal, and he presented their research paper titled “Quantum Inspired D² Sampling with Applications”.

The research centered on the idea of “dequantization”, a framework for converting certain quantum machine learning (QML) algorithms into fast classical algorithms. Many QML algorithms assume that the input data is provided in a QRAM (Quantum Random Access Memory). The classical analogue is a “Sample and Query Access” (SQ) data structure, which is built with linear-time pre-processing and allows one to sample from a distribution equivalent to measuring the quantum state encoded by the QRAM. The team applied this framework to clustering problems, particularly the k-means++ algorithm: they first gave a quantum algorithm for k-means++ and then dequantized it to obtain a fast rejection-sampling-based classical algorithm for k-means++.
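For context, classical D² sampling is the seeding step of k-means++: each new center is drawn with probability proportional to its squared distance from the nearest center chosen so far. The sketch below shows that baseline procedure only, not the paper's quantum or rejection-sampling algorithm, and its function name is hypothetical:

```python
import numpy as np

def d2_sampling(X, k, rng=None):
    """k-means++ seeding via D² sampling on an (n, d) data matrix X.
    Returns k initial centers drawn from the D² distribution."""
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    centers = [X[rng.integers(n)]]                    # first center: uniform
    d2 = ((X - centers[0]) ** 2).sum(axis=1)          # squared distances
    for _ in range(k - 1):
        probs = d2 / d2.sum()                         # the D² distribution
        idx = rng.choice(n, p=probs)                  # sample a new center
        centers.append(X[idx])
        # keep, for every point, the squared distance to its nearest center
        d2 = np.minimum(d2, ((X - X[idx]) ** 2).sum(axis=1))
    return np.array(centers)
```

Each round costs O(nd) here; the point of the SQ/rejection-sampling machinery is to avoid paying this full per-round scan after the one-time pre-processing.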

Reflecting on his first international conference, Poojan shared –

“Since ICLR is one of the largest ML conferences, it attracts a large interdisciplinary audience. I was able to interact and discuss our research with people working in algorithms, clustering, and quantum computing during the poster sessions, along with being able to gain a broader perspective on machine learning through the keynote talks, which are targeted towards a wider audience. Since this was my first international research event, it allowed me to learn how to network with other researchers to try and build future collaborations. I am really grateful for this opportunity.”

Through his participation, Poojan not only showcased IIT Delhi’s contributions at the global frontiers of AI and quantum-inspired computation but also set an example for undergraduate research engagement at the institute.

The participation and travel of students were supported by the CSE Department and the CSE Alumni Research Acceleration Fund, which continues to enable CSE – IIT Delhi students to present cutting-edge ideas at international platforms.