About me
I am an applied research scientist on the GTAR team at JPMorgan Chase, where I work on randomized methods with applications in quantum-inspired and machine learning algorithms.
Previously, I was a postdoc in the Medical Physics group at the Sloan Kettering Institute, where I was hosted by Masoud Zarepisheh and Mojtaba Tefagh. Before that, I graduated with a PhD from the Manning College of Information and Computer Sciences at the University of Massachusetts Amherst, where I was extremely fortunate to be advised by Cameron Musco and proud to be a member of the Theory Group. My work focuses on approximating matrix methods using fast linear algebra and online learning, with applications to real-world problems. Broadly, it can be seen as a set of tools for a variety of tasks in machine learning, and in neural networks in particular, including model compression, designing fast learning functions, and the study of memorization. I am especially interested in the overarching goal of finding adaptive learning functions for large datasets.
Prior to this, I worked as a Visiting Researcher in the Electronics and Communication Sciences Unit at the Indian Statistical Institute, Kolkata, with Professor Dipti Prasad Mukherjee. I completed my Master's in Computer Science at the same institute in 2015. My primary research interest was in non-linear learning models for pre-image computation on image datasets.
In my leisure time I enjoy playing guitar, playing badminton, and birding. I am an ardent fan of absurdist fiction, philosophical fiction, epic fiction, and graphic novels. I am also an absolute fan of Seinfeld and consider it the pinnacle of observational and conversational comedy.
News
- (October 2024) Our paper on Improved Spectral Density Estimation via Explicit and Implicit Deflation was accepted for publication at SODA 2025.
- (January 2024) Our extended paper on Sublinear Time Eigenvalue Approximation via Random Sampling was accepted for publication in Algorithmica (2024).
- (November 2023) Our paper on Universal Matrix Sparsifiers and Fast Deterministic Algorithms for Linear Algebra was accepted at ITCS 2024.
- (September 2023) I successfully defended my PhD thesis. Will post slides soon!
- (July 2023) Presented our work on Sublinear Time Eigenvalue Approximation via Random Sampling at ICALP 2023.
- (April 2023) Our paper on Sublinear Time Eigenvalue Approximation via Random Sampling was accepted for publication at ICALP 2023.
- (February 2023) I passed my Thesis Proposal Defense.
- (January 2023) Awarded CICS Dissertation Writing Fellowship for Spring 2023.
- (August 2022) Presented our work on Sublinear Time Eigenvalue Approximation via Random Sampling at FODSI Sublinear Algorithms Workshop 2022. [link to poster]
- (January 2022) Awarded AAAI-22 Student Scholarship.
- (December 2021) Our paper on Sublinear Time Approximation of Text Similarity Matrices was accepted for publication at AAAI 2022.
- (August 2021) Presented our work on Estimating Eigenvalues of Symmetric Matrices using Random Submatrices at WALD(O) 2021. [link to poster]
- (June 2021) Presented our work on Kernel approximation in sliding window models at WOLA 2021. [link to poster]
About this website
Powered by Jekyll. Theme by AcademicPages, a fork of Minimal Mistakes; some edits, including this acknowledgement, were borrowed from Chris Severen, Ian Gemp, and Blossom Metevier. Hosted on GitHub Pages.