Research

Research Interest

Broadly, I am interested in developing computationally efficient algorithms. These methods have become extremely relevant in today's world of datasets with millions of examples and massively parameterized learning models. More recently, I have been working on sublinear algorithms for matrices, i.e., algorithms that run in time less than the size of the input (the number of matrix entries). Matrices are often used to represent data and the parameters of learning models, and so large datasets and complex models have led to a need for such efficient algorithms. The body of my work has been on pushing the boundaries of sublinear-time and sublinear-query algorithms in the context of matrices and their applications. These applications include model compression, approximation of learning functions, fast search and recommendation systems, near-optimal deep nets, and approximate feature embeddings.
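To make the idea of sublinear time concrete, here is a toy sketch (illustrative only, not taken from any of the papers below) of estimating the average entry of a matrix by sampling a small number of entries uniformly at random; the running time depends on the number of samples rather than on all of the matrix's entries.

```python
import numpy as np

def estimate_mean_entry(A, num_samples=1000, seed=None):
    """Estimate the average entry of A by uniform sampling.

    Reads only `num_samples` entries, so the cost is sublinear in the
    total number of entries whenever num_samples << n * m.
    """
    rng = np.random.default_rng(seed)
    n, m = A.shape
    rows = rng.integers(0, n, size=num_samples)
    cols = rng.integers(0, m, size=num_samples)
    return A[rows, cols].mean()

# The estimate is close to the true mean while touching only a tiny
# fraction of the 10^6 entries of this matrix.
A = np.random.rand(1000, 1000)
print(estimate_mean_entry(A, num_samples=2000), A.mean())
```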

Prior to this, I dabbled (unsuccessfully) in computer vision, working on detecting and recognizing text in maps. I also worked on image synthesis in the kernel-induced space during my master's.

Papers

(Author ordering for the papers below is alphabetical when marked with *.)

Theses/Dissertations

Patents