Research

Research Interests

Broadly, I am interested in approximating large matrices. These methods, long prevalent in the literature, have become especially relevant in today's world of million-scale datasets and in applications including, but not limited to, NLP. Specifically, I am interested in designing matrix approximations that run in sublinear time, i.e., algorithms whose operation count grows faster than the number of data points but slower than its square, so that the full matrix is never formed. Applications include the study of quadratic forms and differential equations.

Along similar lines, a particular area of interest for me is kernel approximation. A kernel matrix is usually PSD (though sometimes indefinite, as in reproducing kernel Krein spaces) and stores the pairwise inner products of the points in a dataset. These inner products may be taken in the input space, but are more often taken in a high-dimensional feature space induced by a mapping function, so computing them for big datasets is again non-trivial. Approximating kernel matrices when the inducing map changes, or when points are added to or removed from the dataset (removal arises, for instance, under the GDPR's right to erasure), is of active interest to the community. My overarching research goal is to address these problems with techniques borrowed from linear algebra and functional analysis.
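
As a concrete illustration of this setting, below is a minimal sketch of the classical Nyström method, which builds a rank-m approximation of an n × n PSD kernel matrix while evaluating only m of its columns. The kernel choice, function names, and parameters here (rbf_kernel, nystrom, gamma, the uniform landmark sampling) are illustrative assumptions for the sketch, not a description of any specific result of mine.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel between the rows of X and the rows of Y."""
    sq_dists = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq_dists)

def nystrom(X, m, gamma=1.0, seed=0):
    """Rank-m Nystrom approximation of the n x n kernel matrix of X.

    Only m columns of the kernel matrix are ever evaluated, so the cost
    is O(n*m) kernel evaluations plus an O(m^3) pseudo-inverse, rather
    than the O(n^2) work needed to form the full matrix.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    landmarks = rng.choice(n, size=m, replace=False)  # uniform sampling
    C = rbf_kernel(X, X[landmarks], gamma)            # n x m kernel slice
    W = C[landmarks]                                  # m x m landmark block
    # pinv rather than inv: W is often numerically rank-deficient.
    return C @ np.linalg.pinv(W) @ C.T                # rank-m approx. of K

# Example: 2,000 points in 20 dimensions, approximated from 100 columns.
X = np.random.default_rng(1).normal(size=(2000, 20))
K_hat = nystrom(X, m=100, gamma=0.1)
```

Uniform landmark sampling is the simplest choice; schemes such as leverage-score sampling trade extra computation for stronger approximation guarantees, and keeping such approximations cheap to update as the kernel or the dataset changes is exactly the kind of question described above.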

Prior to this, I dabbled, unsuccessfully, in computer vision, where I worked on detecting and recognizing text in maps. During my master's, I also worked on image synthesis in the kernel-induced feature space.

Papers

(Author ordering for the papers below is alphabetical where marked with a *.)

Patents