I am a third-year M.S./Ph.D. student in the School of Electrical and Computer Engineering at Purdue University, specializing in machine learning and artificial intelligence, advised by Dr. Christopher Brinton. I have also actively collaborated with Dr. Qiang Qiu, Dr. Saurabh Bagchi, and Dr. Seyyedali Hosseinalipour.
I received my Bachelor’s degree in Electrical and Electronics Engineering from the National Institute of Technology Karnataka in 2015. My Bachelor’s thesis, “Supervised and Unsupervised Techniques for Image Segmentation,” was advised by Dr. Ashvini Chaturvedi. During my undergraduate studies, I also collaborated with Dr. K Manjunatha Sharma on several mini-projects developing smart switches using signal processing and ML techniques.
Prior to joining Purdue, I worked for a year as a Research Scientist at Foundation AI, a California-based AI-solutions startup, and for over three years as a Data Scientist at Practo, a Bangalore-based healthcare startup. In industry, I built scalable ML solutions deployed in production, tackling real-world problems by leveraging advances in computer vision (CV), natural language processing (NLP), and deep learning (DL). These solutions also led to academic publications, specifically in document classification and low-latency information retrieval (details listed in the Experience section).
My research interests lie at the intersection of representation learning, density estimation, and optimization. One of the main focuses of my work is reducing the dependence on large labeled datasets in model training by developing algorithms that are predominantly unsupervised or self-supervised. I am also interested in the design and analysis of representation learning algorithms that are scalable, communication-efficient, memory-efficient, and robust to adversarial perturbations. I have recently started exploring deep reinforcement learning and its integration with unsupervised learning. More on that soon!
- May, 2022
- Selected as one of eight AI/ML Residents at Apple for 2022.
- Successfully defended my M.S. thesis, “Towards Privacy and Communication Efficiency in Distributed Representation Learning.”
- Submitted our paper “Efficient Federated Domain Translation” to NeurIPS 2022.
- March, 2022
- January, 2022
- Our paper “Recycling Model Updates in Federated Learning: Are Gradient Subspaces Low-Rank?” was accepted at the International Conference on Learning Representations (ICLR) 2022.
- Our paper “Can we Generalize and Distribute Private Representation Learning?” was accepted at the International Conference on Artificial Intelligence and Statistics (AISTATS) 2022.