About me

Hi! I am a postdoctoral fellow at the Carl R. Woese Institute for Genomic Biology (IGB) at the University of Illinois Urbana-Champaign (UIUC), where I work on genomic security and privacy with Prof. Carl Gunter and Prof. Olgica Milenkovic. Previously, I completed my MS and PhD in the Ming Hsieh Department of Electrical and Computer Engineering at the University of Southern California, advised by Prof. Salman Avestimehr in the Information Theory and Machine Learning (vITAL) research lab. During my graduate studies, I also collaborated closely with Prof. Murali Annavaram, Prof. Keith Chugg, and Prof. Ramtin Pedarsani. I have been fortunate to gain industry experience through multiple internships: I spent the summers of 2018 and 2019 as a Research Intern at Intel Labs under Dr. Shilpa Talwar and Dr. Nageen Himayat, respectively, and in Summer 2021 I was an Applied Scientist Intern at Amazon Alexa AI under Dr. Clement Chung and Dr. Rahul Gupta. Prior to graduate school, I completed my BTech in Electrical Engineering at the Indian Institute of Technology Kanpur in 2016, where I worked with Prof. Aditya K. Jagannatham in the Multimedia Wireless Networks (MWN) Group.

Outside of research, I enjoy hanging out with friends, watching classic Bollywood movies, and listening to Indian classical music.

(CV available upon request)

Interests
  • Distributed Optimization and Learning
  • Security and Privacy in Data Analytics
  • Biological Data Modeling and Analysis
  • Information and Coding Theory
Education
  • MS/PhD in Electrical and Computer Engineering, 2022

    University of Southern California

  • BTech in Electrical Engineering, 2016

    Indian Institute of Technology Kanpur

Professional Experience
  • Postdoctoral Fellow
    Carl R. Woese Institute for Genomic Biology (IGB), University of Illinois Urbana-Champaign (UIUC)
    Aug 2023 – Present · Urbana, IL
  • Postdoctoral Researcher
    Coordinated Science Laboratory (CSL), University of Illinois Urbana-Champaign (UIUC)
    Jul 2022 – Jul 2023 · Urbana, IL
  • Graduate Research Assistant
    Information Theory and Machine Learning (vITAL) Lab, University of Southern California (USC)
    Aug 2016 – May 2022 · Los Angeles, CA
  • Applied Scientist Intern
    Alexa AI, Amazon
    Jun 2021 – Aug 2021 · Cambridge, MA
  • Graduate Technical Intern
    Wireless Communication Research, Intel Labs
    May 2019 – Aug 2019 · Santa Clara, CA
  • Graduate Technical Intern
    Wireless Communication Research, Intel Labs
    May 2018 – Aug 2018 · Santa Clara, CA
  • International Visiting Student
    IUSSTF-Viterbi Program, USC
    May 2015 – Jul 2015 · Los Angeles, CA
  • Undergraduate Research Assistant
    Multimedia Wireless Networks (MWN) Group, IIT Kanpur
    Aug 2013 – May 2016 · Kanpur, India
  • Undergraduate Research Intern
    Summer Undergraduate Research Grant for Excellence (SURGE), IIT Kanpur
    May 2013 – Jul 2013 · Kanpur, India

Projects
  • Federated Classification in Hyperbolic Spaces via Secure Aggregation of Convex Hulls
    Proposed the first approach to enable privacy-preserving classification in hyperbolic geometry in the federated setting.
  • Machine Unlearning of Federated Clusters
    Proposed the first known unlearning mechanism for federated clustering, with privacy criteria that support simple, provable, and efficient data removal at both the client and server levels.
  • Lottery Aware Sparse Federated Learning
    Presented methodologies for sparse federated learning on resource-constrained edge devices, under both homogeneous and heterogeneous compute budgets.
  • Resource-Constrained Federated Learning of Large Models
    Provided a sub-model training method that enables resource-constrained clients to train large models in federated learning settings.
  • Secure and Fault Tolerant Decentralized Learning
    Proposed a novel sampling-based approach that applies per-client criteria for mitigating faults in the general federated learning setting.
  • Secure Large-Scale Serverless Training at the Edge
    Developed a fast and computationally efficient Byzantine-robust algorithm that leverages sequential, memory-assisted performance criteria for training over a logical ring.
  • Fast and Robust Large-Scale Distributed Gradient Descent
    Developed a practical algorithm for distributed learning that is both communication-efficient and straggler-resilient.
  • Low-Latency Federated Learning in Wireless Edge Networks
    Proposed CodedFedL, which injects structured coding redundancy into non-linear federated learning to mitigate stragglers and speed up training in heterogeneous MEC networks.
  • Hierarchical Decentralized Training at the Edge
    Formulated a problem for decentralized training from data at edge users, incorporating the challenges of straggling and bandwidth-limited communication links.
  • Communication Efficient Large-Scale Graph Processing
    Proposed and implemented a practical MapReduce-based approach for large-scale graph processing.
  • Pre-defined Sparsity for Convolutional Neural Networks
    Proposed the first approach to reduce the footprint of convolutional neural networks via pre-defined sparsity.
  • Optimal Resource Allocation for Cloud Computing
    Developed an efficient approach for load allocation in heterogeneous cloud clusters.

Publications


Journals
  • (2022). Basil: A Fast and Byzantine-Resilient Approach for Decentralized Training. In IEEE Journal on Selected Areas in Communications.
  • (2021). CodedReduce: A Fast and Robust Framework for Gradient Aggregation in Distributed Learning. In IEEE/ACM Transactions on Networking.
  • (2020). Coded computing for low-latency federated learning over wireless edge networks. In IEEE Journal on Selected Areas in Communications.
  • (2020). Coded computing for distributed graph analytics. In IEEE Transactions on Information Theory.
  • (2019). Coded computation over heterogeneous clusters. In IEEE Transactions on Information Theory.

Conferences
  • (2023). Machine Unlearning of Federated Clusters. In ICLR.
  • (2020). Hierarchical coded gradient aggregation for learning at the edge. In ISIT.
  • (2019). pSConv: A pre-defined sparse kernel based convolution for deep CNNs. In Allerton.
  • (2019). Tree gradient coding. In ISIT.
  • (2018). Coded computing for distributed graph analytics. In ISIT.
  • (2017). Coded computation over heterogeneous clusters. In ISIT.

Other Proceedings

Workshops
  • (2020). Coded federated learning. In Globecom.
  • (2019). Coded computing for distributed machine learning in wireless edge network. In VTC.

Selected Talks
  • Trustworthy, Efficient, and Robust Distributed Systems
  • Taming Heterogeneity, the Ubiquitous Beast in Cloud Computing and Decentralized Learning
  • TEE-GPU Cooperative Learning: Privacy and Security Without the Price
  • Federated Deep Learning: On-device Learning for CV and NLP
  • Coded Computing for Federated Learning at the Edge

Selected Awards
  • Qualcomm Innovation Fellowship
  • Most Novel Research Project Award
  • Annenberg PhD Fellowship

Contact