Ph.D. Proposal Oral Exam - Afshin Abdi

Event Details

Wednesday, December 4, 2019

12:30pm - 2:30pm

Room 5234, Centergy


Title: Distributed Learning and Inference in Deep Models


Dr. Fekri, Advisor

Dr. AlRegib, Chair

Dr. Romberg


The objective of the proposed research is to develop new methods for distributed learning and inference in deep models, and to analyze their performance, especially on nodes with limited computational power. We consider two related classes of problems: (1) distributed training of deep models, and (2) compression and restructuring of deep models for efficient deployment and reduced inference time on resource-constrained devices. First, we argue that distributed training can be recast as the central estimation officer (CEO) problem from information theory; building on this connection, we will develop a framework for the compression, communication, and parameter estimation of deep models. Next, for efficient implementation, we observe that neural networks with sparse, local connectivity are better suited to large-scale distribution and parallel execution because of their lower communication requirements. We therefore propose to restructure a neural network by rearranging the neurons in each layer and partitioning the model into sub-models so that the number of connections among sub-models is minimized.
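To illustrate the first direction, the CEO analogy casts each worker as an agent sending a compressed (here, uniformly quantized) observation of the gradient to a server, which forms a central estimate by averaging. This is only a minimal sketch under assumed choices; the quantizer, function names, and bit width below are hypothetical and are not the proposal's actual scheme.

```python
import numpy as np

def quantize(grad, num_bits=8):
    # Hypothetical uniform quantizer: map each entry to one of 2^num_bits levels
    # over the gradient's own [min, max] range.
    levels = 2 ** num_bits - 1
    lo, hi = float(grad.min()), float(grad.max())
    if hi == lo:
        return np.zeros(grad.shape, dtype=np.int64), lo, 0.0
    step = (hi - lo) / levels
    q = np.round((grad - lo) / step).astype(np.int64)  # integer codes to transmit
    return q, lo, step

def dequantize(q, lo, step):
    # Server-side reconstruction of a worker's gradient estimate.
    return lo + q * step

def ceo_aggregate(worker_grads, num_bits=8):
    # CEO-style aggregation: average the reconstructed (noisy, quantized)
    # observations from all workers to estimate the underlying gradient.
    estimates = [dequantize(*quantize(g, num_bits)) for g in worker_grads]
    return np.mean(estimates, axis=0)
```

With 8-bit quantization, each reconstructed gradient deviates from the original by at most half a quantization step, so the aggregate closely tracks the plain average of the workers' gradients while each worker transmits far fewer bits than full-precision floats would require.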

Last revised November 18, 2019