
EE Seminars

Towards Intelligent 6G-and-Beyond Systems: Distributed Machine Learning over Heterogeneous Wireless Networks


Date:  Fri, March 04, 2022
Time:  9:30am - 10:30am
Location:  Holmes 389; also available online, see registration info
Speaker:  Dr. Seyyedali Hosseinalipour, Purdue University

Available online; register here for connection info: https://forms.gle/yeGtuLSFYqgbEJg86

Abstract
Recent advancements in edge devices (e.g., smartphones, unmanned aerial vehicles (UAVs), and smart cars) have allowed them to generate tremendous amounts of data. This data can be used for a variety of applications, such as (i) image recognition on mobile devices, (ii) smart agriculture, and (iii) autonomous driving, the goal of which is to bring "intelligence" to the network. Each of these applications requires training a high-performance machine learning model (e.g., a deep neural network) on the data collected by the edge devices. To achieve this goal, conventional machine learning techniques rely on a centralized model training architecture, in which the data of the edge devices are pulled to a central location, coexisting with a cloud server, where model training is performed. However, in many applications the transfer of raw data from the edge devices imposes high latency; in others, users are unwilling to share their data (e.g., for privacy reasons). This has motivated the area of distributed machine learning, in which model training is distributed across the network elements.

In this talk, I will first introduce the pioneering technique in distributed machine learning proposed by researchers at Google, called federated learning. I will then focus on one aspect of my research concerned with migrating from this architecture and extending it along three dimensions: network, heterogeneity, and proximity. I will present two alternative distributed machine learning architectures: (i) semi-decentralized federated learning and (ii) parallel successive learning, both partly inspired by federated learning but adapted and optimized for implementation over the next generation of heterogeneous wireless networks. I will demonstrate novel convergence results for convex/non-convex loss functions in distributed machine learning, new network optimization formulations, and the synergies between machine learning model performance and network optimization techniques.
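For readers unfamiliar with federated learning, the core aggregation rule (federated averaging, or FedAvg) can be illustrated with a minimal sketch. This is not the speaker's method; it is a generic toy example, assuming a one-parameter least-squares model and hypothetical client datasets, showing how a server averages locally trained models weighted by client data size without ever collecting the raw data:

```python
# Minimal sketch of one round of federated averaging (FedAvg).
# Each client takes a local gradient step on its own data; the server
# then averages the resulting models, weighted by client dataset size.
# Model and data are hypothetical: a scalar linear model y = w * x.

def local_update(w, data, lr=0.1):
    """One gradient step of least-squares loss on (x, y) pairs."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def fedavg_round(global_w, client_datasets, lr=0.1):
    """Server broadcasts global_w; clients train locally on private
    data; server aggregates proportionally to dataset sizes."""
    sizes = [len(d) for d in client_datasets]
    local_ws = [local_update(global_w, d, lr) for d in client_datasets]
    return sum(n * w for n, w in zip(sizes, local_ws)) / sum(sizes)

# Two clients whose (private) data both follow y = 2x; repeated
# FedAvg rounds drive the global model w toward 2 without the server
# ever seeing the raw (x, y) pairs.
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(50):
    w = fedavg_round(w, clients)
```

The semi-decentralized and parallel successive architectures discussed in the talk generalize this server-centric aggregation step across heterogeneous network topologies.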

Bio

Seyyedali Hosseinalipour received the B.S. degree in electrical engineering from Amirkabir University of Technology, Tehran, Iran, in 2015, with high honors and top-rank recognition. He received his M.S. and Ph.D. degrees in electrical engineering from North Carolina State University, NC, USA, in 2017 and 2020, respectively. He received both of the most prestigious awards offered by the ECE department of North Carolina State University: the ECE Doctoral Scholar of the Year Award (2020) and the ECE Distinguished Dissertation Award (2021).

He is currently a postdoctoral researcher at Purdue University, IN, USA, where he is developing the new topic area of "Intelligent 5G/6G Networks." His research interests include the analysis of modern wireless networks and communication systems, distributed machine learning, and optimization of next-generation intelligent networks.

He has also served as TPC Co-Chair of workshops at IEEE INFOCOM 2021 and IEEE GLOBECOM 2021, as a TPC member of a workshop at IEEE ICC 2021, and as a program committee member of IEEE MSN.
