Exploring the Use of Synthetic Gradients for Distributed Deep Learning across Cloud and Edge Resources

Authors: 

Yitao Chen, Kaiqi Zhao, Baoxin Li, and Ming Zhao, Arizona State University

Abstract: 

With the explosive growth of data, largely contributed by the rapidly and widely deployed smart devices on the edge, we need to rethink the training paradigm for learning on such real-world data. The conventional cloud-only approach can hardly keep up with the computational demand of these deep learning tasks, and the traditional backpropagation-based training method also makes it difficult to scale out the training. Fortunately, the continuous advancement of System-on-Chip (SoC) hardware is transforming edge devices into capable computing platforms that can potentially be exploited to address these challenges. These observations motivate this paper's study of the use of synthetic gradients for distributed training across cloud and edge devices. We apply synthetic gradients to various neural network models to comprehensively evaluate their feasibility in terms of accuracy and convergence speed. We distribute the training of the various layers of a model using synthetic gradients, and evaluate its effectiveness on the edge by using resource-limited containers to emulate edge devices. The evaluation results show that the synthetic gradient approach can achieve accuracy comparable to conventional backpropagation for an eight-layer model with both fully-connected and convolutional layers. For a more complex model (VGG16), training suffers some accuracy degradation (up to 15%), but it achieves an 11% improvement in training speed when the layers of the model are decoupled and trained on separate resource-limited containers, compared to training the whole model with the conventional method on a single physical machine.
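To make the decoupling idea concrete, the sketch below illustrates the general synthetic-gradient (decoupled neural interfaces) technique that the paper builds on: each module of a network is paired with a small auxiliary model that predicts the gradient arriving at its output, so the module can update immediately instead of waiting for the true backward pass. This is a minimal single-machine PyTorch sketch of that general idea, not the paper's implementation; the layer sizes, the two-module split, the SGD settings, and the MSE objective for the gradient predictor are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SyntheticGradient(nn.Module):
    # Small auxiliary net that predicts dL/dh for a module's output h.
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
    def forward(self, h):
        return self.net(h)

# Two decoupled modules; in a cloud/edge setting each could run in its own
# container or on its own device. Sizes here are assumptions for illustration.
layer1 = nn.Sequential(nn.Linear(784, 256), nn.ReLU())   # "edge" module
layer2 = nn.Linear(256, 10)                               # "cloud" module
sg1 = SyntheticGradient(256)                               # predicts the gradient at layer1's output

opt1 = torch.optim.SGD(layer1.parameters(), lr=0.01)
opt2 = torch.optim.SGD(list(layer2.parameters()) + list(sg1.parameters()), lr=0.01)

def train_step(x, y):
    # Module 1 updates immediately using the *predicted* gradient,
    # without waiting for the true backward pass from module 2.
    h = layer1(x)
    opt1.zero_grad()
    h.backward(sg1(h.detach()).detach())
    opt1.step()

    # Module 2 computes the task loss and the true gradient at h, and
    # trains the synthetic-gradient model to match that true gradient.
    h_in = h.detach().requires_grad_(True)
    loss = F.cross_entropy(layer2(h_in), y)
    true_grad = torch.autograd.grad(loss, h_in, retain_graph=True)[0]
    sg_loss = F.mse_loss(sg1(h.detach()), true_grad.detach())
    opt2.zero_grad()
    (loss + sg_loss).backward()
    opt2.step()
    return loss.item(), sg_loss.item()

# Example: one training step on random data.
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
train_step(x, y)

Because the two modules only exchange activations and (eventually) true gradients for fitting the predictor, they can be placed in separate resource-limited containers or on separate cloud and edge machines, which is the distributed setting the paper evaluates.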

Open Access Media

USENIX is committed to Open Access to the research presented at our events. Papers and proceedings are freely available to everyone once the event begins. Any video, audio, and/or slides that are posted after the event are also free and open to everyone. Support USENIX and our commitment to Open Access.

BibTeX
@inproceedings {234813,
author = {Yitao Chen and Kaiqi Zhao and Baoxin Li and Ming Zhao},
title = {Exploring the Use of Synthetic Gradients for Distributed Deep Learning across Cloud and Edge Resources},
booktitle = {2nd USENIX Workshop on Hot Topics in Edge Computing (HotEdge 19)},
year = {2019},
address = {Renton, WA},
url = {https://www.usenix.org/conference/hotedge19/presentation/chen},
publisher = {USENIX Association},
month = jul
}