In cross-device Federated Learning (FL), communication efficiency is of paramount importance. Sparse Ternary Compression (STC) is one of the most effective techniques for substantially reducing the per-round communication cost of FL without significantly degrading the accuracy of the global model: it applies ternary quantization in series with top-k sparsification. In this paper, we propose a novel variant of STC that is specifically designed and implemented for convolutional layers. Our variant builds on the experimental evidence that a pattern exists in the distribution of client updates, namely, the differences between the received global model and the locally trained model. In particular, we have experimentally found that the largest (in absolute value) updates for convolutional layers tend to form kernel-wise clusters. Our key novel idea is therefore to restrict a priori the elements of STC updates to lie on such a structured pattern, which allows us to further reduce the STC communication cost. We have designed, implemented, and evaluated this technique, called Structured Sparse Ternary Compression (SSTC). Reported experimental results show that SSTC shrinks compressed updates by a factor of ×3 with respect to traditional STC, and by up to ×104 with respect to uncompressed FedAvg, at the expense of a negligible degradation of global model accuracy.
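
The sketch below contrasts plain STC with the kernel-wise structured selection described in the abstract. It is a minimal NumPy sketch under stated assumptions: the function names (stc_compress, sstc_compress), the L1-norm kernel score, and the 1% sparsity level are illustrative choices, not the paper's reference implementation.

    import numpy as np

    def stc_compress(update, sparsity=0.01):
        # Sparse Ternary Compression: keep the top-k elements by magnitude,
        # then replace each survivor with the signed mean magnitude mu, so
        # the compressed update takes values in {-mu, 0, +mu}.
        flat = update.ravel()
        k = max(1, int(sparsity * flat.size))
        idx = np.argpartition(np.abs(flat), -k)[-k:]
        mu = np.abs(flat[idx]).mean()
        out = np.zeros_like(flat)
        out[idx] = mu * np.sign(flat[idx])
        return out.reshape(update.shape)

    def sstc_compress(conv_update, sparsity=0.01):
        # Structured variant for convolutional layers: score whole kernels
        # (here by L1 norm, an assumed criterion), keep the top fraction of
        # kernels, and ternarize only their elements, so that the nonzeros
        # cluster kernel-wise instead of being scattered element-wise.
        out_ch, in_ch, kh, kw = conv_update.shape
        kernels = conv_update.reshape(out_ch * in_ch, kh * kw)
        scores = np.abs(kernels).sum(axis=1)
        k = max(1, int(sparsity * scores.size))
        keep = np.argpartition(scores, -k)[-k:]
        mu = np.abs(kernels[keep]).mean()
        out = np.zeros_like(kernels)
        out[keep] = mu * np.sign(kernels[keep])
        return out.reshape(conv_update.shape)

    # Toy usage on a fake client update for a 64x32x3x3 convolutional layer.
    rng = np.random.default_rng(0)
    delta = rng.normal(size=(64, 32, 3, 3)).astype(np.float32)
    print(np.count_nonzero(stc_compress(delta)))   # ~184 scattered nonzeros
    print(np.count_nonzero(sstc_compress(delta)))  # ~180 nonzeros in 20 kernels

Intuitively, because the structured variant keeps whole kernels, the positions of the nonzeros can be encoded with one index per surviving kernel rather than one per surviving element, which is where the additional compression over plain STC can come from.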


    Title :

    Structured Sparse Ternary Compression for Convolutional Layers in Federated Learning


    Contributors :


    Publication date :

    2022-06-01


    Size :

    4889138 bytes


    Type of media :

    Conference paper


    Type of material :

    Electronic Resource


    Language :

    English



    Similar titles :

    Robust Visual Tracking via Structured Multi-Task Sparse Learning

    Zhang, T. / Ghanem, B. / Liu, S. et al. | British Library Online Contents | 2013


    Symbolic Task Compression in Structured Task Learning

    Saveriano, Matteo / Seegerer, Michael / Caccavale, Riccardo et al. | German Aerospace Center (DLR) | 2019


    Analyzing Convergence Aspects of Federated Learning: More Devices or More Network Layers?

    Khan, Fazal Muhammad Ali / Hassan, Syed Ali / Ansari, Rafay Iqbal et al. | IEEE | 2022


    Lottery Hypothesis based Unsupervised Pre-training for Model Compression in Federated Learning

    Itahara, Sohei / Nishio, Takayuki / Morikura, Masahiro et al. | IEEE | 2020