A Better Alternative to Error Feedback for Communication-Efficient Distributed Learning

Type: Preprint

Publication Date: 2020-01-01

Citations: 0

DOI: https://doi.org/10.48550/arxiv.2006.11077

Locations

  • arXiv (Cornell University) — PDF available
  • King Abdullah University of Science and Technology Repository — PDF available
  • DataCite API

Similar Works

  • A Better Alternative to Error Feedback for Communication-Efficient Distributed Learning (2020). Samuel Horváth, Peter Richtárik.
  • EF-BV: A Unified Theory of Error Feedback and Variance Reduction Mechanisms for Biased and Unbiased Compression in Distributed Optimization (2022). Laurent Condat, Kai Yi, Peter Richtárik.
  • Error Compensated Distributed SGD Can Be Accelerated (2020). Xun Qian, Peter Richtárik, Tong Zhang.
  • Bidirectional compression in heterogeneous settings for distributed or federated learning with partial participation: tight convergence guarantees (2020). Constantin Philippenko, Aymeric Dieuleveut.
  • On Biased Compression for Distributed Learning (2020). Aleksandr Beznosikov, Samuel Horváth, Peter Richtárik, Mher Safaryan.
  • Analysis of Error Feedback in Federated Non-Convex Optimization with Biased Compression (2022). Xiaoyun Li, Ping Li.
  • Accelerated Methods with Compression for Horizontal and Vertical Federated Learning (2024). Sergey Stanko, Timur Karimullin, Aleksandr Beznosikov, Alexander Gasnikov.
  • Artemis: tight convergence guarantees for bidirectional compression in Federated Learning (2020). Constantin Philippenko, Aymeric Dieuleveut.
  • Lower Bounds and Nearly Optimal Algorithms in Distributed Learning with Communication Compression (2022). Xinmeng Huang, Yiming Chen, Wotao Yin, Kun Yuan.
  • Accelerated Methods with Compressed Communications for Distributed Optimization Problems under Data Similarity (2024). Dmitry Bylinkin, Aleksandr Beznosikov.
  • Acceleration for Compressed Gradient Descent in Distributed and Federated Optimization (2020). Zhize Li, Dmitry Kovalev, Xun Qian, Peter Richtárik.
  • Communication Compression for Decentralized Training (2018). Hanlin Tang, Shaoduo Gan, Ce Zhang, Tong Zhang, Liu Ji.
  • Near Optimal Decentralized Optimization with Compression and Momentum Tracking (2024). Rustem Islamov, Yuan Gao, Sebastian U. Stich.
  • Lower Bounds and Accelerated Algorithms in Distributed Stochastic Optimization with Communication Compression (2023). Yutong He, Xinmeng Huang, Yiming Chen, Wotao Yin, Kun Yuan.
  • EControl: Fast Distributed Optimization with Compression and Error Control (2023). Yuan Gao, Rustem Islamov, Sebastian U. Stich.
  • Compressed-VFL: Communication-Efficient Learning with Vertically Partitioned Data (2022). Timothy Castiglia, Anirban Das, Shiqiang Wang, Stacy Patterson.
  • Communication Compression for Distributed Learning without Control Variates (2024). Tomàs Ortega, Chun-Hsiang Huang, Xiaoxiao Li, Hamid Jafarkhani.
  • Faster Rates for Compressed Federated Learning with Client-Variance Reduction (2021). Haoyu Zhao, Konstantin Burlachenko, Zhize Li, Peter Richtárik.

Cited by (0)

  (none)

Citing (0)

  (none)