Preserved central model for faster bidirectional compression in distributed settings

We develop a new approach to tackle communication constraints in a distributed learning problem with a central server. We propose and analyze a new algorithm that performs bidirectional compression and achieves the same convergence rate as algorithms using only uplink (from the local workers to the central server) compression. To …
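The abstract is truncated, but its core idea can be illustrated: the server keeps an exact ("preserved") copy of the central model, while the messages in both directions are compressed — workers compress gradients on the uplink, and the server compresses only the broadcast update on the downlink, never its own model. The following is a minimal toy simulation of that pattern, not the paper's actual algorithm: the rand-k sparsifier, the least-squares objective, and all hyperparameters are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_k(v, k, rng):
    """Unbiased rand-k sparsification: keep k random coordinates, rescale by d/k."""
    d = v.size
    out = np.zeros_like(v)
    idx = rng.choice(d, size=k, replace=False)
    out[idx] = v[idx] * (d / k)
    return out

# Tiny synthetic least-squares problem split across 4 workers (illustration only).
d, n_workers, n_per = 10, 4, 25
A = [rng.normal(size=(n_per, d)) for _ in range(n_workers)]
x_star = rng.normal(size=d)            # ground-truth model
b = [Ai @ x_star for Ai in A]

def grad(i, x):
    """Local least-squares gradient on worker i."""
    return A[i].T @ (A[i] @ x - b[i]) / n_per

w_server = np.zeros(d)   # the server's exact, "preserved" central model
w_workers = np.zeros(d)  # the workers' (compressed-downlink) view of the model
lr, k, T = 0.05, 8, 600

for _ in range(T):
    # Uplink: each worker compresses its local gradient before sending.
    g = np.mean([rand_k(grad(i, w_workers), k, rng) for i in range(n_workers)],
                axis=0)
    w_server = w_server - lr * g
    # Downlink: the server broadcasts only a compressed correction toward its
    # exact model; the preserved model itself is never degraded by compression.
    w_workers = w_workers + rand_k(w_server - w_workers, k, rng)

err = np.linalg.norm(w_server - x_star)
```

Because the compression noise is multiplicative in the distance to the optimum (compressing a zero gradient or a zero correction yields zero), the simulation converges to the exact solution despite compressing both directions.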