Chebyshev Polynomials in Distributed Consensus Applications

Type: Article

Publication Date: 2012-10-23

Citations: 52

DOI: https://doi.org/10.1109/tsp.2012.2226173

Abstract

In this paper we analyze the use of Chebyshev polynomials in distributed consensus applications. It is well known that the use of polynomials significantly speeds up convergence to consensus. However, existing solutions only work for low-degree polynomials and require the topology of the network to be fixed and known. We propose a distributed algorithm based on the second-order difference equation that describes the Chebyshev polynomials of the first kind. Our algorithm makes three contributions: (i) since the evaluation of Chebyshev polynomials is stable, there is no limitation on the degree of the polynomial; (ii) instead of requiring knowledge of the whole network topology, it needs only partial knowledge or an approximation of it; (iii) it can be applied to time-varying topologies. We characterize the main properties of the algorithm for both fixed and time-varying communication topologies. Theoretical results, as well as experiments with synthetic data, show the benefits of using our algorithm compared to existing methods.
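The second-order difference equation mentioned in the abstract is the Chebyshev recurrence T_{k+1}(y) = 2y T_k(y) - T_{k-1}(y). The sketch below shows, under stated assumptions, how that recurrence accelerates average consensus on a fixed graph: the eigenvalue interval of the non-unit spectrum of the weight matrix W is mapped affinely onto [-1, 1], and each node's state follows the scaled recurrence x_{k+1} = (2 t_k / t_{k+1})(aW + bI) x_k - (t_{k-1} / t_{k+1}) x_{k-1}, where t_k = T_k(alpha). The function names, the 4-node ring example, and the assumption that the eigenvalue bounds are known exactly are all illustrative; the paper's actual algorithm additionally handles partial topology knowledge and time-varying graphs, which this minimal fixed-topology sketch does not.

```python
def matvec(W, x):
    """Multiply the weight matrix W (list of rows) by the state vector x."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def chebyshev_consensus(W, x0, lam_min, lam_max, iters):
    """Chebyshev-accelerated consensus sketch (illustrative, not the
    paper's full algorithm). Assumes the non-unit eigenvalues of the
    doubly stochastic matrix W lie in [lam_min, lam_max]."""
    n = len(x0)
    # Affine map sending [lam_min, lam_max] onto [-1, 1].
    a = 2.0 / (lam_max - lam_min)
    b = -(lam_max + lam_min) / (lam_max - lam_min)
    alpha = a * 1.0 + b                    # image of the eigenvalue 1 of W
    t_prev, t_cur = 1.0, alpha             # T_0(alpha), T_1(alpha)
    x_prev = list(x0)
    Wx = matvec(W, x_prev)
    # First step: x_1 = (a W + b I) x_0 / T_1(alpha).
    x_cur = [(a * Wx[i] + b * x_prev[i]) / alpha for i in range(n)]
    for _ in range(iters - 1):
        t_next = 2.0 * alpha * t_cur - t_prev   # scalar Chebyshev recurrence
        Wx = matvec(W, x_cur)
        # Second-order state recurrence driven by the same recurrence.
        x_next = [(2.0 * t_cur / t_next) * (a * Wx[i] + b * x_cur[i])
                  - (t_prev / t_next) * x_prev[i] for i in range(n)]
        x_prev, x_cur = x_cur, x_next
        t_prev, t_cur = t_cur, t_next
    return x_cur

# Example graph: 4-node ring with uniform weights 1/3 (self + two
# neighbours); its non-unit eigenvalues lie in [-1/3, 1/3].
W = [[1/3, 1/3, 0.0, 1/3],
     [1/3, 1/3, 1/3, 0.0],
     [0.0, 1/3, 1/3, 1/3],
     [1/3, 0.0, 1/3, 1/3]]
x0 = [4.0, 0.0, 2.0, 10.0]   # initial node values; the average is 4.0
x = chebyshev_consensus(W, x0, -1/3, 1/3, 10)
```

Because the polynomial built this way equals 1 at the eigenvalue 1 of W, the average of the states is preserved at every step, while the components on the remaining eigenvalues are damped by a factor of roughly 1/T_k(alpha), which decays geometrically in k. This is the stability property the abstract highlights: the recurrence evaluates high-degree Chebyshev polynomials without the numerical blow-up of explicit polynomial evaluation.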

Locations

  • IEEE Transactions on Signal Processing
  • arXiv (Cornell University)

Works That Cite This (25)

  • Fast Consensus of High-Order Multiagent Systems (2022) — Jiahao Dai, Jing-Wen Yi, Li Chai
  • A Continuized View on Nesterov Acceleration for Stochastic Gradient Descent and Randomized Gossip (2021) — Mathieu Even, Raphaël Berthier, Francis Bach, Nicolas Flammarion, Pierre Gaillard, Hadrien Hendrikx, Laurent Massoulié, Adrien Taylor
  • Average Consensus by Graph Filtering: New Approach, Explicit Convergence Rate and Optimal Design (2018) — Jing-Wen Yi, Li Chai, Jingxin Zhang
  • Convergence Rate of Accelerated Average Consensus with Local Node Memory: Optimization and Analytic Solutions (2021) — Jing-Wen Yi, Li Chai, Jingxin Zhang
  • Convergence Rate of Accelerated Average Consensus With Local Node Memory: Optimization and Analytic Solutions (2023) — Jing-Wen Yi, Li Chai, Jingxin Zhang
  • On (β, γ)-Chebyshev functions and points of the interval (2021) — Stefano De Marchi, Giacomo Elefante, Francesco Marchetti
  • Characterizing limits and opportunities in speeding up Markov chain mixing (2021) — Simon Apers, Alain Sarlette, Francesco Ticozzi
  • Optimal Filter Design for Consensus on Random Directed Graphs (2018) — Stephen Kruzick, José M. F. Moura
  • Accelerating Consensus by Spectral Clustering and Polynomial Filters (2016) — Simon Apers, Alain Sarlette
  • Adopted spectral tau approach for the time-fractional diffusion equation via seventh-kind Chebyshev polynomials (2024) — W. M. Abd-Elhameed, Y. H. Youssri, Ahmed Gamal Atta