WDMoE: Wireless Distributed Large Language Models with Mixture of Experts

Large Language Models (LLMs) have achieved significant success in various natural language processing tasks, but how wireless communications can support LLMs has not been extensively studied. In this paper, we propose a wireless distributed LLM paradigm based on Mixture of Experts (MoE), named WDMoE, deploying LLMs collaboratively across edge servers …
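The MoE mechanism the abstract builds on routes each input to a small subset of expert networks via a learned gate. As a rough illustration only, here is a minimal top-k softmax-gating sketch; the function names, the linear "experts", and all parameters are illustrative assumptions, not the paper's actual WDMoE implementation (where experts would run on separate edge servers):

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route input x to the top-k experts by softmax gate score,
    then return the weight-combined expert outputs."""
    logits = gate_w @ x                        # one gating logit per expert
    probs = np.exp(logits - logits.max())      # numerically stable softmax
    probs /= probs.sum()
    top = np.argsort(probs)[-k:]               # indices of the k best experts
    w = probs[top] / probs[top].sum()          # renormalize selected weights
    return sum(wi * experts[i](x) for wi, i in zip(w, top))

rng = np.random.default_rng(0)
d, n_experts = 4, 3
gate_w = rng.normal(size=(n_experts, d))
# Toy experts: simple linear maps; in a wireless deployment each would
# live on a different edge server and only the selected ones are invoked.
mats = [rng.normal(size=(d, d)) for _ in range(n_experts)]
experts = [lambda x, M=M: M @ x for M in mats]

y = moe_forward(rng.normal(size=d), gate_w, experts, k=2)
print(y.shape)  # (4,)
```

Because only k of the experts are evaluated per input, such a design naturally maps onto distributed hardware: the gate decides which servers must do work for a given token.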