On the token distance modeling ability of higher RoPE attention dimension

Length extrapolation algorithms based on Rotary Position Embedding (RoPE) have shown promising results in extending the context length of language models. However, how position embeddings capture longer-range contextual information remains poorly understood. Based on the intuition that different dimensions correspond to different frequencies of change in the RoPE encoding, we …
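To make that intuition concrete, here is a minimal sketch (assuming the standard RoPE base of 10000 and an illustrative 64-dimensional head; the function name is hypothetical) of how each dimension pair receives its own rotation frequency, with low-index dimensions rotating quickly and high-index dimensions rotating slowly enough to distinguish far-apart tokens:

```python
import torch

def rope_frequencies(head_dim: int, base: float = 10000.0) -> torch.Tensor:
    """Per-dimension-pair rotation frequencies used by RoPE.

    Dimension pair i rotates by angle theta_i = base ** (-2i / head_dim)
    per position, so early pairs oscillate rapidly (local distances)
    while later pairs oscillate slowly (long-range distances).
    """
    idx = torch.arange(0, head_dim, 2, dtype=torch.float32)
    return base ** (-idx / head_dim)

freqs = rope_frequencies(head_dim=64)
# Wavelength in tokens for each dimension pair: 2*pi / theta_i.
wavelengths = 2 * torch.pi / freqs
print(wavelengths[:3])   # short wavelengths: fast-changing, local dimensions
print(wavelengths[-3:])  # very long wavelengths: slow-changing, long-range dimensions
```

The printed wavelengths span several orders of magnitude, which is the sense in which higher RoPE dimensions are the ones positioned to model large token distances.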