Denoising Self-Attentive Sequential Recommendation


Transformer-based sequential recommenders are powerful at capturing both short-term and long-term sequential item dependencies, mainly because their self-attention networks exploit pairwise item-item interactions within a sequence. However, real-world item sequences are often noisy, which is particularly true for implicit feedback. For example, a large …
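To make the pairwise item-item mechanism concrete, here is a minimal sketch of scaled dot-product self-attention over a sequence of item embeddings, using NumPy; the matrix shapes and projection names (`Wq`, `Wk`, `Wv`) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of item embeddings.

    X: (seq_len, d) item embeddings; Wq/Wk/Wv: (d, d) projection matrices.
    Returns (seq_len, d) context vectors mixing pairwise item-item signals.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(X.shape[1])         # pairwise item-item scores
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over sequence positions
    return weights @ V

# Toy usage: 5 items with 8-dimensional embeddings, random projections.
rng = np.random.default_rng(0)
n, d = 5, 8
X = rng.normal(size=(n, d))
out = self_attention(X,
                     rng.normal(size=(d, d)),
                     rng.normal(size=(d, d)),
                     rng.normal(size=(d, d)))
print(out.shape)  # each position attends to every other item in the sequence
```

Because every position attends to every other, a few noisy interactions (e.g. accidental clicks in implicit feedback) can contaminate the attention weights of all items, which is the problem the paper targets.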