Not All Memories are Created Equal: Learning to Forget by Expiring

Type: Preprint

Publication Date: 2021-01-01

Citations: 11

DOI: https://doi.org/10.48550/arXiv.2105.06548

Locations

  • arXiv (Cornell University)
  • DataCite API

Similar Works

  • Not All Memories are Created Equal: Learning to Forget by Expiring (2021). Sainbayar Sukhbaatar, Da Young Ju, Spencer Poff, Stephen Roller, Arthur Szlam, Jason Weston, Angela Fan
  • Memformer: A Memory-Augmented Transformer for Sequence Modeling (2020). Qingyang Wu, Zhenzhong Lan, Kun Qian, Jing Gu, Alborz Geramifard, Yu Zhou
  • Semantic HELM: A Human-Readable Memory for Reinforcement Learning (2023). Fabian Paischer, Thomas Adler, Markus Hofmarcher, Sepp Hochreiter
  • Towards mental time travel: a hierarchical memory for reinforcement learning agents (2021). Andrew K. Lampinen, Stephanie C. Y. Chan, Andrea Banino, Felix Hill
  • Titans: Learning to Memorize at Test Time (2024). Ali Behrouz, Peilin Zhong, Vahab Mirrokni
  • Selective Experience Replay for Lifelong Learning (2018). David Isele, Akansel Cosgun
  • InfLLM: Unveiling the Intrinsic Capacity of LLMs for Understanding Extremely Long Sequences with Training-Free Memory (2024). Chaojun Xiao, Pengle Zhang, Xu Han, Guangxuan Xiao, Yankai Lin, Zhengyan Zhang, Zhiyuan Liu, Song Han, Maosong Sun
  • Remembering for the Right Reasons: Explanations Reduce Catastrophic Forgetting (2020). Sayna Ebrahimi, Suzanne Petryk, Akash Gokul, William Gan, Joseph E. Gonzalez, Marcus Rohrbach, Trevor Darrell
  • Learning to Remember Rare Events (2017). Łukasz Kaiser, Ofir Nachum, Aurko Roy, Samy Bengio
  • An Evolved Universal Transformer Memory (2024). Edoardo Cetin, Qi Sun, Tianyu Zhao, Yujin Tang
  • Ring Attention with Blockwise Transformers for Near-Infinite Context (2023). Hao Liu, Matei Zaharia, Pieter Abbeel
  • TransformerFAM: Feedback attention is working memory (2024). Dongseong Hwang, Weiran Wang, Zhuoyuan Huo, Khe Chai Sim, Pedro Moreno Mengibar
  • Expansion Span: Combining Fading Memory and Retrieval in Hybrid State Space Models (2024). Elvis Nunez, Luca Zancato, Benjamin Bowman, Aditya Golatkar, Wei Xia, Stefano Soatto
  • Meta-Learning Representations for Continual Learning (2019). Khurram Javed, Martha White
  • ∞-former: Infinite Memory Transformer (2021). Pedro Henrique Martins, Zita Marinho, André F. T. Martins
  • HMT: Hierarchical Memory Transformer for Long Context Language Processing (2024). Zifan He, Zongyue Qin, Neha Prakriya, Yizhou Sun, Jason Cong
  • Associative memory inspires improvements for in-context learning using a novel attention residual stream architecture (2024). Thomas Burns, Tomoki Fukai, Christopher Earls

Works Cited by This Work (0)

No cited works are listed for this record.