Recurrent Dropout without Memory Loss
Stanislau Semeniuta, Aliaksei Severyn, Erhardt Barth

Type: Preprint
Publication Date: 2016-03-16
Citations: 96
Location: arXiv (Cornell University)
Similar Works

- Recurrent Dropout without Memory Loss (2016). Stanislau Semeniuta, Aliaksei Severyn, Erhardt Barth
- Recurrent Dropout without Memory Loss (2017). Stanislau Semeniuta, Aliaksei Severyn, Erhardt Barth
- Recurrent Neural Network Regularization (2014). Wojciech Zaremba, Ilya Sutskever, Oriol Vinyals
- Fraternal Dropout (2017). Konrad Żołna, Devansh Arpit, Dendi Suhubdy, Yoshua Bengio
- Revisiting Activation Regularization for Language RNNs (2017). Stephen Merity, Bryan McCann, Richard Socher
- Dropout improves Recurrent Neural Networks for Handwriting Recognition (2013). Vu Pham, Théodore Bluche, Christopher Kermorvant, Jérôme Louradour
- Dropout Improves Recurrent Neural Networks for Handwriting Recognition (2014). Vu Pham, Théodore Bluche, Christopher Kermorvant, Jérôme Louradour
- Adversarial Dropout for Recurrent Neural Networks (2019). Sungrae Park, Kyungwoo Song, Mingi Ji, Wonsung Lee, Il-Chul Moon
- Revisiting Structured Dropout (2022). Yiren Zhao, Oluwatomisin Dada, Xitong Gao, Robert Mullins
- Dilated Recurrent Neural Networks (2017). Shiyu Chang, Yang Zhang, Wei Han, Mo Yu, Xiaoxiao Guo, Wei Tan, Xiaodong Cui, Michael Witbrock, Mark Hasegawa-Johnson, Thomas S. Huang
- Recurrent Memory Networks for Language Modeling (2016). Ke Tran, Arianna Bisazza, Christof Monz
- From Random to Supervised: A Novel Dropout Mechanism Integrated with Global Information (2018). Hengru Xu, Li Shen, Renfen Hu, Si Li, Sheng Gao
- Recurrent Neural Networks and Long Short-Term Memory Networks: Tutorial and Survey (2023). Benyamin Ghojogh, Ali Ghodsi
Works That Cite This (55)

- Scalable Bayesian Learning of Recurrent Neural Networks for Language Modeling (2016). Zhe Gan, Chunyuan Li, Changyou Chen, Yunchen Pu, Qinliang Su, Lawrence Carin
- Recurrent Memory Array Structures (2016). Kamil Rocki
- Pushing the bounds of dropout (2018). Gábor Melis, Charles Blundell, Tomáš Kočiský, Karl Moritz Hermann, Chris Dyer, Phil Blunsom
- Multilingual Training and Cross-lingual Adaptation on CTC-based Acoustic Model (2017). Sibo Tong, Philip N. Garner, Hervé Bourlard
- Twin Networks: Matching the Future for Sequence Generation (2017). Dmitriy Serdyuk, Nan Rosemary Ke, Alessandro Sordoni, Adam Trischler, Chris Pal, Yoshua Bengio
- Improving the Neural GPU Architecture for Algorithm Learning (2017). Kārlis Freivalds, Renārs Liepiņš
- Hierarchical Temporal Convolutional Networks for Dynamic Recommender Systems (2019). Jiaxuan You, Yichen Wang, Aditya Pal, Pong Eksombatchai, Chuck Rosenberg, Jure Leskovec
- Shifting Mean Activation Towards Zero with Bipolar Activation Functions (2017). Lars Hiller Eidnes, Arild Nøkland
- Recent Advances in Recurrent Neural Networks (2018). Hojjat Salehinejad, Julianne Baarbé, Sharan Sankar, Joseph Barfett, Errol Colak, Shahrokh Valaee
- DTMT: A Novel Deep Transition Architecture for Neural Machine Translation (2018). Fandong Meng, Jinchao Zhang
Works Cited by This (14)

- Recurrent Neural Network Regularization (2014). Wojciech Zaremba, Ilya Sutskever, Oriol Vinyals
- Improving neural networks by preventing co-adaptation of feature detectors (2012). Geoffrey E. Hinton, Nitish Srivastava, Alex Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov
- Visualizing and Understanding Recurrent Networks (2015). Andrej Karpathy, Justin Johnson, Li Fei-Fei
- Dropout improves Recurrent Neural Networks for Handwriting Recognition (2013). Vu Pham, Théodore Bluche, Christopher Kermorvant, Jérôme Louradour
- Learning Longer Memory in Recurrent Neural Networks (2014). Tomáš Mikolov, Armand Joulin, Sumit Chopra, Michaël Mathieu, Marc'Aurelio Ranzato
- On the Properties of Neural Machine Translation: Encoder-Decoder Approaches (2014). Kyunghyun Cho, Bart van Merriënboer, Dzmitry Bahdanau, Yoshua Bengio
- A Theoretically Grounded Application of Dropout in Recurrent Neural Networks (2015). Yarin Gal, Zoubin Ghahramani
- Tree Recurrent Neural Networks with Application to Language Modeling (2015). Xingxing Zhang, Liang Lu, Mirella Lapata
- SemEval-2015 Task 10: Sentiment Analysis in Twitter (2015). Sara Rosenthal, Preslav Nakov, Svetlana Kiritchenko, Saif M. Mohammad, Alan Ritter, Veselin Stoyanov
- Sequence to Sequence Learning with Neural Networks (2014). Ilya Sutskever, Oriol Vinyals, Quoc V. Le