Text Understanding with the Attention Sum Reader Network

Rudolf Kadlec, Martin Schmid, Ondřej Bajgar, Jan Kleindienst
Type: Preprint
Publication Date: 2016-03-04
Citations: 32
Location: arXiv (Cornell University)
Similar Works

- Text Understanding with the Attention Sum Reader Network (2016). Rudolf Kadlec, Martin Schmid, Ondřej Bajgar, Jan Kleindienst
- Gated-Attention Readers for Text Comprehension (2016). Bhuwan Dhingra, Hanxiao Liu, Zhilin Yang, William W. Cohen, Ruslan Salakhutdinov
- Gated-Attention Readers for Text Comprehension (2017). Bhuwan Dhingra, Hanxiao Liu, Zhilin Yang, William W. Cohen, Ruslan Salakhutdinov
- Attention-over-Attention Neural Networks for Reading Comprehension (2017). Yiming Cui, Zhipeng Chen, Si Wei, Shijin Wang, Ting Liu, Guoping Hu
- Fast Reading Comprehension with ConvNets (2017). Felix Wu, Ni Lao, John Blitzer, Guandao Yang, Kilian Q. Weinberger
- Sequential Attention: A Context-Aware Alignment Function for Machine Reading (2017). Sebastian Brarda, Philip Yeres, Samuel R. Bowman
- Contextualized Word Representations for Reading Comprehension (2017). Shimi Salant, Jonathan Berant
- Iterative Alternating Neural Attention for Machine Reading (2016). Alessandro Sordoni, Philip Bachman, Adam Trischler, Yoshua Bengio
- Broad Context Language Modeling as Reading Comprehension (2017). Zewei Chu, Hai Wang, Kevin Gimpel, David McAllester
- Natural Language Comprehension with the EpiReader (2016). Adam Trischler, Zheng Ye, Xingdi Yuan, Kaheer Suleman
- FAT ALBERT: Finding Answers in Large Texts using Semantic Similarity Attention Layer based on BERT (2020). Omar S. Mossad, Amgad Ahmed, Anandharaju Raju, Hari Karthikeyan, Zayed Ahmed
- Dual Ask-Answer Network for Machine Reading Comprehension (2018). Hang Xiao, Feng Wang, Jianfeng Yan, Jingyao Zheng
Works That Cite This (32)

- NewsQA: A Machine Comprehension Dataset (2016). Adam Trischler, Tong Wang, Xingdi Yuan, Justin Harris, Alessandro Sordoni, Philip Bachman, Kaheer Suleman
- Machine Comprehension by Text-to-Text Neural Question Generation (2017). Xingdi Yuan, Tong Wang, Çaǧlar Gülçehre, Alessandro Sordoni, Philip Bachman, Sandeep Subramanian, Saizheng Zhang, Adam Trischler
- Densely Connected Attention Propagation for Reading Comprehension (2018). Yi Tay, Luu Anh Tuan, Siu Cheung Hui, Jian Su
- Simple and Effective Multi-Paragraph Reading Comprehension (2017). Christopher Clark, Matt Gardner
- Multi-Mention Learning for Reading Comprehension with Neural Cascades (2017). Swabha Swayamdipta, Ankur P. Parikh, Tom Kwiatkowski
- Simple and Effective Curriculum Pointer-Generator Networks for Reading Comprehension over Long Narratives (2019). Yi Tay, Shuohang Wang, Luu Anh Tuan, Jie Fu, Minh C. Phan, Xingdi Yuan, Jinfeng Rao, Siu Cheung Hui, Aston Zhang
- Improving Neural Language Models with a Continuous Cache (2016). Édouard Grave, Armand Joulin, Nicolas Usunier
- Semi-interactive Attention Network for Answer Understanding in Reverse-QA (2019). Qing Yin, Guan Luo, Xiaodong Zhu, Qinghua Hu, Ou Wu
- Multi-Perspective Context Matching for Machine Comprehension (2016). Zhiguo Wang, Haitao Mi, Wael Hamza, Radu Florian
- Natural Language Comprehension with the EpiReader (2016). Adam Trischler, Zheng Ye, Xingdi Yuan, Kaheer Suleman
Works Cited by This (7)

- Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling (2014). Jun‑Young Chung, Çaǧlar Gülçehre, Kyunghyun Cho, Yoshua Bengio
- Blocks and Fuel: Frameworks for deep learning (2015). Bart van Merriënboer, Dzmitry Bahdanau, Vincent Dumoulin, Dmitriy Serdyuk, David Warde-Farley, Jan Chorowski, Yoshua Bengio
- The Goldilocks Principle: Reading Children's Books with Explicit Memory Representations (2015). Felix Hill, Antoine Bordes, Sumit Chopra, Jason Weston
- A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task (2016). Danqi Chen, Jason Bolton, Christopher D. Manning
- Teaching Machines to Read and Comprehend (2015). Karl Moritz Hermann, Tomáš Kočiský, Edward Grefenstette, Lasse Espeholt, Will Kay, Mustafa Suleyman, Phil Blunsom
- Adam: A Method for Stochastic Optimization (2014). Diederik P. Kingma, Jimmy Ba
- Neural Machine Translation by Jointly Learning to Align and Translate (2015). Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio