
Attention-based Convolutional and Recurrent Neural Networks: Success and Limitations in Machine Reading Comprehension

Research output: Contribution in Book/Report/Proceedings with ISBN/ISSN › Conference contribution/Paper › peer-review

Published
Publication date: 10/2018
Host publication: Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018)
Place of publication: Stroudsburg, PA, USA
Publisher: Association for Computational Linguistics
Pages: 108-118
Number of pages: 11
Original language: English
Event: SIGNLL Conference on Computational Natural Language Learning - Brussels, Belgium
Duration: 31/10/2018 - 1/11/2018
http://www.conll.org/2018

Conference

Conference: SIGNLL Conference on Computational Natural Language Learning
Abbreviated title: CoNLL
Country/Territory: Belgium
City: Brussels
Period: 31/10/18 - 1/11/18
Internet address: http://www.conll.org/2018


Abstract

We propose a machine reading comprehension model based on the compare-aggregate framework with two-staged attention that achieves state-of-the-art results on the MovieQA question answering dataset. To investigate the limitations of our model, as well as the behavioral differences between convolutional and recurrent neural networks, we generate adversarial examples to confuse the model and compare its performance to that of humans. Furthermore, we assess the generalizability of our model by analyzing its differences from human inference, drawing upon insights from cognitive science.
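The compare-aggregate framework mentioned in the abstract can be illustrated with a minimal sketch. This is an assumption-laden toy version (dot-product attention, elementwise-product comparison, sum aggregation) rather than the paper's actual model, which uses learned projections, two-staged attention, and CNN/RNN aggregation:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def compare_aggregate(question, passage):
    """Toy compare-aggregate step over token embeddings.

    question, passage: lists of embedding vectors (lists of floats).
    Returns one vector summarizing the passage with respect to the question.
    (Illustrative only; not the paper's architecture.)
    """
    compared = []
    for p in passage:
        # Attend: weight each question token by similarity to this passage token.
        weights = softmax([dot(p, q) for q in question])
        attended = [sum(w * q[i] for w, q in zip(weights, question))
                    for i in range(len(p))]
        # Compare: elementwise product of the passage token and its attended question vector.
        compared.append([a * b for a, b in zip(p, attended)])
    # Aggregate: sum over passage positions (a real model would use a CNN or RNN here).
    dim = len(passage[0])
    return [sum(c[i] for c in compared) for i in range(dim)]
```

The attend-compare-aggregate decomposition is what lets such models score candidate answers: each candidate's aggregated vector is compared against the question-aware passage summary.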

Bibliographic note

CoNLL 2018