
Attention-based Convolutional and Recurrent Neural Networks: Success and Limitations in Machine Reading Comprehension

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Attention-based Convolutional and Recurrent Neural Networks: Success and Limitations in Machine Reading Comprehension. / Blohm, Matthias; Jagfeld, Glorianna; Sood, Ekta et al.
Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018). Stroudsburg PA, USA: Association for Computational Linguistics, 2018. p. 108-118.


Harvard

Blohm, M, Jagfeld, G, Sood, E, Yu, X & Vu, NT 2018, Attention-based Convolutional and Recurrent Neural Networks: Success and Limitations in Machine Reading Comprehension. in Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018). Association for Computational Linguistics, Stroudsburg PA, USA, pp. 108-118, SIGNLL Conference on Computational Natural Language Learning, Brussels, Belgium, 31/10/18. <http://aclweb.org/anthology/K18-1011>

APA

Blohm, M., Jagfeld, G., Sood, E., Yu, X., & Vu, N. T. (2018). Attention-based Convolutional and Recurrent Neural Networks: Success and Limitations in Machine Reading Comprehension. In Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018) (pp. 108-118). Association for Computational Linguistics. http://aclweb.org/anthology/K18-1011

Vancouver

Blohm M, Jagfeld G, Sood E, Yu X, Vu NT. Attention-based Convolutional and Recurrent Neural Networks: Success and Limitations in Machine Reading Comprehension. In Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018). Stroudsburg PA, USA: Association for Computational Linguistics. 2018. p. 108-118

Author

Blohm, Matthias; Jagfeld, Glorianna; Sood, Ekta et al. / Attention-based Convolutional and Recurrent Neural Networks: Success and Limitations in Machine Reading Comprehension. Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018). Stroudsburg PA, USA: Association for Computational Linguistics, 2018. pp. 108-118

BibTeX

@inproceedings{d339eb17a5584b85a98ee31991ebe8d0,
title = "Attention-based Convolutional and Recurrent Neural Networks: Success and Limitations in Machine Reading Comprehension",
abstract = "We propose a machine reading comprehension model based on the compare-aggregate framework with two-staged attention that achieves state-of-the-art results on the MovieQA question answering dataset. To investigate the limitations of our model as well as the behavioral difference between convolutional and recurrent neural networks, we generate adversarial examples to confuse the model and compare to human performance. Furthermore, we assess the generalizability of our model by analyzing its differences to human inference, drawing upon insights from cognitive science.",
author = "Matthias Blohm and Glorianna Jagfeld and Ekta Sood and Xiang Yu and Vu, {Ngoc Thang}",
note = "CoNLL 2018; SIGNLL Conference on Computational Natural Language Learning, CoNLL ; Conference date: 31-10-2018 Through 01-11-2018",
year = "2018",
month = oct,
language = "English",
pages = "108--118",
booktitle = "Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018)",
publisher = "Association for Computational Linguistics",
url = "http://www.conll.org/2018",
}

RIS

TY - GEN

T1 - Attention-based Convolutional and Recurrent Neural Networks: Success and Limitations in Machine Reading Comprehension

T2 - SIGNLL Conference on Computational Natural Language Learning

AU - Blohm, Matthias

AU - Jagfeld, Glorianna

AU - Sood, Ekta

AU - Yu, Xiang

AU - Vu, Ngoc Thang

N1 - CoNLL 2018

PY - 2018/10

Y1 - 2018/10

N2 - We propose a machine reading comprehension model based on the compare-aggregate framework with two-staged attention that achieves state-of-the-art results on the MovieQA question answering dataset. To investigate the limitations of our model as well as the behavioral difference between convolutional and recurrent neural networks, we generate adversarial examples to confuse the model and compare to human performance. Furthermore, we assess the generalizability of our model by analyzing its differences to human inference, drawing upon insights from cognitive science.

AB - We propose a machine reading comprehension model based on the compare-aggregate framework with two-staged attention that achieves state-of-the-art results on the MovieQA question answering dataset. To investigate the limitations of our model as well as the behavioral difference between convolutional and recurrent neural networks, we generate adversarial examples to confuse the model and compare to human performance. Furthermore, we assess the generalizability of our model by analyzing its differences to human inference, drawing upon insights from cognitive science.

M3 - Conference contribution/Paper

SP - 108

EP - 118

BT - Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018)

PB - Association for Computational Linguistics

CY - Stroudsburg PA, USA

Y2 - 31 October 2018 through 1 November 2018

ER -