Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review
TY - GEN
T1 - Attention-based Convolutional and Recurrent Neural Networks
T2 - SIGNLL Conference on Computational Natural Language Learning
AU - Blohm, Matthias
AU - Jagfeld, Glorianna
AU - Sood, Ekta
AU - Yu, Xiang
AU - Vu, Ngoc Thang
N1 - CoNLL 2018
PY - 2018/10
Y1 - 2018/10
N2 - We propose a machine reading comprehension model based on the compare-aggregate framework with two-staged attention that achieves state-of-the-art results on the MovieQA question answering dataset. To investigate the limitations of our model as well as the behavioral difference between convolutional and recurrent neural networks, we generate adversarial examples to confuse the model and compare to human performance. Furthermore, we assess the generalizability of our model by analyzing its differences to human inference, drawing upon insights from cognitive science.
M3 - Conference contribution/Paper
SP - 108
EP - 118
BT - Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018)
PB - Association for Computational Linguistics
CY - Stroudsburg PA, USA
Y2 - 31 October 2018 through 1 November 2018
ER -