Final published version
Licence: CC BY: Creative Commons Attribution 4.0 International License
Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review
TY - GEN
T1 - Sequence-to-Sequence Models for Data-to-Text Natural Language Generation
T2 - 11th International Conference on Natural Language Generation (INLG)
AU - Jagfeld, Glorianna
AU - Jenne, Sabrina
AU - Vu, Ngoc Thang
PY - 2018/11
Y1 - 2018/11
N2 - We present a comparison of word-based and character-based sequence-to-sequence models for data-to-text natural language generation, which generate natural language descriptions for structured inputs. On the datasets of two recent generation challenges, our models achieve comparable or better automatic evaluation results than the best challenge submissions. Subsequent detailed statistical and human analyses shed light on the differences between the two input representations and the diversity of the generated texts. In a controlled experiment with synthetic training data generated from templates, we demonstrate the ability of neural models to learn novel combinations of the templates and thereby generalize beyond the linguistic structures they were trained on.
AB - We present a comparison of word-based and character-based sequence-to-sequence models for data-to-text natural language generation, which generate natural language descriptions for structured inputs. On the datasets of two recent generation challenges, our models achieve comparable or better automatic evaluation results than the best challenge submissions. Subsequent detailed statistical and human analyses shed light on the differences between the two input representations and the diversity of the generated texts. In a controlled experiment with synthetic training data generated from templates, we demonstrate the ability of neural models to learn novel combinations of the templates and thereby generalize beyond the linguistic structures they were trained on.
M3 - Conference contribution/Paper
SN - 9781948087865
SP - 221
EP - 232
BT - Proceedings of the 11th International Natural Language Generation Conference
PB - Association for Computational Linguistics
Y2 - 5 November 2018 through 8 November 2018
ER -