

Sequence-to-Sequence Models for Data-to-Text Natural Language Generation: Word- vs. Character-based Processing and Output Diversity

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Sequence-to-Sequence Models for Data-to-Text Natural Language Generation: Word- vs. Character-based Processing and Output Diversity. / Jagfeld, Glorianna; Jenne, Sabrina; Vu, Ngoc Thang.
Proceedings of the 11th International Natural Language Generation Conference. Association for Computational Linguistics, 2018. p. 221-232.


Harvard

Jagfeld, G, Jenne, S & Vu, NT 2018, Sequence-to-Sequence Models for Data-to-Text Natural Language Generation: Word- vs. Character-based Processing and Output Diversity. in Proceedings of the 11th International Natural Language Generation Conference. Association for Computational Linguistics, pp. 221-232, 11th International Conference on Natural Language Generation (INLG), Tilburg, Netherlands, 5/11/18. <http://aclweb.org/anthology/W18-6529>

APA

Jagfeld, G., Jenne, S., & Vu, N. T. (2018). Sequence-to-Sequence Models for Data-to-Text Natural Language Generation: Word- vs. Character-based Processing and Output Diversity. In Proceedings of the 11th International Natural Language Generation Conference (pp. 221-232). Association for Computational Linguistics. http://aclweb.org/anthology/W18-6529

Vancouver

Jagfeld G, Jenne S, Vu NT. Sequence-to-Sequence Models for Data-to-Text Natural Language Generation: Word- vs. Character-based Processing and Output Diversity. In Proceedings of the 11th International Natural Language Generation Conference. Association for Computational Linguistics. 2018. p. 221-232

Author

Jagfeld, Glorianna; Jenne, Sabrina; Vu, Ngoc Thang. / Sequence-to-Sequence Models for Data-to-Text Natural Language Generation: Word- vs. Character-based Processing and Output Diversity. Proceedings of the 11th International Natural Language Generation Conference. Association for Computational Linguistics, 2018. pp. 221-232

Bibtex

@inproceedings{fc29d11faf284594b401ea7381c54f3e,
title = "Sequence-to-Sequence Models for Data-to-Text Natural Language Generation: Word- vs. Character-based Processing and Output Diversity",
abstract = "We present a comparison of word-based and character-based sequence-to sequence models for data-to-text natural language generation, which generate natural language descriptions for structured inputs. On the datasets of two recent generation challenges, our models achieve comparable or better automatic evaluation results than the best challenge submissions.Subsequent detailed statistical and human analyses shed light on the differencesbetween the two input representations and the diversity of the generated texts. In a controlled experiment with synthetic training data generated from templates, we demonstrate the ability of neural models to learn novel combinations of the templates and thereby generalize beyond the linguistic structures they were trained on.",
author = "Glorianna Jagfeld and Sabrina Jenne and Vu, {Ngoc Thang}",
year = "2018",
month = nov,
language = "English",
isbn = "9781948087865",
pages = "221--232",
booktitle = "Proceedings of the 11th International Natural Language Generation Conference",
publisher = "Association for Computational Linguistics",
note = "11th International Conference on Natural Language Generation (INLG) ; Conference date: 05-11-2018 Through 08-11-2018",

}

RIS

TY - GEN

T1 - Sequence-to-Sequence Models for Data-to-Text Natural Language Generation: Word- vs. Character-based Processing and Output Diversity

T2 - 11th International Conference on Natural Language Generation (INLG)

AU - Jagfeld, Glorianna

AU - Jenne, Sabrina

AU - Vu, Ngoc Thang

PY - 2018/11

Y1 - 2018/11

N2 - We present a comparison of word-based and character-based sequence-to-sequence models for data-to-text natural language generation, which generate natural language descriptions for structured inputs. On the datasets of two recent generation challenges, our models achieve comparable or better automatic evaluation results than the best challenge submissions. Subsequent detailed statistical and human analyses shed light on the differences between the two input representations and the diversity of the generated texts. In a controlled experiment with synthetic training data generated from templates, we demonstrate the ability of neural models to learn novel combinations of the templates and thereby generalize beyond the linguistic structures they were trained on.

AB - We present a comparison of word-based and character-based sequence-to-sequence models for data-to-text natural language generation, which generate natural language descriptions for structured inputs. On the datasets of two recent generation challenges, our models achieve comparable or better automatic evaluation results than the best challenge submissions. Subsequent detailed statistical and human analyses shed light on the differences between the two input representations and the diversity of the generated texts. In a controlled experiment with synthetic training data generated from templates, we demonstrate the ability of neural models to learn novel combinations of the templates and thereby generalize beyond the linguistic structures they were trained on.

M3 - Conference contribution/Paper

SN - 9781948087865

SP - 221

EP - 232

BT - Proceedings of the 11th International Natural Language Generation Conference

PB - Association for Computational Linguistics

Y2 - 5 November 2018 through 8 November 2018

ER -
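
The abstract contrasts word-based and character-based input processing for data-to-text generation. As a minimal illustrative sketch (not taken from the paper), the Python snippet below shows how the same linearized meaning representation, here written in the style of the E2E NLG Challenge, yields very different input sequences under the two schemes; the sample MR, function names, and whitespace-based tokenization rule are assumptions for illustration only.

# Illustrative sketch only: contrasts the two input representations the
# abstract compares. The sample MR and the tokenization choices are
# hypothetical, not the paper's actual preprocessing.

def word_tokens(mr: str) -> list[str]:
    """Word-based processing: split the linearized MR on whitespace,
    giving a large vocabulary but short input sequences."""
    return mr.split()

def char_tokens(mr: str) -> list[str]:
    """Character-based processing: every character (including spaces)
    becomes its own symbol, giving a tiny vocabulary but much longer
    input sequences."""
    return list(mr)

mr = "name[Aromi] eatType[coffee shop] area[city centre]"
print(word_tokens(mr))
# ['name[Aromi]', 'eatType[coffee', 'shop]', 'area[city', 'centre]']
print(char_tokens(mr)[:10])
# ['n', 'a', 'm', 'e', '[', 'A', 'r', 'o', 'm', 'i']

The sketch also makes visible a trade-off implied by the abstract: naive word-level splitting fragments multi-word attribute values, whereas character-level input avoids that at the cost of sequences roughly an order of magnitude longer for the encoder to process.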