
Electronic data

  • submission (2)

Rights statement: Copyright © 2018, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.

    Accepted author manuscript, 1.36 MB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License


Scale Up Event Extraction Learning via Automatic Training Data Generation

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Scale Up Event Extraction Learning via Automatic Training Data Generation. / Zeng, Ying; Feng, Yansong; Ma, Rong et al.
The Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18). Palo Alto: AAAI, 2018. p. 6045-6052 (AAAI; Vol. 32).

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Harvard

Zeng, Y, Feng, Y, Ma, R, Wang, Z, Yan, R, Shi, C & Zhao, D 2018, Scale Up Event Extraction Learning via Automatic Training Data Generation. in The Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18). AAAI, vol. 32, AAAI, Palo Alto, pp. 6045-6052.

APA

Zeng, Y., Feng, Y., Ma, R., Wang, Z., Yan, R., Shi, C., & Zhao, D. (2018). Scale Up Event Extraction Learning via Automatic Training Data Generation. In The Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18) (pp. 6045-6052). (AAAI; Vol. 32). AAAI.

Vancouver

Zeng Y, Feng Y, Ma R, Wang Z, Yan R, Shi C et al. Scale Up Event Extraction Learning via Automatic Training Data Generation. In The Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18). Palo Alto: AAAI. 2018. p. 6045-6052. (AAAI).

Author

Zeng, Ying; Feng, Yansong; Ma, Rong et al. / Scale Up Event Extraction Learning via Automatic Training Data Generation. The Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18). Palo Alto: AAAI, 2018. pp. 6045-6052 (AAAI).

Bibtex

@inproceedings{20ff1278a2504ee49dcec5ae49b9bdad,
title = "Scale Up Event Extraction Learning via Automatic Training Data Generation",
abstract = "The task of event extraction has long been investigated in a supervised learning paradigm, which is bound by the number and the quality of the training instances. Existing training data must be manually generated through a combination of expert domain knowledge and extensive human involvement. However, due to drastic efforts required in annotating text, the resultant datasets are usually small, which severally affects the quality of the learned model, making it hard to generalize. Our work develops an automatic approach for generating training data for event extraction. Our approach allows us to scale up event extraction training instances from thousands to hundreds of thousands, and it does this at a much lower cost than a manual approach. We achieve this by employing distant supervision to automatically create event annotations from unlabelled text using existing structured knowledge bases or tables.We then develop a neural network model with post inference to transfer the knowledge extracted from structured knowledge bases to automatically annotate typed events with corresponding arguments in text.We evaluate our approach by using the knowledge extracted from Freebase to label texts from Wikipedia articles. Experimental results show that our approach can generate a large number of highquality training instances. We show that this large volume of training data not only leads to a better event extractor, but also allows us to detect multiple typed events.",
author = "Ying Zeng and Yansong Feng and Rong Ma and Zheng Wang and Rui Yan and Chongde Shi and Dongyan Zhao",
note = "Copyright c 2018, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.",
year = "2018",
month = jan,
day = "1",
language = "English",
isbn = "9781577358008",
series = "AAAI ",
publisher = "AAAI",
pages = "6045--6052",
booktitle = "The Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18)",

}

RIS

TY - GEN

T1 - Scale Up Event Extraction Learning via Automatic Training Data Generation

AU - Zeng, Ying

AU - Feng, Yansong

AU - Ma, Rong

AU - Wang, Zheng

AU - Yan, Rui

AU - Shi, Chongde

AU - Zhao, Dongyan

N1 - Copyright © 2018, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.

PY - 2018/1/1

Y1 - 2018/1/1

N2 - The task of event extraction has long been investigated in a supervised learning paradigm, which is bound by the number and the quality of the training instances. Existing training data must be manually generated through a combination of expert domain knowledge and extensive human involvement. However, due to the considerable effort required to annotate text, the resulting datasets are usually small, which severely affects the quality of the learned model, making it hard to generalize. Our work develops an automatic approach for generating training data for event extraction. Our approach allows us to scale up event extraction training instances from thousands to hundreds of thousands, and it does this at a much lower cost than a manual approach. We achieve this by employing distant supervision to automatically create event annotations from unlabelled text using existing structured knowledge bases or tables. We then develop a neural network model with post inference to transfer the knowledge extracted from structured knowledge bases to automatically annotate typed events with corresponding arguments in text. We evaluate our approach by using the knowledge extracted from Freebase to label texts from Wikipedia articles. Experimental results show that our approach can generate a large number of high-quality training instances. We show that this large volume of training data not only leads to a better event extractor, but also allows us to detect multiple typed events.

AB - The task of event extraction has long been investigated in a supervised learning paradigm, which is bound by the number and the quality of the training instances. Existing training data must be manually generated through a combination of expert domain knowledge and extensive human involvement. However, due to the considerable effort required to annotate text, the resulting datasets are usually small, which severely affects the quality of the learned model, making it hard to generalize. Our work develops an automatic approach for generating training data for event extraction. Our approach allows us to scale up event extraction training instances from thousands to hundreds of thousands, and it does this at a much lower cost than a manual approach. We achieve this by employing distant supervision to automatically create event annotations from unlabelled text using existing structured knowledge bases or tables. We then develop a neural network model with post inference to transfer the knowledge extracted from structured knowledge bases to automatically annotate typed events with corresponding arguments in text. We evaluate our approach by using the knowledge extracted from Freebase to label texts from Wikipedia articles. Experimental results show that our approach can generate a large number of high-quality training instances. We show that this large volume of training data not only leads to a better event extractor, but also allows us to detect multiple typed events.

M3 - Conference contribution/Paper

SN - 9781577358008

T3 - AAAI

SP - 6045

EP - 6052

BT - The Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18)

PB - AAAI

CY - Palo Alto

ER -
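
The abstract above describes generating training data by distant supervision: facts from a structured knowledge base such as Freebase are aligned with unlabelled sentences, and a sentence is annotated with an event type when it mentions the fact's arguments. The sketch below illustrates that alignment idea only; the toy event type, facts, sentences and the all-arguments-match heuristic are assumptions made for illustration, not the authors' implementation.

# Illustrative sketch of distant supervision for event annotation (not the
# authors' code): a sentence is labelled with an event type when all
# arguments of a knowledge-base fact appear in it.

from dataclasses import dataclass

@dataclass
class EventFact:
    event_type: str   # e.g. a Freebase-style compound value type such as "people.marriage"
    args: dict        # role -> entity string, assumed already linked to surface text

# Toy knowledge base and corpus; in the paper these come from Freebase and Wikipedia.
FACTS = [
    EventFact("people.marriage", {"spouse1": "Barack Obama", "spouse2": "Michelle Obama"}),
]

SENTENCES = [
    "Barack Obama married Michelle Obama in 1992 in Chicago.",
    "Michelle Obama grew up on the South Side of Chicago.",
]

def annotate(sentences, facts):
    """Return (sentence, event_type, {role: span}) triples for sentences that
    contain every argument of a fact -- the distant-supervision assumption
    that such a sentence likely expresses the event."""
    labelled = []
    for sent in sentences:
        for fact in facts:
            spans = {}
            for role, entity in fact.args.items():
                start = sent.find(entity)
                if start == -1:
                    break
                spans[role] = (start, start + len(entity))
            else:  # all arguments were found in this sentence
                labelled.append((sent, fact.event_type, spans))
    return labelled

if __name__ == "__main__":
    for sent, etype, spans in annotate(SENTENCES, FACTS):
        print(etype, spans, "<-", sent)

In the paper, annotations produced this way are then used to train a neural event extractor with post inference; the heuristic sketched here corresponds only to the initial automatic-annotation stage, which inevitably yields noisy labels that the later model must tolerate.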