
Research Article
Event Annotation Enhanced Pre-trained Language Model in Event Extraction
@inproceedings{10.1007/978-3-031-23902-1_22,
  author    = {Qisen Xi and Yizhi Ren and Guohua Wu and Qiuhua Wang and Lifeng Yuan and Zhen Zhang},
  title     = {Event Annotation Enhanced Pre-trained Language Model in Event Extraction},
  booktitle = {Mobile Multimedia Communications: 15th EAI International Conference, MobiMedia 2022, Virtual Event, July 22--24, 2022, Proceedings},
  publisher = {Springer},
  year      = {2023},
  month     = feb,
  keywords  = {Event extraction, Event annotation, Pre-trained language model},
  doi       = {10.1007/978-3-031-23902-1_22}
}
Qisen Xi
Yizhi Ren
Guohua Wu
Qiuhua Wang
Lifeng Yuan
Zhen Zhang
Year: 2023
Event Annotation Enhanced Pre-trained Language Model in Event Extraction
MOBIMEDIA
Springer
DOI: 10.1007/978-3-031-23902-1_22
Abstract
Event extraction is a crucial task that aims to extract event information from text. Existing methods usually use pre-trained language models to extract events and have achieved state-of-the-art performance. However, these models do not account for the complexity of event structure and make little use of event knowledge. To address these problems, we propose a new framework, termed EABERT, that explicitly integrates event annotations into the pre-trained model. Specifically, event annotations are incorporated into the model input in the form “[CLS]sentence[SEP]event annotation[SEP]”, which allows the model to encode the semantic relationship between the text and event knowledge. To supply appropriate event annotations to the model, we further use a bilateral-branch BERT network to train the event type classifier, improving the accuracy of the annotations. Experiments on the event extraction benchmark dataset (ACE 2005) show that our proposed framework significantly outperforms previous methods.
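The paired-input format described in the abstract can be sketched as follows. This is a minimal illustration of the standard BERT two-segment input convention applied to the "[CLS]sentence[SEP]event annotation[SEP]" construction; the function name and the example ACE-style annotation are hypothetical, not taken from the paper.

```python
def build_event_annotated_input(sentence_tokens, annotation_tokens):
    """Concatenate a sentence and its event annotation into one BERT-style
    input, returning the token sequence and matching segment ids."""
    tokens = ["[CLS]"] + sentence_tokens + ["[SEP]"] + annotation_tokens + ["[SEP]"]
    # Segment 0 covers [CLS] + sentence + first [SEP]; segment 1 covers the
    # annotation + final [SEP], so self-attention can relate the two spans.
    segment_ids = [0] * (len(sentence_tokens) + 2) + [1] * (len(annotation_tokens) + 1)
    return tokens, segment_ids

# Illustrative usage: the annotation text here is a made-up placeholder.
tokens, segments = build_event_annotated_input(
    ["He", "was", "arrested", "yesterday"],
    ["Justice", ":", "Arrest", "-", "Jail"],
)
```

Encoding text and annotation jointly, rather than classifying the sentence alone, lets the model attend between the sentence tokens and the event-knowledge tokens in every layer.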