Traditional implicit aspect extraction methods require extensive manual feature engineering, which makes them inefficient on large-scale data. To address this problem, a BERT-based model combined with different classifiers is proposed. In this paper, the text is fed into BERT to obtain embeddings; the embeddings from the final hidden layer are then passed to different classifiers to extract implicit aspects. Five classic text classifiers are used for comparison. The experimental results show that the proposed method outperforms state-of-the-art works, reaching an accuracy of 78.11% and a macro-F1 score of 73.19% on the SemEval implicit aspect extraction data set.
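A minimal sketch of the pipeline described in the abstract: encode each sentence with BERT, take the final-hidden-layer [CLS] embedding, and train a classic classifier on top. The checkpoint name (bert-base-uncased), the choice of a linear SVM, and the toy sentences and labels are illustrative assumptions, not the paper's exact setup.

# Sketch: BERT final-hidden-layer embeddings + a classic classifier
import torch
from transformers import BertTokenizer, BertModel
from sklearn.svm import LinearSVC

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")
bert.eval()

def embed(sentences):
    """Return final-hidden-layer [CLS] embeddings for a list of sentences."""
    enc = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = bert(**enc)
    # [CLS] token of the last hidden layer as the sentence representation
    return out.last_hidden_state[:, 0, :].numpy()

# Hypothetical toy data: sentences labeled with their implicit aspect
train_texts = ["The battery dies after an hour.", "It fits easily in my pocket."]
train_labels = ["battery", "size"]

clf = LinearSVC()                      # one example of a classic classifier
clf.fit(embed(train_texts), train_labels)
print(clf.predict(embed(["Barely lasts through the morning."])))

Any of the five classic classifiers compared in the paper could be swapped in here in place of the SVM; only the classifier object changes, the BERT embedding step stays the same.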
BERT-based implicit aspect extraction
20.10.2021
984290 byte
Conference paper
Electronic resource
English
Information Extraction from Contract Based on BERT-BiLSTM-CRF
TIBKAT | 2021
Online Contents | 1998
BERT for Aviation Text Classification
TIBKAT | 2023
AIAA | 2023