Authors: Hazem Hajj, Reem A. Mahmoud, Wissam Antoun, Tarek Naous
DOI:
Keywords: Empathy, Arabic, BLEU, Transformer (machine learning model), Perplexity, Leverage (statistics), Natural language processing, Knowledge transfer, Natural language generation, Language model, Natural language understanding, Computer science, Artificial intelligence
Abstract: Enabling empathetic behavior in Arabic dialogue agents is an important aspect of building human-like conversational models. While Natural Language Processing (NLP) has seen significant advances in Natural Language Understanding (NLU) with language models such as AraBERT, Natural Language Generation (NLG) remains a challenge. The shortcomings of NLG encoder-decoder models are primarily due to the lack of Arabic datasets suitable to train such agents. To overcome this issue, we propose a transformer-based encoder-decoder model initialized with AraBERT parameters. By initializing the weights of the encoder and decoder with pre-trained weights, our model was able to leverage knowledge transfer and boost performance in response generation. To enable empathy in our model, we train it using the ArabicEmpatheticDialogues dataset and achieve high performance in empathetic response generation. Specifically, our model achieved a low perplexity value of 17.0 and an increase of 5 BLEU points compared to the previous state-of-the-art model. Also, the proposed model was rated highly by 85 human evaluators, validating its capability in exhibiting empathy while generating relevant and fluent responses in open-domain settings.
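The key idea described in the abstract, warm-starting both the encoder and the decoder of a transformer-based sequence-to-sequence model from pre-trained AraBERT weights, can be sketched with the Hugging Face transformers EncoderDecoderModel API. The snippet below is a minimal illustration, not the authors' implementation; the checkpoint id aubmindlab/bert-base-arabert and the fine-tuning setup are assumptions made for the example.

```python
# Minimal sketch: a BERT2BERT-style encoder-decoder where both sides are
# initialized from pre-trained AraBERT weights (Hugging Face transformers).
# The checkpoint id below is an assumption for illustration.
from transformers import AutoTokenizer, EncoderDecoderModel

checkpoint = "aubmindlab/bert-base-arabert"  # assumed AraBERT checkpoint id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# Warm-start both encoder and decoder from the same AraBERT weights;
# the cross-attention layers are newly initialized and must still be learned.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(checkpoint, checkpoint)

# Special-token configuration required for sequence-to-sequence generation.
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.eos_token_id = tokenizer.sep_token_id
model.config.pad_token_id = tokenizer.pad_token_id

# The model would still need fine-tuning on an empathetic dialogue corpus
# (e.g. ArabicEmpatheticDialogues) before its responses are meaningful;
# the call below only demonstrates the generation API.
inputs = tokenizer("أشعر بالحزن اليوم", return_tensors="pt")
output_ids = model.generate(inputs.input_ids, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

After this warm-started model is fine-tuned on an empathetic dialogue dataset, the pre-trained weights on both sides let it reach strong fluency with relatively little Arabic training data, which is the knowledge-transfer effect the abstract refers to.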