Improving NLU Training over Linked Data with Placeholder Concepts

Publication type
Conference
Author(s)
T. Schmitt, C. Kulbach, Y. Sure-Vetter
Year
2019
Address
Karlsruhe
Abstract
Conversational systems, also known as dialogue systems, have become increasingly popular. They can perform a variety of tasks, e.g., in B2C areas such as sales and customer service. A significant amount of research has already been conducted on improving the underlying algorithms of the natural language understanding (NLU) component of dialogue systems. This paper presents an approach to generating training datasets for the NLU component from Linked Data resources. We analyze how differently designed training datasets impact the performance of the NLU component. As our core contribution, we introduce and evaluate the performance of different placeholder concepts. Our results show that a model trained with placeholder concepts is capable of handling dynamic Linked Data without retraining the NLU component. Thus, our approach also contributes to the robustness of the NLU component.
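The abstract's core idea, training the NLU component on placeholder tokens instead of concrete entity labels so that new Linked Data entities can be handled without retraining, can be illustrated with a minimal sketch. All names here (the templates, the entity labels, and the `ENTITY` placeholder token) are illustrative assumptions, not the authors' actual implementation:

```python
# Hypothetical sketch: generating NLU training utterances from Linked Data,
# once with concrete entity labels and once with a generic placeholder token.

TEMPLATES = [
    "Show me information about {entity}",
    "What do you know about {entity}?",
]

# Entity labels as they might be fetched from a Linked Data source
# (e.g., via a SPARQL query); purely illustrative values.
entities = ["Karlsruhe", "Albert Einstein"]

PLACEHOLDER = "ENTITY"  # generic token the NLU model learns to slot-fill


def concrete_examples(templates, entity_labels):
    """Training set tied to concrete entity labels; new entities
    would require regenerating the data and retraining the model."""
    return [t.format(entity=e) for t in templates for e in entity_labels]


def placeholder_examples(templates):
    """Training set using a placeholder concept; entity values are
    resolved at query time, so no retraining is needed when the
    underlying Linked Data changes."""
    return [t.format(entity=PLACEHOLDER) for t in templates]


print(concrete_examples(TEMPLATES, entities))
print(placeholder_examples(TEMPLATES))
```

With concrete labels, the training set grows with (and is bound to) the entities known at training time; with the placeholder variant, the same two templates cover any entity the Linked Data source later provides.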
Research fields
Knowledge Management and Social Media for Enterprise 2.0
Download .bib
Entered by
Cedric Kulbach