This study explores the potential benefits of a recently proposed transformer-based multi-task Natural Language Understanding (NLU) architecture, primarily for Intent Recognition on small, domain-specific educational game datasets. The evaluation datasets were collected from children practicing basic math concepts through play-based interactions in game-based learning settings.
(c) European Language Resources Association (ELRA), licensed under CC-BY-NC 4.0