Using Multimodal Information to Enhance Addressee Detection in Multiparty Interaction

Abstract: Addressee detection is an important challenge to tackle in order to improve dialogical interaction between humans and agents. This detection, essential for turn-taking models, is a hard task in multiparty settings. Both rule-based and statistical approaches have been explored. Statistical approaches, particularly deep learning approaches, require large amounts of training data. However, smart feature selection can improve addressee detection on small datasets, particularly when multimodal information is available. In this article, we propose a statistical approach based on smart feature selection that exploits contextual and multimodal information for addressee detection. The results show that our model outperforms an existing baseline.
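The idea of feature selection over multimodal cues sketched in the abstract can be illustrated with a minimal, self-contained example: rank candidate features by their empirical mutual information with the addressee label. All feature names and data below are hypothetical toy values chosen for illustration; they are not taken from the paper, which does not publish its feature set here.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information I(X;Y), in bits, between two discrete sequences."""
    n = len(xs)
    px = Counter(xs)
    py = Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log2( p(x,y) / (p(x) * p(y)) ), with counts rewritten over n
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi

# Hypothetical toy data: each index is one utterance in a multiparty exchange.
# Illustrative binary features (not the paper's actual feature set):
#   gaze_at_agent      - the speaker looks at the agent while speaking
#   second_person      - the utterance contains a second-person pronoun
#   prev_speaker_agent - the previous turn was taken by the agent
# Label: 1 if the agent is the addressee, else 0.
gaze  = [1, 1, 0, 1, 0, 0, 1, 0]
pron  = [1, 0, 0, 1, 1, 0, 1, 0]
prev  = [0, 1, 1, 0, 0, 1, 1, 0]
label = [1, 1, 0, 1, 0, 0, 1, 0]

features = {"gaze_at_agent": gaze, "second_person": pron, "prev_speaker_agent": prev}
scores = {name: mutual_information(vals, label) for name, vals in features.items()}

# Rank features from most to least informative about the addressee label.
for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {s:.3f}")
```

On this toy data the gaze feature perfectly predicts the label (score 1.000 bits), the pronoun feature is partially informative, and the previous-speaker feature is independent of the label (score 0.000), so a selection step would keep the first two. A real pipeline would compute such scores on held-out data and feed the retained features to a classifier.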
Document type: Conference papers

Cited literature: 29 references
Contributor: Julien Saunier
Submitted on: Thursday, May 2, 2019 - 2:53:53 PM
Last modification on: Wednesday, March 2, 2022 - 10:10:11 AM


Files produced by the author(s)


  • HAL Id: hal-02117658, version 1


Usman Malik, Mukesh Barange, Julien Saunier, Alexandre Pauchet. Using Multimodal Information to Enhance Addressee Detection in Multiparty Interaction. International Conference on Agents and Artificial Intelligence, Feb 2019, Prague, Czech Republic. pp.267-274. ⟨hal-02117658⟩


