From images to understanding social interactions

By Catherine Pelachaud, 10 December 2020

The goal of this project is to analyze group interaction. We will rely on existing data (images and videos) of group interactions that have been annotated at several levels (activity, speaking, laughing, non-verbal behavior). We will first make use of the MatchNMingle database (Raman & Hung, 2019).
Several steps are foreseen:
1. Perform a literature survey on F-formation detection.
2. Extract regions of interest in static images, compute their descriptors, and classify the
situation in terms of F-formation.
3. Extend the model to videos.
4. Analyze social actions, such as predicting who the next speaker will be or inferring the social
relationship between interactants.
5. Depending on the project achievements, evaluate the results using virtual agents.
MatchNMingle dataset: http://matchmakers.ewi.tudelft.nl/matchnmingle/pmwiki/index.php?n=Main.TheDataset
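To make step 2 concrete, the sketch below groups people into F-formations from their 2D positions and head orientations using a simplified o-space voting scheme: each person votes for a point a fixed stride along their facing direction, and people whose votes fall close together are grouped. The function name, the stride, and the grouping radius are illustrative assumptions, not part of the project description; real detectors estimate the o-space far more robustly.

```python
import math
import numpy as np

def detect_f_formations(positions, orientations, stride=1.0, radius=0.75):
    """Naive F-formation grouping: each person casts a vote for an
    o-space centre `stride` metres along their facing direction;
    people whose votes lie within `radius` of one another are merged
    into the same group (union-find). All parameters are illustrative."""
    positions = np.asarray(positions, dtype=float)
    orientations = np.asarray(orientations, dtype=float)
    votes = positions + stride * np.stack(
        [np.cos(orientations), np.sin(orientations)], axis=1)

    n = len(votes)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    # Merge every pair of people whose o-space votes are close.
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(votes[i] - votes[j]) <= radius:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return [sorted(g) for g in groups.values()]

# Two people facing each other form one group; a third, far away and
# facing elsewhere, ends up alone.
people = [(0.0, 0.0), (2.0, 0.0), (5.0, 5.0)]
headings = [0.0, math.pi, math.pi / 2]  # radians
print(detect_f_formations(people, headings))  # → [[0, 1], [2]]
```

In a full pipeline, the positions and orientations would come from the detectors of step 2 applied to the MatchNMingle images rather than being given by hand.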
References:
A. Kendon: Conducting interaction: Patterns of behavior in focused encounters. Cambridge University
Press, 1990.
L. Cabrera-Quiros, A. Demetriou, E. Gedik, L. van der Meij and H. Hung: The MatchNMingle dataset: a
novel multi-sensor resource for the analysis of social interactions and group dynamics in-the-wild
during free-standing conversations and speed dates. IEEE Transactions on Affective Computing, 2018.
C. Raman, H. Hung: Towards automatic estimation of conversation floors within F-formations. 8th
International Conference on Affective Computing and Intelligent Interaction Workshops and Demos
(ACIIW), 2019.

Location
LIP6
Supervisor
Isabelle Bloch
Co-supervisor
Catherine Pelachaud
University advisor
n/a
Tags
Assigned
No
Year
2021