Multimodal Humor Detection and Social Perception Prediction
The code for this experiment is available here. It also includes the baseline experiment and the modality-contribution analysis (Section 5.1 of our paper).
The code for this experiment is available here. It reproduces the results of the TrF models in Table 1 of our paper.
The code for this experiment is available here. The experiment primarily uses the SpeechBrain toolkit. You can also modify it to evaluate different segments of each recording, such as the first 10 seconds and the last 10 seconds of each sample (Section 5 of our paper).
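Segment-based evaluation can be reproduced by slicing each waveform before feature extraction. The helper below is an illustrative sketch, not the repository's actual code; the function name, the 16 kHz sample rate, and the 10-second default are assumptions.

```python
import numpy as np

def first_last_segments(waveform: np.ndarray, sample_rate: int = 16000,
                        seconds: float = 10.0):
    """Return the first and last `seconds` of a mono waveform.

    Hypothetical helper for the segment-based experiment; if the recording
    is shorter than `seconds`, the full waveform is returned for both.
    """
    n = int(seconds * sample_rate)
    first = waveform[:n]
    last = waveform[-n:] if len(waveform) > n else waveform
    return first, last
```

Each slice can then be fed to the model in place of the full recording to compare how much of the signal the predictions rely on.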
The trait-based ensemble is implemented in a Jupyter notebook. The trait selection for each group is based on a correlation analysis of the training set (see the figure).
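The selection step can be sketched as follows: for a given trait, keep the models whose training-set predictions correlate with the gold labels above a cutoff, then average their test predictions. This is a minimal illustration, not the notebook's code; the function name and the correlation threshold are assumptions.

```python
import numpy as np

def trait_ensemble(train_preds: dict, train_gold: np.ndarray,
                   test_preds: dict, threshold: float = 0.3):
    """Correlation-gated ensemble for one trait (illustrative sketch).

    train_preds / test_preds map model names to prediction arrays;
    models whose training-set Pearson correlation with the gold labels
    meets `threshold` are kept, and their test predictions are averaged.
    """
    selected = [name for name, preds in train_preds.items()
                if np.corrcoef(preds, train_gold)[0, 1] >= threshold]
    ensemble = np.mean([test_preds[name] for name in selected], axis=0)
    return ensemble, selected
```

Running this per trait, with the threshold tuned on the training set, yields a different model subset for each trait group.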
Mehedi Hasan Bijoy, Dejan Porjazovski, Nhan Phan, Guangpu Huang, Tamás Grósz, and Mikko Kurimo. 2024. Multimodal Humor Detection and Social Perception Prediction. In Proceedings of the 5th Multimodal Sentiment Analysis Challenge and Workshop: Social Perception and Humor (MuSe ’24), October 28-November 1, 2024, Melbourne, VIC, Australia. ACM, New York, NY, USA, 5 pages. https://doi.org/10.1145/3689062.3689376
@inproceedings{bijoy2024multimodal,
author = {Bijoy, Mehedi Hasan and Porjazovski, Dejan and Phan, Nhan and Huang, Guangpu and Grósz, Tamás and Kurimo, Mikko},
title = {Multimodal Humor Detection and Social Perception Prediction},
booktitle = {Proceedings of the 5th Multimodal Sentiment Analysis Challenge and Workshop: Social Perception and Humor (MuSe '24)},
year = {2024},
address = {Melbourne, VIC, Australia},
publisher = {Association for Computing Machinery},
doi = {10.1145/3689062.3689376},
}