Investigations in Emotion Aware Multimodal Gender Prediction Systems From Social Media Data

Abstract:

Gender plays a crucial role in improving the performance of personalized systems. However, privacy concerns and the anonymity afforded by social media allow users to withhold such details. Based on the intuition that the content posted by male and female users differs, the gender of a social media account holder can be predicted from their posts, which may be multimodal (text + image) in nature. In this article, we investigate various emotion-assisted multimodal gender prediction models. The developed models use gated recurrent units (GRUs) and ResNets to extract features from tweets and images, respectively. The distribution of emotion categories is related to the gender of the target person. Motivated by this observation, this article describes the first attempt to use multimodal (posted text and image) information for gender prediction in a multitask setting with emotion recognition as an auxiliary task. The PAN-2018 dataset, enriched with emotion labels alongside its gender labels, is used to train the gender and emotion networks. Several models are developed to improve gender prediction by generating better emotion-aware features. The results show that the proposed multimodal emotion-aware model outperforms unimodal (text-only and image-only) models and state-of-the-art systems on the benchmark PAN-2018 dataset.
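To make the described architecture concrete, the following is a minimal sketch, not the authors' released implementation: it assumes PyTorch and torchvision, a GRU text encoder, a ResNet-18 image encoder with its classifier head removed, fusion by concatenation, and a down-weighted auxiliary emotion loss. All layer sizes, the class name `EmotionAwareGenderModel`, and the weight `aux_weight` are illustrative assumptions.

```python
# Minimal sketch (assumptions labeled above) of a multitask model with
# gender prediction as the main task and emotion recognition as an
# auxiliary task, as outlined in the abstract.
import torch
import torch.nn as nn
import torchvision


class EmotionAwareGenderModel(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=256,
                 n_genders=2, n_emotions=6):
        super().__init__()
        # Text branch: token embeddings fed to a GRU; the last hidden
        # state serves as the tweet-level feature.
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # Image branch: ResNet-18 backbone with its classifier removed,
        # yielding 512-dimensional image features.
        resnet = torchvision.models.resnet18(weights=None)
        resnet.fc = nn.Identity()
        self.image_encoder = resnet
        fused_dim = hidden_dim + 512
        # Main task head (gender) and auxiliary task head (emotion)
        # share the fused multimodal representation.
        self.gender_head = nn.Linear(fused_dim, n_genders)
        self.emotion_head = nn.Linear(fused_dim, n_emotions)

    def forward(self, token_ids, images):
        _, h_n = self.gru(self.embedding(token_ids))  # h_n: (1, B, hidden_dim)
        text_feat = h_n[-1]                           # (B, hidden_dim)
        img_feat = self.image_encoder(images)         # (B, 512)
        fused = torch.cat([text_feat, img_feat], dim=1)
        return self.gender_head(fused), self.emotion_head(fused)


def multitask_loss(gender_logits, emotion_logits, gender_y, emotion_y,
                   aux_weight=0.5):
    # The auxiliary emotion loss is down-weighted so that gender
    # prediction remains the primary objective.
    ce = nn.functional.cross_entropy
    return ce(gender_logits, gender_y) + aux_weight * ce(emotion_logits, emotion_y)
```

Under this setup, the emotion head acts as a regularizer during training: it shapes the shared fused representation into emotion-aware features, which the abstract credits for the improvement on the gender prediction task.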