Cross-Lingual Cross-Domain Transfer Learning for Rumor Detection
Eliana Providel, Marcelo Mendoza, Mauricio Solar

This study introduces a novel method that combines propagation-based transfer learning with word embeddings for rumor detection. The approach uses data from resource-rich languages to improve performance in languages where annotated corpora for this task are scarce. We further augment the rumor detection framework with two auxiliary tasks, stance classification and bot detection, to reinforce the primary task. Using the proposed multi-task system, which incorporates cascade learning models, we generate several pre-trained models that are subsequently fine-tuned for rumor detection in English and Spanish. The results show improvements over the baselines, empirically validating the efficacy of the proposed approach: a Macro-F1 of 0.783 is achieved for Spanish and a Macro-F1 of 0.945 for English.
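To illustrate the kind of multi-task setup the abstract describes, the sketch below shows a shared text encoder over word embeddings with one classification head per task (rumor detection, stance classification, bot detection). This is a minimal, hypothetical illustration only: the class name, layer sizes, and the GRU encoder are assumptions for exposition and do not reproduce the paper's cascade learning models; in the pre-train/fine-tune reading of the abstract, all heads would be trained jointly during pre-training and the rumor head would then be fine-tuned on the target language.

```python
import torch
import torch.nn as nn

class MultiTaskRumorModel(nn.Module):
    """Hypothetical shared encoder with per-task heads (illustrative only;
    not the paper's cascade learning architecture)."""

    def __init__(self, vocab_size=30000, embed_dim=300, hidden_dim=128):
        super().__init__()
        # Embedding layer standing in for cross-lingual word embeddings.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Shared sequence encoder reused by all three tasks.
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.rumor_head = nn.Linear(hidden_dim, 2)    # primary task: rumor vs. non-rumor
        self.stance_head = nn.Linear(hidden_dim, 4)   # auxiliary task: stance classes
        self.bot_head = nn.Linear(hidden_dim, 2)      # auxiliary task: bot vs. human

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)          # (batch, seq_len, embed_dim)
        _, hidden = self.encoder(embedded)            # final hidden state: (1, batch, hidden_dim)
        features = hidden.squeeze(0)                  # shared representation per example
        return {
            "rumor": self.rumor_head(features),
            "stance": self.stance_head(features),
            "bot": self.bot_head(features),
        }

# Usage sketch: joint pre-training sums the per-task losses over shared features.
model = MultiTaskRumorModel()
batch = torch.randint(0, 30000, (8, 50))              # 8 dummy sequences of 50 token ids
outputs = model(batch)
print({task: logits.shape for task, logits in outputs.items()})
```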