Poster
Improving Multimodal Learning Balance and Sufficiency through Data Remixing
Xiaoyu Ma · Hao Chen · Yongjian Deng
The modality imbalance problem refers to the phenomenon in which, during multimodal joint training, the strong modality suppresses the learning of the weak one. In our study, however, we observe that the weak modality can likewise interfere with the learning of the strong one. We investigate this phenomenon and introduce the concept of modality clash to describe it. To address these issues, we propose an adaptive data allocation mechanism called Data Remixing. It decouples multimodal inputs at the sample level, assigning each sample to the modality most appropriate for its training, which yields more balanced learning across modalities. It then reassembles the resulting unimodal inputs at the batch level to further mitigate cross-modal interference. Extensive experiments show that our approach performs well on multimodal co-decision tasks, significantly enhancing both unimodal and multimodal representation capabilities.
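The sample-level allocation and batch-level reassembly described above can be sketched roughly as follows. This is a minimal illustration, not the paper's actual method: the `confidence` criterion, modality names, and data format are all hypothetical placeholders for whatever per-sample evaluation Data Remixing really uses.

```python
def data_remix(samples, confidence):
    """Hypothetical sketch of Data Remixing's two stages.

    samples: list of dicts mapping modality name -> input,
             e.g. {"audio": ..., "visual": ...}
    confidence: function(sample, modality) -> float, a stand-in
                for the paper's per-sample evaluation criterion.
    Returns a dict of unimodal batches, one per modality.
    """
    batches = {}
    for s in samples:
        # Sample-level decoupling: keep only the modality this sample
        # is assigned to, so the other modality cannot interfere.
        chosen = max(s.keys(), key=lambda m: confidence(s, m))
        batches.setdefault(chosen, []).append(s[chosen])
    # Batch-level reassembly: each batch now holds inputs of a single
    # modality, so each training step updates one modality at a time.
    return batches

# Toy usage with a fabricated confidence score (even-indexed samples
# are routed to audio, odd-indexed to visual).
samples = [{"audio": ("a", i), "visual": ("v", i)} for i in range(4)]
conf = lambda s, m: 1.0 if (s[m][1] % 2 == 0) == (m == "audio") else 0.0
print(data_remix(samples, conf))
```

Each resulting batch is unimodal, so a forward/backward pass on it trains only the network branch for that modality.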