Poster in Workshop: Machine Learning for Wireless Communication and Networks (ML4Wireless)
Maximizing Channel Capacity in Semantic Communication: A Classifier-Based Mutual Information Estimation Approach
Xu Wang · Di Wang · Zheng Shi · Guanghua Yang
Semantic communication shifts the focus from traditional bit-level transmission to accurately conveying meaning and context. While recent methods use variational mutual information (MI) estimators to maximize channel capacity, these estimators often suffer from high variance and become unreliable when samples are limited. To address these issues, we propose a novel MI estimation approach that trains a probabilistic classifier to distinguish true input-output signal pairs from randomly shuffled ones. This framework improves training stability and provides reliable guidance for training the semantic encoder. Experimental results on a text transmission task show that our model outperforms state-of-the-art end-to-end semantic communication systems and conventional source-channel coding schemes.
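To illustrate the general idea of classifier-based MI estimation described in the abstract, the sketch below trains a binary classifier to separate true (input, output) pairs from within-batch shuffled ones; the classifier's logit then approximates the log density ratio, whose mean over joint samples gives an MI estimate. This is a minimal, hedged sketch in PyTorch under assumed dimensions and names (MICritic, mi_lower_bound are hypothetical and not taken from the paper), not the authors' actual implementation.

```python
import torch
import torch.nn as nn

class MICritic(nn.Module):
    """Binary classifier scoring whether (x, y) is a true pair or a shuffled one."""
    def __init__(self, x_dim, y_dim, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + y_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),  # outputs an unnormalized logit
        )

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=-1)).squeeze(-1)

def classifier_mi_step(critic, x, y):
    """One estimation step: shuffling y within the batch approximates p(x)p(y)."""
    y_shuffled = y[torch.randperm(y.size(0))]
    joint_logits = critic(x, y)           # should be classified as "paired"
    marg_logits = critic(x, y_shuffled)   # should be classified as "shuffled"
    bce = nn.functional.binary_cross_entropy_with_logits
    # Classifier training loss: real pairs labeled 1, shuffled pairs labeled 0.
    loss = bce(joint_logits, torch.ones_like(joint_logits)) + \
           bce(marg_logits, torch.zeros_like(marg_logits))
    # With equal class priors, the optimal logit equals log p(x,y) / (p(x)p(y)),
    # so its mean over joint samples serves as a mutual-information estimate.
    mi_estimate = joint_logits.mean()
    return loss, mi_estimate
```

In a setup like this, one would alternate between minimizing `loss` with respect to the critic and maximizing `mi_estimate` with respect to the semantic encoder, so the encoder is pushed toward channel inputs and outputs with high mutual information.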