

Poster
in
Workshop: Machine Learning for Wireless Communication and Networks (ML4Wireless)

TCNet: A Unified Framework for CSI Feedback Compression Leveraging Language Model as Lossless Compressor

Zijiu Yang · Qianqian Yang

Fri 18 Jul 11 a.m. PDT — noon PDT

Abstract:

Transformer-based architectures have demonstrated a strong capability to capture global dependencies for CSI feedback, yet their high computational overhead limits practical deployment. In contrast, CNNs are more efficient and excel at extracting local features, but struggle with long-range modeling. To leverage the complementary strengths of both, we propose TCNet, a hybrid framework that combines CNNs with a Swin Transformer to achieve accurate reconstruction at reduced complexity.

Beyond lossy compression, we further introduce a language-model-based lossless coding scheme that significantly improves bit-level efficiency. Unlike conventional fixed-length or entropy-based encoding methods, our approach employs a lightweight language model as a universal probability estimator for variable-length arithmetic coding. To ensure compatibility with communication data, we design an alignment mechanism that maps CSI representations into a token structure suitable for language modeling. This alignment enables our method to generalize to other compression tasks in wireless communications. Experimental results on COST2100 demonstrate that our framework achieves the best NMSE–bit-rate trade-offs, highlighting the potential of integrating language modeling with compression tasks in wireless communications.
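The core idea of the lossless stage — a model that predicts a probability distribution over the next token, which then drives an arithmetic coder — can be illustrated with a minimal sketch. The abstract does not specify the authors' model or coder, so everything here is an assumption: the `predict` function stands in for the lightweight language model (replaced below by a toy adaptive count-based estimator over a 4-symbol vocabulary of quantized CSI tokens), and exact `Fraction` arithmetic is used in place of a practical finite-precision range coder. Because encoder and decoder share the same probability model, the decoder can retrace the same interval subdivisions and recover the token sequence losslessly.

```python
from fractions import Fraction

# Toy token vocabulary standing in for quantized CSI indices (an assumption;
# the paper's alignment mechanism defines the real token structure).
VOCAB = [0, 1, 2, 3]


def predict(history):
    """Stand-in for the language model: Laplace-smoothed counts of past
    tokens give a next-token distribution conditioned on the history."""
    counts = {t: 1 for t in VOCAB}
    for t in history:
        counts[t] += 1
    total = sum(counts.values())
    return {t: Fraction(c, total) for t, c in counts.items()}


def encode(tokens):
    """Arithmetic encoding: narrow [low, high) by each token's probability
    slice, then emit any number inside the final interval."""
    low, high = Fraction(0), Fraction(1)
    for i, tok in enumerate(tokens):
        probs = predict(tokens[:i])
        span = high - low
        cum = Fraction(0)
        for t in VOCAB:
            if t == tok:
                high = low + span * (cum + probs[t])
                low = low + span * cum
                break
            cum += probs[t]
    return (low + high) / 2  # midpoint lies strictly inside the interval


def decode(code, n):
    """Decoding retraces the same subdivisions using the shared model."""
    tokens = []
    low, high = Fraction(0), Fraction(1)
    for _ in range(n):
        probs = predict(tokens)
        span = high - low
        cum = Fraction(0)
        for t in VOCAB:
            new_low = low + span * cum
            new_high = new_low + span * probs[t]
            if new_low <= code < new_high:
                tokens.append(t)
                low, high = new_low, new_high
                break
            cum += probs[t]
    return tokens
```

A better probability model assigns narrower intervals to likely tokens, so the final interval stays wide and fewer bits are needed to identify a point inside it; this is exactly where a language model's predictive power translates into bit-level savings.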
