

Poster in Workshop: Machine Learning for Wireless Communication and Networks (ML4Wireless)

Simplicity is Key: An Unsupervised Pretraining Approach for Sparse Radio Channels

Jonathan Ott · Maximilian Stahlke · Tobias Feigl · Bjoern Eskofier · Christopher Mutschler

Fri 18 Jul 2 p.m. PDT — 3 p.m. PDT

Abstract:

We introduce the Sparse pretrained RadioTransformer (SpaRTran), an unsupervised representation learning approach based on the concept of compressed sensing for radio channels. Our approach learns embeddings that focus on the physical properties of radio propagation, to create the optimal basis for fine-tuning on radio-based downstream tasks. SpaRTran uses a sparse gated autoencoder that induces a simplicity bias in the learned representations, resembling the sparse nature of radio propagation. For signal reconstruction, it learns a dictionary that holds atomic features, which increases flexibility across signal waveforms and spatiotemporal signal patterns. Our experiments show that SpaRTran reduces errors by up to 85% compared to state-of-the-art methods when fine-tuned on radio fingerprinting, a challenging downstream task. In addition, our method requires less pretraining effort and offers greater flexibility, as we train it solely on individual radio signals. SpaRTran serves as an excellent base model that can be fine-tuned for various radio-based downstream tasks, effectively reducing the cost of labeling. It is also significantly more versatile than existing methods and demonstrates superior generalization.
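The abstract describes the core mechanism: a gated autoencoder encodes each radio signal into sparse activations over a learned dictionary of atomic features, and the signal is reconstructed as a sparse linear combination of those atoms. The following is a minimal PyTorch sketch of that idea only; the module name, layer sizes, ReLU gating, and L1 sparsity penalty are illustrative assumptions, not SpaRTran's actual architecture.

```python
# Illustrative sketch (assumptions, not the authors' code): a sparsity-regularized
# autoencoder that reconstructs a radio signal as a sparse combination of
# learned dictionary atoms, in the spirit of compressed sensing.
import torch
import torch.nn as nn

class SparseDictAutoencoder(nn.Module):  # hypothetical name
    def __init__(self, signal_len=256, n_atoms=128, hidden=512):
        super().__init__()
        # Encoder maps the raw signal to non-negative atom activations;
        # the final ReLU acts as a gate that zeroes out inactive atoms.
        self.encoder = nn.Sequential(
            nn.Linear(signal_len, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_atoms),
            nn.ReLU(),
        )
        # Learned dictionary: each row is one atomic signal feature.
        self.dictionary = nn.Parameter(0.01 * torch.randn(n_atoms, signal_len))

    def forward(self, x):
        codes = self.encoder(x)          # (batch, n_atoms) activations
        recon = codes @ self.dictionary  # sparse linear combination of atoms
        return recon, codes

def loss_fn(recon, x, codes, l1_weight=1e-3):
    # Reconstruction error plus an L1 penalty on the activations, which
    # induces the simplicity (sparsity) bias described in the abstract.
    return nn.functional.mse_loss(recon, x) + l1_weight * codes.abs().mean()

# Usage: unsupervised pretraining on individual, unlabeled radio signals.
model = SparseDictAutoencoder()
x = torch.randn(32, 256)  # stand-in batch of radio signals
recon, codes = model(x)
loss = loss_fn(recon, x, codes)
loss.backward()
```

After such pretraining, the encoder's activations would serve as the embedding that is fine-tuned on downstream tasks such as radio fingerprinting.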
