

Poster in Workshop: Actionable Interpretability

Interpretable EEG-to-Image Generation with Semantic Prompts

Arshak Rezvani · Ali Akbari · Kosar Arani · Maryam Mirian · Emad Arasteh · Martin McKeown

[ Project Page ]
Sat 19 Jul 1 p.m. PDT — 2 p.m. PDT

Abstract:

Decoding visual experience from brain signals offers exciting possibilities for neuroscience and interpretable AI. While EEG is accessible and temporally precise, its limited spatial detail hinders direct image reconstruction. Our model bypasses direct EEG-to-image generation by aligning EEG signals with multilevel semantic captions, ranging from object-level descriptions to abstract themes, generated by a large language model. A transformer-based EEG encoder maps brain activity to these captions through contrastive learning. During inference, caption embeddings retrieved via projection heads condition a pretrained latent diffusion model for image generation. This text-mediated framework yields state-of-the-art visual decoding on the EEGCVPR dataset, with interpretable alignment to known neurocognitive pathways. Dominant EEG-caption associations reflected the importance of the different semantic levels extracted from perceived images. Saliency maps and t-SNE projections reveal semantic topography across the scalp. Our model demonstrates how structured semantic mediation enables cognitively aligned visual decoding from EEG.
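To make the abstract's training setup concrete, the sketch below illustrates one plausible reading of the alignment stage: a transformer EEG encoder, one projection head per semantic caption level, and a symmetric contrastive (CLIP-style InfoNCE) loss against precomputed caption embeddings. All specifics (channel count, embedding sizes, the level names "object"/"scene"/"abstract", linear projection heads, mean pooling, the temperature value) are assumptions for illustration, not details from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EEGEncoder(nn.Module):
    """Transformer encoder mapping a multichannel EEG epoch to one embedding (assumed layout)."""
    def __init__(self, n_channels=128, d_model=256, n_heads=4, n_layers=4):
        super().__init__()
        self.input_proj = nn.Linear(n_channels, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, n_layers)

    def forward(self, eeg):                # eeg: (batch, time, channels)
        x = self.input_proj(eeg)           # (batch, time, d_model)
        x = self.transformer(x)
        return x.mean(dim=1)               # temporal mean pooling -> (batch, d_model)

def contrastive_loss(eeg_emb, text_emb, temperature=0.07):
    """Symmetric InfoNCE loss aligning EEG embeddings with caption embeddings."""
    eeg_emb = F.normalize(eeg_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)
    logits = eeg_emb @ text_emb.t() / temperature
    targets = torch.arange(logits.size(0), device=logits.device)
    return (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets)) / 2

# One projection head per semantic level (level names are hypothetical placeholders).
d_model, d_text = 256, 768                 # d_text: caption-embedding size (assumed)
eeg_encoder = EEGEncoder(d_model=d_model)
proj_heads = nn.ModuleDict({
    level: nn.Linear(d_model, d_text) for level in ["object", "scene", "abstract"]
})

def training_step(eeg, caption_embs):
    """caption_embs: dict mapping each semantic level to (batch, d_text) text embeddings."""
    shared = eeg_encoder(eeg)
    return sum(
        contrastive_loss(proj_heads[level](shared), caption_embs[level])
        for level in proj_heads
    )
```

At inference, the abstract states that caption embeddings retrieved through these projection heads condition a pretrained latent diffusion model; in a sketch like the one above, that would amount to passing the retrieved text embeddings to the diffusion model's cross-attention conditioning input in place of ordinary prompt embeddings.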
