Poster
Spherical-Nested Diffusion Model for Panoramic Image Outpainting
Xiancheng Sun · Senmao Ma · Shengxi Li · Mai Xu · Jingyuan Xia · Lai Jiang · Xin Deng · Jiali Wang
East Exhibition Hall A-B #E-3108
Panoramic image outpainting plays a pivotal role in immersive content generation, allowing for seamless restoration and completion of panoramic content. Given that the majority of generative outpainting solutions operate on planar images, existing methods for panoramic images address the spherical nature only through soft regularisation during end-to-end learning, which fails to fully exploit the spherical content. In this paper, we make the first attempt to impose the spherical nature in the design of the diffusion model itself, such that the panoramic format is intrinsically ensured during the learning procedure; we name this the spherical-nested diffusion (SpND) model. This is achieved by employing spherical noise in the diffusion process to encode the structural prior, together with a newly proposed spherical deformable convolution (SDC) module to intrinsically learn panoramic knowledge. The proposed method is effectively integrated into a pre-trained diffusion model and outperforms existing state-of-the-art methods for panoramic image outpainting. In particular, our SpND method reduces FID values by more than 50\% against the state-of-the-art PanoDiffusion method. Code is publicly available at \url{https://github.com/chronos123/SpND}.
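The abstract does not specify how the spherical noise is constructed; one plausible interpretation is latitude-aware noise on the equirectangular grid, where per-pixel standard deviation follows the solid-angle weight $\cos(\text{latitude})$ so that noise energy is uniform on the sphere rather than on the planar image. The sketch below illustrates only this hypothetical reading, not the authors' actual formulation; the function name and weighting scheme are assumptions.

```python
import numpy as np

def spherical_noise(height, width, seed=None):
    """Hypothetical latitude-weighted Gaussian noise for an equirectangular grid.

    Rows near the poles cover a smaller solid angle, so their noise standard
    deviation is shrunk by sqrt(cos(latitude)) to keep noise energy roughly
    uniform over the sphere. This is an illustrative guess, not the SpND model.
    """
    rng = np.random.default_rng(seed)
    # Latitude of each pixel-row centre, in [-pi/2, pi/2)
    lat = (np.arange(height) + 0.5) / height * np.pi - np.pi / 2
    # Std scales with the square root of the per-row area weight cos(lat)
    weight = np.sqrt(np.cos(lat))
    noise = rng.standard_normal((height, width))
    return noise * weight[:, None]
```

Under this construction, rows near the poles receive noise with much smaller variance than rows near the equator, mirroring the area distortion of the equirectangular projection.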