Object removal in 3D space is a key technology for immersive applications such as virtual reality (VR), augmented reality (AR), and the metaverse. While recent approaches have attempted to address this task using 2D inpainting techniques, they often suffer from two major limitations: (1) inaccurate geometric restoration in the removed regions, and (2) visual inconsistency across multiple viewpoints. To address these challenges, we propose a novel pipeline built upon the Gaussian Splatting framework. First, we perform geometry-aware inpainting by leveraging a pre-trained point cloud completion model together with a coarse-to-fine inference strategy, enabling accurate restoration of unseen 3D structures. Next, we introduce a projection refinement network that improves the appearance of novel-view projections by correcting view-dependent artifacts such as color shifts and texture misalignments. Finally, our method enhances overall scene consistency by fine-tuning the original Gaussian Splatting representation with the refined multi-view images. Experimental results show that our method produces geometrically accurate and visually coherent outputs, even in challenging 360° panoramic scenes, significantly outperforming existing methods.
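The three stages described above can be sketched as a minimal pipeline skeleton. This is purely an illustrative sketch: every function name and stub below is a hypothetical stand-in (centroid filling, intensity normalization, and a crude parameter update), not the authors' actual completion model, refinement network, or fine-tuning procedure.

```python
import numpy as np

def complete_point_cloud(points, removal_mask):
    """Stage 1 stub: geometry-aware completion of the removed region.
    A real system would run a pre-trained point cloud completion model
    coarse-to-fine; here we simply fill masked points with the centroid
    of the visible geometry as a placeholder."""
    filled = points.copy()
    filled[removal_mask] = points[~removal_mask].mean(axis=0)
    return filled

def refine_projection(view):
    """Stage 2 stub: per-view appearance refinement.
    A real refinement network would correct color shifts and texture
    misalignments; here we just rescale intensities to [0, 1]."""
    lo, hi = view.min(), view.max()
    return (view - lo) / (hi - lo + 1e-8)

def finetune_gaussians(gaussian_params, refined_views, steps=10, lr=0.1):
    """Stage 3 stub: fine-tune the Gaussian Splatting representation
    against the refined multi-view images. Here a scalar target derived
    from the views stands in for the photometric loss."""
    target = np.stack(refined_views).mean()
    for _ in range(steps):
        gaussian_params = gaussian_params + lr * (target - gaussian_params)
    return gaussian_params

def remove_object(points, removal_mask, views, gaussian_params):
    """Run the three stages in sequence."""
    geometry = complete_point_cloud(points, removal_mask)
    refined = [refine_projection(v) for v in views]
    updated = finetune_gaussians(gaussian_params, refined)
    return geometry, refined, updated
```

The sketch only conveys the data flow: completed geometry feeds novel-view rendering, each rendered view is refined independently, and the refined set jointly supervises the final fine-tuning pass.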