RelationField: Relate Anything in Radiance Fields

Sebastian Koch

Bosch Corporate Research / Ulm University

Johanna Wald

Google

Mirco Colosi

Bosch Corporate Research

Narunas Vaskevicius

Bosch Corporate Research

Federico Tombari

Google

Timo Ropinski

Ulm University

arXiv:2412.13652, 2024

Abstract

Neural radiance fields are an emerging 3D scene representation and have recently even been extended to learn features for scene understanding by distilling open-vocabulary features from vision-language models. However, current methods primarily focus on object-centric representations, supporting object segmentation or detection, while understanding semantic relationships between objects remains largely unexplored. To address this gap, we propose RelationField, the first method to extract inter-object relationships directly from neural radiance fields. RelationField represents relationships between objects as pairs of rays within a neural radiance field, effectively extending its formulation to include implicit relationship queries. To teach RelationField complex, open-vocabulary relationships, relationship knowledge is distilled from multi-modal LLMs. To evaluate RelationField, we address open-vocabulary 3D scene graph generation and relationship-guided instance segmentation tasks, achieving state-of-the-art performance in both.
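To make the ray-pair idea concrete, below is a minimal, hypothetical PyTorch sketch, not the authors' implementation: the module name, feature dimensions, and the concatenation-based fusion are assumptions made for illustration. It fuses features rendered along a subject ray and an object ray into a joint embedding and scores that embedding against text embeddings of relationship queries, as one might do for open-vocabulary predicates distilled from a multi-modal LLM.

# Hypothetical sketch of a relationship query over pairs of rays.
# Names, dimensions, and the fusion scheme are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RelationFieldSketch(nn.Module):
    """Toy relationship head: maps an ordered pair of per-ray feature vectors
    to a joint embedding comparable against text embeddings of relationship
    predicates (e.g. produced by a vision-language model)."""

    def __init__(self, ray_feat_dim: int = 256, rel_embed_dim: int = 512):
        super().__init__()
        # Fuse the ordered pair of ray features; ordering matters because
        # predicates such as "standing on" are directional.
        self.fuse = nn.Sequential(
            nn.Linear(2 * ray_feat_dim, 512),
            nn.ReLU(),
            nn.Linear(512, rel_embed_dim),
        )

    def forward(self, subj_feat: torch.Tensor, obj_feat: torch.Tensor) -> torch.Tensor:
        # subj_feat / obj_feat: [N, ray_feat_dim] features rendered along the
        # subject and object rays by the underlying radiance field.
        joint = torch.cat([subj_feat, obj_feat], dim=-1)
        return F.normalize(self.fuse(joint), dim=-1)


if __name__ == "__main__":
    torch.manual_seed(0)
    head = RelationFieldSketch()

    # Placeholder per-ray features; in a real pipeline these would be
    # rendered from the radiance field along the two query rays.
    subj = torch.randn(4, 256)
    obj = torch.randn(4, 256)
    rel_embed = head(subj, obj)  # [4, 512]

    # Placeholder text embeddings for relationship queries such as
    # "on", "next to", "part of"; in practice these would come from the
    # language model used for distillation.
    text_embed = F.normalize(torch.randn(3, 512), dim=-1)
    scores = rel_embed @ text_embed.T  # open-vocabulary relevance scores
    print(scores.shape)  # torch.Size([4, 3])

Keeping the ray pair ordered (subject first, object second) is a deliberate choice in this sketch, since most spatial and semantic relationships are asymmetric.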

BibTeX

@article{koch2024relationfield,
	title={RelationField: Relate Anything in Radiance Fields},
	author={Koch, Sebastian and Wald, Johanna and Colosi, Mirco and Vaskevicius, Narunas and Hermosilla, Pedro and Tombari, Federico and Ropinski, Timo},
	year={2024},
	journal={arXiv preprint arXiv:2412.13652}
}