Self-Supervised Object Goal Navigation with In-Situ Finetuning


A household robot should be able to navigate to target locations without requiring users to first annotate everything in their home. Current approaches to this object navigation challenge do not test on real robots and rely on expensive semantically labeled 3D meshes. In this work, our aim is an agent that builds self-supervised models of the world via exploration, much as a child might. We propose an end-to-end self-supervised embodied agent that leverages exploration to train a semantic segmentation model of 3D objects, and uses those representations to learn an object navigation policy purely from self-labeled 3D meshes. The key insight is that embodied agents can use location consistency as a supervision signal: collecting images of the same location from different viewpoints and applying contrastive learning to fine-tune a semantic segmentation model. In our experiments, our framework outperforms other self-supervised baselines and performs competitively with supervised baselines, both in simulation and when deployed in real houses.
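The abstract is terse on how location consistency becomes a training signal, so here is a hedged illustration: an InfoNCE-style contrastive loss in which the positive pair is the same 3D location observed from two different viewpoints. This is a minimal sketch under assumed conventions (PyTorch, per-location feature vectors); the function name and tensor shapes are hypothetical, not the authors' released code.

```python
# Minimal sketch of location-consistency contrastive learning (assumed
# PyTorch setup; names and shapes are illustrative, not the paper's code).
# Row i of each tensor holds the embedding of the same 3D location
# observed from two different viewpoints.

import torch
import torch.nn.functional as F

def location_contrastive_loss(feats_view1: torch.Tensor,
                              feats_view2: torch.Tensor,
                              temperature: float = 0.07) -> torch.Tensor:
    """InfoNCE-style loss: pull together embeddings of the same 3D location
    seen from two views; push apart embeddings of other locations."""
    z1 = F.normalize(feats_view1, dim=1)   # (N, D) unit vectors
    z2 = F.normalize(feats_view2, dim=1)
    logits = z1 @ z2.t() / temperature     # (N, N) cosine similarities
    targets = torch.arange(z1.size(0), device=z1.device)
    # Symmetric cross-entropy: each location must match itself across views.
    return 0.5 * (F.cross_entropy(logits, targets)
                  + F.cross_entropy(logits.t(), targets))

# Toy usage: 128 locations with 64-dim features from a segmentation backbone.
f1 = torch.randn(128, 64, requires_grad=True)
f2 = torch.randn(128, 64, requires_grad=True)
loss = location_contrastive_loss(f1, f2)
loss.backward()
```

In practice, such a loss would be applied to features of pixels that back-project to the same 3D point in the agent's map, which is how viewpoint correspondence could be obtained for free during exploration.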



