
Commit 0de958a (parent 7212fb7)

added rayfronts arxiv to publications and rayfronts website to research page

3 files changed: 34 additions and 0 deletions

_bibliography/references.bib

Lines changed: 11 additions and 0 deletions
@@ -1,3 +1,14 @@
+@misc{alama2025rayfronts,
+  title={RayFronts: Open-Set Semantic Ray Frontiers for Online Scene Understanding and Exploration},
+  author={Omar Alama and Avigyan Bhattacharya and Haoyang He and Seungchan Kim and Yuheng Qiu and Wenshan Wang and Cherie Ho and Nikhil Keetha and Sebastian Scherer},
+  year={2025},
+  eprint={2504.06994},
+  archivePrefix={arXiv},
+  primaryClass={cs.RO},
+  url={https://arxiv.org/pdf/2504.06994},
+  video={https://youtu.be/fFSKUBHx5gA},
+  abstract={Open-set semantic mapping is crucial for open-world robots. Current mapping approaches either are limited by the depth range or only map beyond-range entities in constrained settings, where overall they fail to combine within-range and beyond-range observations. Furthermore, these methods make a trade-off between fine-grained semantics and efficiency. We introduce RayFronts, a unified representation that enables both dense and beyond-range efficient semantic mapping. RayFronts encodes task-agnostic open-set semantics to both in-range voxels and beyond-range rays encoded at map boundaries, empowering the robot to reduce search volumes significantly and make informed decisions both within & beyond sensory range, while running at 8.84 Hz on an Orin AGX. Benchmarking the within-range semantics shows that RayFronts's fine-grained image encoding provides 1.34x zero-shot 3D semantic segmentation performance while improving throughput by 16.5x. Traditionally, online mapping performance is entangled with other system components, complicating evaluation. We propose a planner-agnostic evaluation framework that captures the utility for online beyond-range search and exploration, and show RayFronts reduces search volume 2.2x more efficiently than the closest online baselines.}
+}
 @misc{baek2025pipe,
   title={PIPE Planner: Pathwise Information Gain with Map Predictions for Indoor Robot Exploration},
   author={Seungjae Baek and Brady Moon and Seungchan Kim and Muqing Cao and Cherie Ho and Sebastian Scherer and Jeonghwan Jeon},
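BibTeX entries like the one added above are easy to get subtly wrong (a missing brace or comma can silently drop fields in some tooling). Below is a minimal, hypothetical sanity-check sketch in Python; the regex-based `bib_fields` helper is illustrative only and not part of this repository, and it assumes one `key={value}` field per line with no nested braces, which holds for this entry:

```python
import re

def bib_fields(entry: str) -> dict:
    """Extract simple key={value} fields from a single BibTeX entry.

    Assumes one field per line and no nested braces inside values,
    which is true for the entry added in this commit.
    """
    fields = {}
    for m in re.finditer(r"(\w+)\s*=\s*\{(.*)\},?\s*$", entry, re.MULTILINE):
        fields[m.group(1)] = m.group(2)
    return fields

entry = """@misc{alama2025rayfronts,
  title={RayFronts: Open-Set Semantic Ray Frontiers for Online Scene Understanding and Exploration},
  year={2025},
  eprint={2504.06994},
  archivePrefix={arXiv},
}"""

f = bib_fields(entry)
print(f["eprint"])  # 2504.06994
```

A fuller check would use a real parser (e.g. the third-party bibtexparser package), but a quick field dump like this is enough to confirm the required arXiv fields (`eprint`, `archivePrefix`, `primaryClass`) survived the edit.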

_posts/2025-04-10-rayfronts.md

Lines changed: 23 additions and 0 deletions
@@ -0,0 +1,23 @@
+---
+layout: post
+title: "RayFronts: Open-Set Semantic Ray Frontiers for Online Scene Understanding and Exploration"
+date: 2025-04-10 10:33:01
+categories: research
+description: "RayFronts is a real-time semantic mapping system that enables fine-grained scene understanding both within and beyond the depth perception range, allowing robots to localize and effectively limit search volumes."
+author: "Seungchan Kim"
+published: true
+redirect: "https://rayfronts.github.io/"
+show_sidebar: false
+# slim_content_width: true
+permalink: /rayfronts/
+image: /img/posts/2025-04-10-rayfronts/rayfronts-teaser.gif
+datatable: true
+title_image: None
+hero_image: /img/posts/2025-04-10-rayfronts/rayfronts-teaser.gif
+hero_height: is-large
+remove_hero_title: false
+menubar_toc: false
+tags: Perception, Learning
+---
+
+RayFronts is a real-time semantic mapping system that enables fine-grained scene understanding both within and beyond the depth perception range, allowing robots to localize and effectively limit search volumes. RayFronts can be queried with open-set images and text within the map and beyond it.
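Jekyll only picks up a post when its front matter block is well-formed, so a quick check of the block added above can catch a missing `---` delimiter or a dropped key before pushing. The `front_matter` helper below is a hypothetical stdlib-only sketch, not part of this site's tooling; it assumes flat `key: value` lines (true for this post), where nested YAML would need a real parser such as PyYAML:

```python
def front_matter(text: str) -> dict:
    """Parse the leading '---' block of a Jekyll post into a flat dict.

    Assumes simple 'key: value' lines; skips blanks and commented-out
    keys like '# slim_content_width: true'.
    """
    parts = text.split("---", 2)
    if len(parts) < 3:
        raise ValueError("missing front matter delimiters")
    meta = {}
    for line in parts[1].splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and commented-out keys
        key, _, value = line.partition(":")  # split at the FIRST colon only
        meta[key.strip()] = value.strip().strip('"')
    return meta

post = '''---
layout: post
title: "RayFronts: Open-Set Semantic Ray Frontiers"
published: true
redirect: "https://rayfronts.github.io/"
---
RayFronts is a real-time semantic mapping system.'''

meta = front_matter(post)
print(meta["layout"], meta["redirect"])
```

Splitting at the first colon matters here: the `title` and `redirect` values themselves contain colons, so a naive `line.split(":")` would truncate them.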
img/posts/2025-04-10-rayfronts/rayfronts-teaser.gif (828 KB, binary file not shown)

0 commit comments
