---
layout: post
title: "MAC-VO won ICRA 2025 Best Conference Paper Award!"
date: 2025-07-04 10:44:07
categories: highlights
author: "Yuheng Qiu"
published: true
sidebar: false
permalink: /highlight-macvo-bestpaper/
image: /img/posts/2025-07-04-macvo-bestpaper/bestconference.jpg
# hero_image: /img/posts/2025-04-10-rayfronts/rayfronts-teaser.gif
---

We are thrilled to announce that our paper **"[MAC-VO](https://mac-vo.github.io/): Metrics-aware Covariance for Learning-based Stereo Visual Odometry"** has won the **ICRA 2025 Best Conference Paper Award** and the **Best Paper Award on Robot Perception**!

This prestigious recognition at the IEEE International Conference on Robotics and Automation (ICRA) 2025 highlights the impact of our work in advancing stereo visual odometry through learning-based approaches. The dual awards underscore both the technical excellence and the practical relevance of our research to the robotics community.

## About MAC-VO

MAC-VO introduces a novel metrics-aware covariance framework that improves the accuracy and reliability of learning-based stereo visual odometry systems. Our approach addresses key challenges in uncertainty quantification and performance optimization for autonomous navigation applications.

- **Project Website**: [https://mac-vo.github.io](https://mac-vo.github.io)
- **Paper**: Available on the project website
- **Code**: Open-source implementation coming soon

## Conference Highlights

During ICRA 2025, we showcased our work through live demonstrations, engaging with researchers and industry professionals from around the world. The positive feedback and discussions further validated the importance of our contributions to the field.

Check out our live demonstrations from the conference:

- [ICRA 2025 Demo 1](https://www.linkedin.com/posts/yuheng-qiu-6bb9151b0_icra2025-activity-7329852781106712577-TGBG?utm_source=share&utm_medium=member_desktop&rcm=ACoAADFB4q8BfsD7FeZi2jCntcJlilWdCWaUqNA)
- [ICRA 2025 Demo 2](https://www.linkedin.com/posts/yuheng-qiu-6bb9151b0_icra2025-activity-7330644969084366848-BTDE?utm_source=share&utm_medium=member_desktop&rcm=ACoAADFB4q8BfsD7FeZi2jCntcJlilWdCWaUqNA)

Congratulations to all the team members who contributed to this achievement! This award represents the culmination of dedicated research and collaboration across our team.

We look forward to continuing our research in visual odometry and to advancing autonomous robotics technology.