Register Any Point: Scaling 3D Point Cloud Registration by Flow Matching

Yue Pan, Tao Sun, Liyuan Zhu, Lucas Nunes, Iro Armeni, Jens Behley, Cyrill Stachniss

University of Bonn · Stanford University



TODO List

  • Release the inference code and RAP model v1.0.
  • Release RAP model v1.1.
  • Release the training code.
  • Release the training data curation code and example training data.
  • Add evaluation code on public datasets.
  • Release RAP model v1.5 with additional feature backbones, support for metric scale differences, and 4D registration.

Abstract

Point cloud registration aligns multiple unposed point clouds into a common frame and is a core step for 3D reconstruction and robot localization. In this work, we cast registration as conditional generation: a learned continuous, point-wise velocity field transports noisy points to a registered scene, from which the pose of each view is recovered. Unlike previous methods, which match correspondences to estimate the transformation between a pair of point clouds and then optimize the pairwise transformations for multi-view registration, our model directly generates the registered point cloud. With a lightweight local feature extractor and test-time rigidity enforcement, our approach achieves state-of-the-art results on pairwise and multi-view registration benchmarks, particularly under low overlap, and generalizes across scales and sensor modalities. It further supports downstream tasks including relocalization, multi-robot SLAM, and multi-session map merging.
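The two-step pipeline the abstract describes — transport points along a learned velocity field to a registered scene, then recover each view's rigid pose from the generated cloud — can be sketched as below. This is a toy illustration, not the repository's API: the velocity field here is a closed-form rectified-flow-style stand-in (the real model predicts it from learned point features), and `kabsch` is a generic least-squares pose solver.

```python
import numpy as np

def integrate_flow(points, velocity_fn, n_steps=50):
    """Euler-integrate points along a point-wise velocity field over t in [0, 1)."""
    dt = 1.0 / n_steps
    x = points.copy()
    for k in range(n_steps):
        x = x + dt * velocity_fn(x, k * dt)
    return x

def kabsch(src, dst):
    """Least-squares rigid transform (R, t) aligning src to dst (Kabsch/SVD)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, mu_d - R @ mu_s

# Toy scene: one unposed view of 50 random 3D points.
rng = np.random.default_rng(0)
view = rng.normal(size=(50, 3))

# Ground-truth placement of this view in the common frame.
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -0.2, 1.0])
registered_gt = view @ R_true.T + t_true

# Stand-in velocity field that carries each point to its registered position
# by t = 1 (assumption: the real model learns this field; no closed form exists).
velocity = lambda x, t: (registered_gt - x) / (1.0 - t)

generated = integrate_flow(view, velocity)
R_est, t_est = kabsch(view, generated)  # recover the view's rigid pose
```

Because the generated cloud is (up to numerical error) a rigid transform of the input view, the Kabsch step recovers `R_true` and `t_true` exactly; in practice the generated points are only approximately rigid, which is where the paper's test-time rigidity enforcement comes in.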

Installation

Clone the repo:

```shell
git clone https://github.com/PRBonn/RAP.git
cd RAP
```

Set up the conda environment:

```shell
conda create -n py310-rap python=3.10 -y
conda activate py310-rap
```

Install the dependencies:

```shell
bash ./scripts/install.sh
```

Download the model and example data:

```shell
bash ./scripts/download_weights_and_demo_data.sh
```

Run RAP

Try the demo:

```shell
python app.py
```

Run batch inference after adapting the config files and the script test_script_example.sh:

```shell
bash ./scripts/test_script_example.sh
```

Citation


If you use RAP for any academic work, please cite:

```bibtex
@article{pan2025arxiv,
  title   = {{Register Any Point: Scaling 3D Point Cloud Registration by Flow Matching}},
  author  = {Pan, Yue and Sun, Tao and Zhu, Liyuan and Nunes, Lucas and Armeni, Iro and Behley, Jens and Stachniss, Cyrill},
  journal = {arXiv preprint},
  volume  = {arXiv:2512.01850},
  year    = {2025}
}
```

Contact

If you have any questions, please contact:

Acknowledgement


RAP is built on top of Rectified Point Flow (RPF), and we thank the authors of the following works:

License

MIT license