
SNI-SLAM: Semantic Neural Implicit SLAM

Siting Zhu*, Guangming Wang*, Hermann Blum, Jiuming Liu, Liang Song, Marc Pollefeys, Hesheng Wang

CVPR 2024 [Paper] [Suppl]

Demo

[demo animation]

Installation

First, make sure that you have all dependencies in place. The simplest way to do so is to use Anaconda.

You can create an anaconda environment called sni. On Linux, you need to install libopenexr-dev before creating the environment.

sudo apt-get install libopenexr-dev
conda env create -f environment.yaml
conda activate sni
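
As a quick sanity check that the environment works, you can confirm that PyTorch sees your GPU. This assumes PyTorch with CUDA support is among the dependencies in environment.yaml, which is typical for neural implicit SLAM systems but not verified here:

# Hypothetical check: the env activates and CUDA is visible to PyTorch.
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"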

Run

Replica

  1. Download the data with semantic annotations from Google Drive and save it into the ./data/replica folder.
  2. Download the pretrained segmentation network from Google Drive and save it into the ./seg folder. Then you can run SNI-SLAM:
python -W ignore run.py configs/Replica/room1.yaml

The mesh for evaluation is saved as $OUTPUT_FOLDER/mesh/final_mesh_eval_rec_culled.ply
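
To process every Replica scene back to back, a short shell loop works. The scene names below follow the standard Replica split (room0-2, office0-4); this sketch assumes the repository ships a matching config for each, so verify the names under configs/Replica/ first:

# Run all Replica scenes in sequence (config names are assumed, not confirmed).
for scene in room0 room1 room2 office0 office1 office2 office3 office4; do
  python -W ignore run.py configs/Replica/${scene}.yaml
done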

Evaluation

Average Trajectory Error

To evaluate the average trajectory error, run the command below with the corresponding config file:

# An example for room1 of Replica
python src/tools/eval_ate.py configs/Replica/room1.yaml
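
The same pattern extends to evaluating all scenes in one go while keeping a log of the results; as above, the per-scene config names are assumed to mirror the standard Replica split:

# Evaluate ATE for each scene and append the output to a log (config names assumed).
for scene in room0 room1 room2 office0 office1 office2 office3 office4; do
  python src/tools/eval_ate.py configs/Replica/${scene}.yaml | tee -a ate_results.log
done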

Visualizing SNI-SLAM Results

For visualizing the results, we recommend setting mesh_freq: 40 in configs/SNI-SLAM.yaml and running SNI-SLAM from scratch.

After SNI-SLAM is trained, run the following command for visualization.

python visualizer.py configs/Replica/room1.yaml --output output/Replica/room1 --top_view --save_rendering

The visualization result will be saved at output/Replica/room1/vis.mp4. The green trajectory is the ground truth, and the red one is the trajectory estimated by SNI-SLAM.

Visualizer command-line arguments

  • --output $OUTPUT_FOLDER output folder (overwrites the output folder in the config file)
  • --top_view set the camera to a top-down view; otherwise, the camera follows the first frame of the sequence
  • --save_rendering save the rendered video as vis.mp4 in the output folder
  • --no_gt_traj do not show the ground truth trajectory
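
These flags can be combined freely. For instance, to render from the first-frame viewpoint, hide the ground truth trajectory, and still save the video, you would run (all flags taken from the list above):

python visualizer.py configs/Replica/room1.yaml --output output/Replica/room1 --no_gt_traj --save_rendering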

Citing

If you find our code or paper useful, please consider citing:

@inproceedings{zhu2024sni,
  title={{SNI-SLAM}: Semantic Neural Implicit SLAM},
  author={Zhu, Siting and Wang, Guangming and Blum, Hermann and Liu, Jiuming and Song, Liang and Pollefeys, Marc and Wang, Hesheng},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={21167--21177},
  year={2024}
}
