SARL*: Deep RL based human-aware navigation for mobile robot in crowded indoor environments implemented in ROS.


sarl_star

ROS implementation of the paper SARL*: Deep Reinforcement Learning based Human-Aware Navigation for Mobile Robot in Indoor Environments, presented at ROBIO 2019. This mobile robot navigation framework is implemented on a TurtleBot 2 platform with a lidar sensor (Hokuyo or RPLIDAR), integrating SLAM, path planning, pedestrian detection and deep reinforcement learning algorithms.

A video demonstration can be found on YouTube or bilibili.

Introduction

We present an improved version of the Socially Attentive Reinforcement Learning (SARL) algorithm, named SARL*, to achieve human-aware navigation in indoor environments. Deep RL has recently achieved great success in generating human-aware navigation policies. However, real-world implementations face two limitations: the learned navigation policies only handle goal distances similar to those seen during training, and the simplified training environment neglects obstacles other than humans. In this work, we improve the SARL algorithm by introducing a dynamic local goal setting mechanism and a map-based safe action space to tackle both problems.
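
The actual local goal selection is implemented inside the modified DWA planner (see "Code Structure" below). As a minimal sketch of the idea only, assuming the global plan is available as a list of (x, y) waypoints:

```python
import math

def select_local_goal(global_path, robot_pos, d=5.0):
    """Walk along the global plan and return the last waypoint that is
    still within distance d of the robot; if the whole remaining path
    is closer than d, the final goal itself is returned."""
    local_goal = global_path[-1]
    for wp in global_path:
        if math.hypot(wp[0] - robot_pos[0], wp[1] - robot_pos[1]) > d:
            break
        local_goal = wp
    return local_goal

# The local goal (rather than the possibly distant final goal) is fed
# to the learned policy, keeping the goal distance within the range
# seen during training.
path = [(0.0, 0.0), (2.0, 0.0), (4.0, 0.0), (6.0, 0.0), (8.0, 0.0)]
print(select_local_goal(path, (0.0, 0.0)))  # (4.0, 0.0)
```

As the robot advances and the global plan is replanned, the local goal slides along the path toward the final goal.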

Method Overview

For more details, please refer to the paper.

System Setup

We use the laser scanner Hokuyo UTM-30LX or RPLIDAR-A2 as the sensor and TurtleBot 2 as the robot platform.

Some Experiments

For more details, please refer to the paper.

Code Structure

  • Python-RVO2: Crowd simulator based on the Optimal Reciprocal Collision Avoidance (ORCA) algorithm.
  • laser_filters: ROS package to filter out unwanted laser scans (optional).
  • navigation: Modified ROS navigation stack providing AMCL localization, costmaps and basic path planners. Note that our dynamic local goal setting algorithm is implemented in navigation/dwa_local_planner/src/dwa_planner_ros.cpp. Therefore, if you have already installed the original ROS navigation stack, we suggest uninstalling it and building the stack in this repository (following the steps in "Build & Install" below) to make sure our modifications take effect.
  • people: ROS stack to detect and track humans using sensor information.
  • rplidar_ros: ROS package to use ROS with the RPLIDAR sensor.
  • sarl_star_ros: Core ROS package to run the SARL* navigation algorithm.
  • turtlebot_apps: ROS stack to use ROS with the TurtleBot.

Build & Install

Our code has been tested on Ubuntu 16.04 with Python 2.7.

  1. Install ROS kinetic.
  2. Create and build a catkin workspace and download the codes into src/:
mkdir -p ~/sarl_ws/src
cd ~/sarl_ws/
catkin_make
source devel/setup.bash
cd src
git clone https://github.com/LeeKeyu/sarl_star.git
  3. Install other dependencies:
sudo apt-get install libbullet-dev
sudo apt-get install libsdl-image1.2-dev
sudo apt-get install libsdl-dev
sudo apt-get install ros-kinetic-bfl
sudo apt-get install ros-kinetic-tf2-sensor-msgs
sudo apt-get install ros-kinetic-turtlebot ros-kinetic-turtlebot-apps ros-kinetic-turtlebot-interactions ros-kinetic-turtlebot-simulator ros-kinetic-kobuki-ftdi ros-kinetic-ar-track-alvar-msgs
pip install empy
pip install configparser
  4. Install Python-RVO2:
cd sarl_star/Python-RVO2/
pip install -r requirements.txt
python setup.py build
python setup.py install
  5. Install CrowdNav (note that the CrowdNav code in this repository is modified from the original SARL implementation):
cd sarl_star/sarl_star_ros/CrowdNav/
pip install -e .
  6. Build the catkin workspace:
cd ~/sarl_ws/
catkin_make
source devel/setup.bash

Start the Navigation

  1. Before starting the navigation, make sure your PC is connected with Turtlebot2 and the lidar sensor (either Hokuyo or RPlidar).
  2. Bring up the TurtleBot:
roslaunch turtlebot_bringup minimal.launch
  3. Build a map of your environment using the gmapping package:

If you're using Hokuyo, run

roslaunch turtlebot_navigation hokuyo_gmapping_movebase.launch 

If you're using RPlidar, run

roslaunch turtlebot_navigation rplidar_gmapping_movebase.launch 

Then push or tele-operate the robot to explore the environment and build a map. You will be able to view the mapping process in real time in RViz. To save the map, open a new terminal and run:

mkdir -p ~/sarl_ws/src/sarl_star/sarl_star_ros/map
rosrun map_server map_saver -f ~/sarl_ws/src/sarl_star/sarl_star_ros/map/new_map
  4. Start navigation using the SARL* policy:

The launch file sarl_star_navigation.launch starts the following components:

  • Laser type: Specifies the laser type, "hokuyo" or "rplidar".
  • Laser driver: Starts the laser.
  • Laser filter (optional): Filters the laser scans. In our implementation, scans within 0.3 m of the robot or behind it are filtered out to avoid misidentifying the robot's own structure as human legs.
  • Map server: Provides the map (change the map name to your own map).
  • AMCL: Tracks the pose of the robot in the map.
  • People detector: Detects and tracks humans using laser scan information.
  • Move base: Provides the global planner and costmaps during navigation.
  • SARL* planner: Runs the SARL* algorithm and sends motion commands to the robot.
  • RViz: Visualizes the navigation.
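
The actual filtering is done by the laser_filters package configured in the launch file. Purely as an illustration, the rule described above (drop returns closer than 0.3 m or behind the robot) could be sketched as:

```python
import math

def filter_scan(ranges, angle_min, angle_increment, min_range=0.3):
    """Mark returns closer than min_range, or outside the robot's
    front half-plane (|angle| > pi/2), as invalid (inf)."""
    filtered = []
    for k, r in enumerate(ranges):
        angle = angle_min + k * angle_increment
        if r < min_range or abs(angle) > math.pi / 2:
            filtered.append(float('inf'))
        else:
            filtered.append(r)
    return filtered

print(filter_scan([0.1, 1.0, 2.0], -math.pi, math.pi / 2))
# [inf, 1.0, 2.0]
```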

To start the SARL* navigation, run

roslaunch sarl_star_ros sarl_star_navigation.launch

You will see an RViz window showing the robot's navigation in real time. Set a goal pose in the map (following the instructions in the ROS RViz tutorials), and the robot will navigate using the SARL* policy, avoiding humans (blue balls) and static obstacles in the map while dynamically updating its local goal (red cubes) according to the global plan. The action commands sent to the robot are shown in RViz as green arrows.

To navigate using the original SARL policy, run:

roslaunch sarl_star_ros sarl_original_navigation.launch

Note that we have also added the map-based safe action space to the original SARL policy to ensure the safety of the robot in real indoor environments.
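
Conceptually, the map-based safe action space forward-simulates each candidate action for one step and discards it if the predicted position falls on an occupied cell of the costmap. A simplified sketch (the grid indexing and occupancy threshold below are illustrative, not the planner's actual code):

```python
def is_free(x, y, grid, resolution=0.05, occupied_threshold=50):
    """True if world position (x, y) falls on a known free cell."""
    i, j = int(y / resolution), int(x / resolution)
    if not (0 <= i < len(grid) and 0 <= j < len(grid[0])):
        return False  # outside the map: treat as unsafe
    return grid[i][j] < occupied_threshold

def safe_action_space(actions, pos, dt, grid):
    """Keep only velocity actions whose predicted next position is free."""
    return [(vx, vy) for (vx, vy) in actions
            if is_free(pos[0] + vx * dt, pos[1] + vy * dt, grid)]

# 1 m x 1 m map at 0.05 m resolution, one occupied cell near (0.5, 0.5)
grid = [[0] * 20 for _ in range(20)]
grid[10][10] = 100
print(safe_action_space([(0.4, 0.4), (0.1, 0.0)], (0.1, 0.1), 1.0, grid))
# [(0.1, 0.0)]
```

The RL policy then chooses only among the surviving actions, so collisions with static obstacles missing from the training simulation are ruled out by construction.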

  5. Customizing the parameters of the SARL* algorithm:

The RL parameters can be set in the config files in sarl_star/sarl_star_ros/CrowdNav/crowd_nav/configs/.
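
These are plain INI files, so they can also be inspected or tweaked programmatically. The section and key names below are illustrative only; the real files under crowd_nav/configs/ define the authoritative ones (on the repo's Python 2.7 setup the module is ConfigParser rather than configparser):

```python
import configparser  # 'ConfigParser' on Python 2.7

# Illustrative snippet in the CrowdNav INI style.
sample = u"""
[reward]
success_reward = 1
discomfort_dist = 0.2
"""

config = configparser.RawConfigParser()
config.read_string(sample)
print(config.getfloat('reward', 'discomfort_dist'))  # 0.2
```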

The ROS navigation parameters are generally defined in .xml and .yaml files (e.g., sarl_star/turtlebot_apps/turtlebot_navigation/param/config/odom/local_costmap_params.yaml). To find the exact config files and change parameter values, please look at the launch files being used.

The distance d in the local goal setting method is 5 m by default. It can be customized in sarl_star/navigation/base_local_planner/src/goal_functions.cpp by setting the value of "dist_threshold" to d/2 for your desired d (e.g., for d = 4 m, set dist_threshold = 2.0).

Citation

If you find our work useful in your research, please consider citing our paper:

@INPROCEEDINGS{8961764,
author={Li, Keyu and Xu, Yangxin and Wang, Jiankun and Meng, Max Q.-H.},
booktitle={2019 IEEE International Conference on Robotics and Biomimetics (ROBIO)},
title={SARL∗: Deep Reinforcement Learning based Human-Aware Navigation for Mobile Robot in Indoor Environments},
year={2019},
pages={688-694},
doi={10.1109/ROBIO49542.2019.8961764}}

Questions

If you have any questions, please contact "[email protected]".
