SLAM with D435i — based on Intel's tested ROS environment

2020-04-27 iibull
Category: Other platforms

https://github.com/IntelRealSense/realsense-ros/wiki/SLAM-with-D435i

SLAM with the RealSense™ D435i camera on ROS:

The RealSense™ D435i is equipped with a built-in IMU. Combined with some powerful open-source tools, it is possible to achieve the tasks of mapping and localization.

There are 4 main nodes to the process: realsense2_camera, imu_filter_madgwick, rtabmap_ros, and robot_localization.

Installation:

The first thing to do is to install the components:
realsense2_camera: Follow the installation guide in: https://github.com/intel-ros/realsense.
imu_filter_madgwick: sudo apt-get install ros-kinetic-imu-filter-madgwick
rtabmap_ros: sudo apt-get install ros-kinetic-rtabmap-ros
robot_localization: sudo apt-get install ros-kinetic-robot-localization
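The three apt packages above can also be installed in a single step (package names are for ROS Kinetic, as in this guide; substitute your distro name if different):

```shell
# Install the IMU filter, RTAB-Map, and robot_localization packages (ROS Kinetic).
sudo apt-get install -y ros-kinetic-imu-filter-madgwick \
                        ros-kinetic-rtabmap-ros \
                        ros-kinetic-robot-localization
```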

Running:

Hold the camera steady with a clear view and run the following command:

roslaunch realsense2_camera opensource_tracking.launch

Wait a little while for the system to stabilize.
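Once the launch file is up, you can sanity-check that the camera topics are actually publishing (the topic names below are the same ones used later in this guide):

```shell
# List the camera topics and check that color frames are streaming.
rostopic list | grep camera
rostopic hz /camera/color/image_raw
```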

Personalize RViz:

The point cloud and a bunch of arrows, or axis marks, will appear on screen. These axes represent the different coordinate systems involved. For clarity, you can remove most of them.

From the Displays Panel:
TF -> Frames: leave only map and camera_link checked. The first represents the world coordinate system and the second, the camera.

You may want to watch the live video as well:
From the Displays panel:
Image -> Image Topic: set to /camera/color/image_raw
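If you prefer a standalone window over RViz, the standard image_view node can also show the live stream (assuming the same color topic as above):

```shell
# Open the color stream in its own window, outside RViz.
rosrun image_view image_view image:=/camera/color/image_raw
```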

[Image: startup-screen Displays panel]

Start moving around and watch the "camera_link" axis mark move accordingly, relative to the "map" axes.
Notice:
The built-in IMU can only keep track for a very short time. Moving or turning too quickly will break the sequence of successful point-cloud matches and the system will lose track. The system may recover immediately if you stop moving, but if not, the longer the time since the break, the farther it will drift from the correct position; the odds of recovery get very slim, very quickly. The parameters set in the launch file are most likely not ideal, but they are a good starting point for calibration.

For saving a rosbag file you may use the following command:

rosbag record -O my_bagfile_1.bag /camera/aligned_depth_to_color/camera_info /camera/aligned_depth_to_color/image_raw /camera/color/camera_info /camera/color/image_raw /camera/imu /camera/imu_info /tf_static
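After recording, you can check that the bag actually contains the expected topics and message counts:

```shell
# Summarize the recorded bag: duration, topics, and message counts.
rosbag info my_bagfile_1.bag
```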

To replay a saved rosbag file:

roscore >/dev/null 2>&1 &
rosparam set use_sim_time true
rosbag play my_bagfile_1.bag --clock
roslaunch realsense2_camera opensource_tracking.launch offline:=true

The process looks like this: [video]

and the resulting point cloud: [video]

While the system is up, you can create a 2D map using:

rosrun map_server map_saver map:=/rtabmap/proj_map -f my_map_1
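map_saver writes the map as an image plus a YAML metadata file, so with the -f prefix above you should end up with two files:

```shell
# Verify the saved 2D map files (occupancy image + metadata).
ls -l my_map_1.pgm my_map_1.yaml
```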

and a resulting map would look something like this:
[Image: final map]

You can save the point cloud using:

rosrun pcl_ros pointcloud_to_pcd input:=/rtabmap/cloud_map

Notice: The app will keep on saving point clouds.
Use ctrl-C to stop it after it reports saving the 1st file.
The app prints to screen the file names it saves. For example: 1543906154413083.pcd
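The file name appears to be the capture time in microseconds since the Unix epoch — this is a guess based on the example name above, not documented behavior. If so, stripping the last six digits gives a readable date:

```shell
# Interpret the example .pcd file name as a microsecond timestamp.
stamp=1543906154413083
date -u -d "@$((stamp / 1000000))"   # resolves to a date in December 2018
```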

You can watch the saved point cloud later using:

pcl_viewer 1543906154413083.pcd

[Install using: sudo apt-get install pcl-tools]

