8. ORB_SLAM2 basics

Official website: http://webdiis.unizar.es/~raulmur/orbslam/

ASL Dataset: https://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets

Monocular dataset: https://vision.in.tum.de/data/datasets/rgbd-dataset/download

Stereo dataset: http://robotics.ethz.ch/~asl-datasets/ijrr_euroc_mav_dataset/machine_hall/MH_01_easy/

orb_slam2_ros: http://wiki.ros.org/orb_slam2_ros

ORB-SLAM: https://github.com/raulmur/ORB_SLAM

ORB-SLAM2: https://github.com/raulmur/ORB_SLAM2

ORB-SLAM3: https://github.com/UZ-SLAMLab/ORB_SLAM3

8.1. Introduction

The original ORB-SLAM is a monocular-only SLAM system;

ORB-SLAM2 supports three interfaces: monocular, stereo (binocular), and RGB-D;

ORB-SLAM3 adds tightly coupled IMU integration and support for fisheye cameras.

Every stage of ORB-SLAM uses the same ORB features of the image. ORB is a very fast feature extractor that is rotation invariant and achieves scale invariance through an image pyramid. Using a single feature type keeps the stages of the SLAM pipeline consistent: feature extraction and tracking, keyframe selection, three-dimensional reconstruction, and loop detection all operate on the same features. The system is robust to aggressive motion and supports wide-baseline loop detection and relocalization, including fully automatic initialization. Because ORB-SLAM is a feature-based SLAM system, it can compute the camera trajectory in real time and produce a sparse three-dimensional reconstruction of the scene.

On the basis of ORB-SLAM, ORB-SLAM2 contributes:

  1. The first open-source SLAM system for monocular, stereo, and RGB-D cameras, including loop closing, relocalization, and map reuse.

  2. The RGB-D results show that using bundle adjustment (BA) yields more accuracy than approaches based on ICP or on photometric and depth error minimization.

  3. By using both far and near stereo points, as well as monocular observations, the stereo results are more accurate than those of direct stereo SLAM algorithms.

  4. A lightweight localization mode that can effectively reuse the map when mapping is disabled.

ORB-SLAM2 includes the modules common to all SLAM systems: tracking, mapping, relocalization, and loop closing. The figure below shows the ORB-SLAM2 pipeline.

orbslam_all

8.2. Official examples

Open a terminal and change into the ORB_SLAM2 directory.
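
A minimal sketch, assuming the repository was cloned and built in the home directory (adjust the path to your installation):

```bash
# Change into the ORB_SLAM2 directory (path is an assumption)
cd ~/ORB_SLAM2
```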

8.2.1. Monocular test
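
The test command below follows the official README; the last argument is the path to a downloaded TUM sequence, and TUMX.yaml must match the Freiburg sequence number (TUM1/TUM2/TUM3):

```bash
# Run the monocular example on a TUM RGB-D sequence
./Examples/Monocular/mono_tum Vocabulary/ORBvoc.txt \
    Examples/Monocular/TUM1.yaml PATH_TO_SEQUENCE_FOLDER
```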

image-20220225145646590

Blue frames are keyframes, the green frame is the current camera pose, black points are map points already saved, and red points are the map points currently observed by the camera.

After the test completes, the keyframe trajectory is saved to the KeyFrameTrajectory.txt file in the current directory.

8.2.2. Stereo test
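
Following the official README, with MH_01_easy from the EuRoC link above (PATH_TO_SEQUENCE is where you extracted the dataset):

```bash
# Run the stereo example on the EuRoC MH_01_easy sequence
./Examples/Stereo/stereo_euroc Vocabulary/ORBvoc.txt Examples/Stereo/EuRoC.yaml \
    PATH_TO_SEQUENCE/mav0/cam0/data PATH_TO_SEQUENCE/mav0/cam1/data \
    Examples/Stereo/EuRoC_TimeStamps/MH01.txt
```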

image-20220225150328232

As in the monocular test, blue frames are keyframes, the green frame is the current camera pose, black points are saved map points, and red points are the points currently observed by the camera.

After the test completes, the camera trajectory is saved to the CameraTrajectory.txt file in the current directory.

8.2.3. RGB-D test

Associate the depth images with the color images by timestamp and save the pairings to the associations.txt file.
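
The official README uses the associate.py script from the TUM benchmark tools for this step:

```bash
# Pair rgb and depth images by timestamp (associate.py is from the TUM benchmark tools)
python associate.py PATH_TO_SEQUENCE/rgb.txt PATH_TO_SEQUENCE/depth.txt > associations.txt
```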

Return to the ORB_SLAM2 directory.

Test command:
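
Per the official README (TUM1.yaml must again match the Freiburg sequence number):

```bash
# Run the RGB-D example on a TUM sequence with the associations file created above
./Examples/RGB-D/rgbd_tum Vocabulary/ORBvoc.txt Examples/RGB-D/TUM1.yaml \
    PATH_TO_SEQUENCE_FOLDER PATH_TO_ASSOCIATIONS_FILE
```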

image-20220225151139117

8.3. ORB_SLAM2_ROS camera test

The camera's internal (intrinsic) parameters are already configured before the product leaves the factory. To learn how to set them yourself, refer to section [8.3.1. Internal parameter modification]. For the test, the camera can be carried by hand or mounted on the robot as a mobile platform.

If the camera is handheld, there is no need to execute the next command; otherwise, execute it on the robot side.

Note: on the PI5, open another terminal and enter the same Docker container.

image-20240408144126098
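
A sketch of entering an already-running container (the container name or ID depends on your setup):

```bash
# List running containers, then open a shell in the same one
docker ps
docker exec -it <container_id> /bin/bash
```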

Start the camera ORB_SLAM2 test (robot side or virtual machine), as described in the subsections below.

8.3.1. Internal parameter modification

ORB-SLAM needs the camera's internal parameters before it can run, so the camera must be calibrated first. The procedure is described in lesson [02, Astra Camera Calibration].

Start the monocular camera:
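
A minimal sketch, assuming the Astra camera driver from the ros_astra_camera package (the launch file name depends on your camera model):

```bash
# Start the Astra camera driver
roslaunch astra_camera astra.launch
```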

Start the calibration node:
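
A sketch using the standard ROS camera_calibration tool; the checkerboard size, square edge length, and topic names below are assumptions to adapt to your board and camera:

```bash
# Start the calibration GUI (adjust --size, --square, and topics to your setup)
rosrun camera_calibration cameracalibrator.py --size 9x6 --square 0.02 \
    image:=/camera/rgb/image_raw camera:=/camera/rgb
```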

After calibration, move the [calibrationdata.tar.gz] file to the [home] directory.

After extracting it, open [ost.yaml] in the extracted folder and find the camera intrinsic matrix, for example the following content.
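
The ROS calibration tool writes its result to /tmp/calibrationdata.tar.gz; a sketch of moving and extracting it:

```bash
# Move the calibration archive to the home directory and extract it into a folder
mv /tmp/calibrationdata.tar.gz ~/
mkdir -p ~/calibrationdata
tar -xzf ~/calibrationdata.tar.gz -C ~/calibrationdata
```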

Camera intrinsic matrix:
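
The intrinsic matrix in ost.yaml looks like the following; the numeric values here are placeholders, laid out as [fx, 0, cx, 0, fy, cy, 0, 0, 1]:

```bash
# Inspect the intrinsics section of the calibration result
cat ~/calibrationdata/ost.yaml
# camera_matrix:
#   rows: 3
#   cols: 3
#   data: [517.3, 0.0, 318.6, 0.0, 516.5, 255.3, 0.0, 0.0, 1.0]
```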

Copy the values from the data field into the corresponding entries of [astra.yaml] and [astra1.0.yaml] in the [param] folder of the [yahboomcar_slam] package.
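
A sketch of where the values go; the workspace path is an assumption, and in an ORB_SLAM2 settings file the intrinsics are stored under Camera.fx/fy/cx/cy:

```bash
# Edit the ORB_SLAM2 settings file (workspace path is an assumption)
nano ~/catkin_ws/src/yahboomcar_slam/param/astra.yaml
# Copy the values from ost.yaml's data field, data: [fx, 0, cx, 0, fy, cy, 0, 0, 1]:
#   Camera.fx: 517.3
#   Camera.fy: 516.5
#   Camera.cx: 318.6
#   Camera.cy: 255.3
```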

8.3.2. Monocular
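
If the official ORB_SLAM2 ROS examples are used, the monocular node is started as below; it subscribes to /camera/image_raw, so remap it to your camera's color topic (the topic name here is an assumption for the Astra):

```bash
# Start the monocular ORB_SLAM2 node (run from the ORB_SLAM2 directory)
rosrun ORB_SLAM2 Mono Vocabulary/ORBvoc.txt PATH_TO_SETTINGS_FILE \
    /camera/image_raw:=/camera/rgb/image_raw
```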

image-20220225155006242

When the command is executed, the [ORB_SLAM2: Map Viewer] window shows only a green box, and the [ORB_SLAM2: Current Frame] window shows that it is trying to initialize. At this point, slowly move the camera up, down, left, and right so that feature points are found in the image and SLAM can initialize.

image-20220225155453650

As shown in the picture above, after entering [SLAM MODE], the monocular system must continuously acquire every frame of the image to localize the camera. If you select the pure localization mode [Localization Mode] in the upper-left panel, the camera will not be able to find its own position, and you will have to start over to acquire the keyframes.

8.3.3. Monocular AR
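
The official repository ships a MonoAR example node for this demo; a sketch with the same topic-remapping assumption as above:

```bash
# Start the monocular AR demo node
rosrun ORB_SLAM2 MonoAR Vocabulary/ORBvoc.txt PATH_TO_SETTINGS_FILE \
    /camera/image_raw:=/camera/rgb/image_raw
```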

image-20220225161118946

When the command is executed, there is only one window and [slam not initialized] is displayed, meaning SLAM is not yet initialized. Check the box to the left of [Draw Points] in the left column to display the feature points. Then slowly move the camera up, down, left, and right so that feature points are found in the image and SLAM initializes.

image-20220225161856504

As shown in the picture above, the system is now in [SLAM ON] mode. Click [Insert Cube] on the screen to insert an AR cube where the system has detected a plane. The AR cube stays at a fixed position in the scene, not at a fixed position on the camera image. Click [Clear All] to remove the cubes.

image-20220225161819953

8.3.4. RGB-D
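
A sketch using the official RGBD example node; it subscribes to /camera/rgb/image_raw and /camera/depth_registered/image_raw, so remap if your driver publishes different topics:

```bash
# Start the RGB-D ORB_SLAM2 node
rosrun ORB_SLAM2 RGBD Vocabulary/ORBvoc.txt PATH_TO_SETTINGS_FILE
```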

image-20220225163133305

Unlike the monocular case, RGB-D does not need to continuously acquire every frame of the image. If you select the pure localization mode [Localization Mode] in the upper-left panel, the system can localize against the keyframes just acquired.