5. KCF object tracking

5.1. Introduction

Website: https://learnopencv.com/object-tracking-using-opencv-cpp-python/#opencv-tracking-api

| Algorithm | Speed | Accuracy | Description |
| --- | --- | --- | --- |
| BOOSTING | Slow | Low | Based on the same machine-learning algorithm as Haar cascades (AdaBoost), but it is more than ten years old; a veteran algorithm. |
| MIL | Slow | Low | More accurate than BOOSTING, but its failure rate is higher. |
| KCF | Fast | High | Faster than BOOSTING and MIL, but not effective under occlusion. |
| TLD | Medium | Medium | Prone to many errors (false detections). |
| MEDIANFLOW | Medium+ | Medium | The model fails on fast-jumping or fast-moving objects. |
| GOTURN | Medium | Medium | A deep-learning-based tracker; requires additional model files to run. |
| MOSSE | Fastest | High | Really fast, though not as accurate as CSRT and KCF. Choose it if speed matters most. |
| CSRT | Fast− | Higher | Slightly more accurate than KCF, but not as fast as KCF. |
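In OpenCV's Python bindings, each of these trackers is created through a per-algorithm constructor. Where the constructors live (`cv2` vs `cv2.legacy`) depends on the OpenCV build, so the sketch below resolves them lazily rather than assuming one layout. This is a hedged helper for illustration, not this project's shipped code:

```python
# Map each algorithm name from the table above to the matching OpenCV
# constructor name. In OpenCV >= 4.5 most of these live under
# cv2.legacy (e.g. cv2.legacy.TrackerKCF_create); in older builds
# they sit directly on the cv2 module.
TRACKER_FACTORIES = {
    'BOOSTING':   'TrackerBoosting_create',
    'MIL':        'TrackerMIL_create',
    'KCF':        'TrackerKCF_create',
    'TLD':        'TrackerTLD_create',
    'MEDIANFLOW': 'TrackerMedianFlow_create',
    'GOTURN':     'TrackerGOTURN_create',   # needs extra model files
    'MOSSE':      'TrackerMOSSE_create',
    'CSRT':       'TrackerCSRT_create',
}

def create_tracker(name):
    """Return a tracker instance for `name`, trying cv2.legacy first."""
    import cv2  # imported lazily so the mapping works without OpenCV
    factory_name = TRACKER_FACTORIES[name]
    for module in (getattr(cv2, 'legacy', None), cv2):
        factory = getattr(module, factory_name, None) if module else None
        if factory is not None:
            return factory()
    raise RuntimeError(f'{name} tracker not available in this OpenCV build')
```

Usage would be `tracker = create_tracker('KCF')`, followed by `tracker.init(frame, bbox)` and `tracker.update(frame)` per video frame.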

5.2. Operation steps

Note: the [R2] button on the remote-control handle has the [pause/start] function for all gameplay modes.

5.2.1. Start

Jetson motherboard/Raspberry Pi 4B

Start the underlying driver control (robot side)

Raspberry Pi 5

Before running, please confirm that the main program has been permanently closed

Enter docker

Note: if a terminal already auto-starts docker, or a docker terminal is already open, run the command directly in that docker terminal; there is no need to start docker manually.

Start docker manually

Start the underlying driver control (robot side)

Method 1

Jetson motherboard/Raspberry Pi 4B

Start the monocular camera (robot side)

Start monocular target tracking control (virtual machine)

Raspberry Pi 5

Enter the same docker from multiple terminals

Keep the program in the previous docker terminal running and open a new terminal

Enter the following command

Enter the same docker container, replacing 18870bc3dc00 below with the container ID shown on your terminal.
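Attaching a second shell to the already-running container is a `docker exec` invocation. Because the container ID differs on every machine, the sketch below builds the command from a variable and prints it; substitute your own ID from `docker ps` and run the printed command (18870bc3dc00 is the example ID used throughout this guide):

```shell
# List running containers first and note the CONTAINER ID column:
#   docker ps
CONTAINER_ID=18870bc3dc00   # replace with the ID shown by `docker ps`
CMD="docker exec -it ${CONTAINER_ID} /bin/bash"
echo "$CMD"                 # the command to run in the second terminal
```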

Start the monocular camera (robot side)

Start monocular target tracking control (virtual machine)

Method 2

Note: press the [q] key to exit.

Jetson motherboard/Raspberry Pi 4B

Start monocular target tracking control (robot side)

Raspberry Pi 5

Enter the same docker from multiple terminals

Keep the program in the previous docker terminal running and open a new terminal

Enter the following command

Enter the same docker container, replacing 18870bc3dc00 below with the container ID shown on your terminal.

Start monocular target tracking control (robot side)

This method can only be started on the main controller that the camera is connected to.

Set parameters as needed, or modify the launch file directly so that no parameters need to be passed at startup.

5.2.2. Identification

After starting, the program enters selection mode. Drag the mouse to select the target region, as shown in the figure below; release the mouse button to start recognition.

(figure: image-20210913174837340 — selecting the target region with the mouse)

Keyboard key control:

[r]: Selection mode; drag the mouse to select the region containing the target, as shown in the figure above. If the robotic arm blocks the camera, press [r] to reset the arm.

[f]: Switch algorithm; cycles through ['BOOSTING', 'MIL', 'KCF', 'TLD', 'MEDIANFLOW', 'MOSSE', 'CSRT'].

[q]: Exit the program.

[Spacebar]: Target tracking; move the target slowly while it is being followed. Moving it too fast will lose the target.
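The key bindings above can be sketched as a small dispatch function. This is a hypothetical helper for illustration only; the real program additionally calls into OpenCV for region selection and tracking:

```python
# Algorithm cycle order matches the [f] key list above.
ALGORITHMS = ['BOOSTING', 'MIL', 'KCF', 'TLD', 'MEDIANFLOW', 'MOSSE', 'CSRT']

def handle_key(key, state):
    """Update tracker state for one key press.

    `state` is a dict with 'mode' ('select' or 'track') and
    'algo_index' (an index into ALGORITHMS). Returns the new state,
    or None when the program should exit.
    """
    state = dict(state)          # copy so the caller's state is untouched
    if key == 'r':               # re-enter selection mode / reset the arm
        state['mode'] = 'select'
    elif key == 'f':             # cycle to the next algorithm
        state['algo_index'] = (state['algo_index'] + 1) % len(ALGORITHMS)
    elif key == ' ':             # spacebar: start target tracking
        state['mode'] = 'track'
    elif key == 'q':             # quit the program
        return None
    return state

state = {'mode': 'select', 'algo_index': ALGORITHMS.index('KCF')}
state = handle_key('f', state)
print(ALGORITHMS[state['algo_index']])   # prints TLD (next after KCF)
```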

5.2.3. Target tracking

After the target is recognized correctly, press the [spacebar] on the keyboard to run the object-following program.
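To illustrate what a following program has to compute, here is a hedged sketch of the proportional-control idea: turn the tracked box's offset from the image center into a steering command. The gain value and the command convention are assumptions for illustration, not the shipped code:

```python
def follow_error(bbox, frame_w, frame_h):
    """Normalized offset of the bbox center from the image center.

    `bbox` is (x, y, w, h), the format OpenCV trackers return from update().
    Returns (dx, dy) in [-1, 1]; (0, 0) means the target is centered.
    """
    x, y, w, h = bbox
    cx, cy = x + w / 2.0, y + h / 2.0
    dx = (cx - frame_w / 2.0) / (frame_w / 2.0)
    dy = (cy - frame_h / 2.0) / (frame_h / 2.0)
    return dx, dy

# Example: a 100x80 box whose center sits to the right of a 640x480 frame's center.
dx, dy = follow_error((400, 200, 100, 80), 640, 480)
KP = 0.5                 # assumed proportional gain
yaw_cmd = -KP * dx       # e.g. feed into the gimbal/chassis yaw command
print(round(dx, 3), round(dy, 3))   # prints: 0.406 0.0
```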

Jetson motherboard/Raspberry Pi

Raspberry Pi 5

Enter the same docker from multiple terminals

Keep the program in the previous docker terminal running and open a new terminal

Enter the following command

Enter the same docker container, replacing 18870bc3dc00 below with the container ID shown on your terminal.

(figure: image-20210910161042829)

Subscribes to the image topic; publishes topics for the gimbal servo, robotic arm, and chassis drive.

(figure: image-20210910161428484)

Publishes the gimbal servo topic, plus the chassis-drive and robotic-arm topics.