Jetson-inference environment construction

The Raspberry Pi motherboard series does not currently support this tutorial.

I. Install the jetson-inference environment

1. Instructions before use

This tutorial describes how to build the Jetson Nano image independently. If you use the Yahboom version of the image directly, you can skip this tutorial.

2. The environment versions used in this tutorial are shown in the figure:

image-2023040300001

If you do not want to build everything yourself, you can use the jetson-inference archive we provide: transfer it to the Jetson Nano, unzip it, and start directly from "Install model".

3. Start building

3.1 Download required dependencies
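The upstream jetson-inference build guide lists a small set of build dependencies; a sketch of installing them (package names taken from that guide, not from this document):

```shell
# Update the package index, then install the build dependencies
# listed in the upstream jetson-inference build documentation.
sudo apt-get update
sudo apt-get install -y git cmake libpython3-dev python3-numpy
```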

3.3 Download the relevant Python module

Find the file torch-1.8.0-cp36-cp36m-linux_aarch64.whl in the attachment provided with our environment build and transfer it to the Jetson Nano.
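A sketch of installing the wheel once it is on the Jetson Nano (pip3 and the OpenBLAS runtime are assumed prerequisites; the wheel filename is the one named above):

```shell
# Install pip and the OpenBLAS runtime that PyTorch wheels
# for Jetson typically link against (assumed prerequisite).
sudo apt-get install -y python3-pip libopenblas-base
# Install the PyTorch wheel transferred from the attachment.
pip3 install torch-1.8.0-cp36-cp36m-linux_aarch64.whl
```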

3.4 Make changes to files

Edit jetson-inference/CMakePreBuild.sh and comment out the line ./download-models.sh (add a # in front of it), as shown in the figure: image-2023040300002
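If you prefer to comment the line out from a terminal instead of an editor, a one-liner sketch (assuming the upstream spelling CMakePreBuild.sh and that the line starts with ./download-models.sh):

```shell
# Prefix the download-models.sh invocation with '#' to disable it.
sed -i 's|^\./download-models.sh|#./download-models.sh|' jetson-inference/CMakePreBuild.sh
```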

4. Install model

Method 1: You can perform the following steps

After making a selection, the models are downloaded automatically to the data/networks directory.
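Method 1 presumably refers to the interactive model downloader shipped with jetson-inference; a sketch of invoking it (script path per the upstream repository):

```shell
# Launch the interactive model-download tool; tick the models
# you want and they are fetched into data/networks.
cd jetson-inference/tools
./download-models.sh
```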

Method 2: Find the packages required by jetson-inference in the attachments we provide for environment construction, transfer the compressed packages to jetson-inference/data/networks on the Jetson Nano, and decompress them with the commands below.

Note:

  1. To decompress multiple .gz files, use this command: for gz in *.gz; do gunzip "$gz"; done

  2. To decompress multiple .tar.gz files, use the following command: for tar in *.tar.gz; do tar xvf "$tar"; done

5. Start compiling

If an error is reported during the process, the source code download is incomplete. Please go back to step 3.2 and execute the command git submodule update --init.
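The compile step above follows the standard CMake out-of-tree build described in the upstream jetson-inference guide; a sketch (directory names per that guide):

```shell
# Configure, build, and install jetson-inference from source.
cd jetson-inference
mkdir -p build && cd build
cmake ../
make -j$(nproc)
sudo make install
sudo ldconfig   # refresh the shared-library cache
```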


6. Verify whether the installation was successful

image-2023040300003 Navigate to the corresponding directory and view output.jpg, as shown below; the recognition result is displayed at the top of the picture. image-2023040300004
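A typical verification run, sketched from the upstream jetson-inference examples (the binary name, sample image, and paths are assumptions from that repository, not from this document):

```shell
# Run the imagenet console example on a bundled test image;
# the classified result is written to output.jpg.
cd jetson-inference/build/aarch64/bin
./imagenet-console images/orange_0.jpg output.jpg
```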


II. Install the MediaPipe environment

1. Preparing files

Transfer the two files, bazel and mediapipe-0.8-cp36-cp36m-linux_aarch64.whl, from the environment-setup attachment to the Jetson Nano.

2. Install bazel

Open a terminal and run the following commands:
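A sketch of installing the prebuilt bazel binary from the attachment (the destination path is an assumption; any directory on PATH works):

```shell
# Assumes the prebuilt bazel binary from the attachment is in
# the current directory.
chmod +x bazel
sudo mv bazel /usr/local/bin/bazel
# Print the version number to confirm the installation.
bazel version
```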

Check that bazel installed correctly: if the version number prints, the installation is complete.

image-2023040400002

3. Install mediapipe

Open a terminal and run the following command:
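A sketch of installing the MediaPipe wheel (the filename is the one named in "Preparing files" above; the wheel is assumed to be in the current directory):

```shell
# Install the MediaPipe wheel transferred from the attachment.
pip3 install mediapipe-0.8-cp36-cp36m-linux_aarch64.whl
```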

Verify that the installation succeeded:
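A minimal import check can stand in for the verification step; if the import succeeds and a version string prints, MediaPipe is installed:

```python
# Import MediaPipe and print its version to confirm the install.
import mediapipe as mp

print(mp.__version__)
```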

image-2023040400003


Appendix

Other reference tutorial URLs:

1. https://blog.csdn.net/aal779/article/details/122055432

2. https://github.com/dusty-nv/jetson-inference/blob/master/docs/building-repo-2.md

3. https://blog.csdn.net/weixin_43659725/article/details/120211312