Model Usage Instructions

Contents: Jetson Orin series (Image description, Parameter description) · Jetson Nano (Image description, Parameter description) · Raspberry Pi 5 (Image description, Parameter description) · Model statistics · References

This tutorial introduces the models that can run on the different boards, image environments, and memory configurations, along with related precautions.

The Jetson Orin series can use the factory image directly.

The Jetson Nano and Raspberry Pi 5 versions need the preconfigured images provided on the network drive:

Jetson Nano: Jetson_Nano_AI_Pure.img

Raspberry Pi 5: Pi5_AI_Pure.zip

Jetson Orin series

Image description

The Jetson Orin series can use the factory image for this offline AI large-model development tutorial.

System information

User name: jetson

User password: yahboom

Hotspot name: ROSMASTER

Hotspot password: 12345678

System environment

LLaVA-Phi3

Parameter Description

Jetson Orin NX 16GB, Jetson Orin NX 8GB, and Jetson Orin Nano 8GB can run models of 7B parameters and larger.

Jetson Orin Nano 4GB runs smaller models, below 7B parameters.

These conclusions are approximate and should be treated only as a reference.

Jetson Nano

Image Description

The offline AI large-model development tutorial for the Jetson Nano uses the image we configured: Jetson_Nano_AI_Pure.img

System Information

Username: jetson

User password: yahboom

Hotspot name: Jetson_Hot

Hotspot password: 12345678

Motherboard IP address: 192.168.1.11

System Environment

Parameter Description

Jetson Nano: runs models of 4B parameters and below.

These conclusions are approximate and should be treated only as a reference.

Raspberry Pi 5

Image Description

The offline AI large-model development tutorial for the Raspberry Pi 5 uses the image we configured: Pi5_AI_Pure.img

System Information

Username: pi

User Password: yahboom

Hotspot Name: Pi_Hot

Hotspot Password: 12345678

System Environment

Parameter Description

Raspberry Pi 5B (8GB RAM): runs models of 8B parameters and below

Raspberry Pi 5B (4GB RAM): runs models of 3B parameters and below

Raspberry Pi 5B (2GB RAM): runs models of 0.5B parameters and below

These conclusions are approximate and should be treated only as a reference.
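The memory-based rules of thumb above (8 GB ≈ up to 8B parameters, 4 GB ≈ up to 3B, 2 GB ≈ up to 0.5B) can be sketched as a small helper. The thresholds below are taken from this tutorial's guidelines and are approximations, not guarantees; the Jetson boards' limits differ slightly, as noted in their own sections.

```python
def max_model_params_b(ram_gb: float) -> float:
    """Rough upper bound on model size (billions of parameters) that a
    Raspberry Pi 5B can run, per this tutorial's rules of thumb.
    These cutoffs are approximate reference values, not guarantees."""
    if ram_gb >= 8:
        return 8.0   # e.g. Raspberry Pi 5B (8GB RAM)
    if ram_gb >= 4:
        return 3.0   # e.g. Raspberry Pi 5B (4GB RAM)
    if ram_gb >= 2:
        return 0.5   # e.g. Raspberry Pi 5B (2GB RAM)
    return 0.0       # below 2 GB, none of the listed models are recommended

print(max_model_params_b(8))  # 8.0
```

A model whose parameter count exceeds the returned value is likely to exhaust memory or swap heavily, which is why the table in the next section marks larger models as unsupported on the smaller boards.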

Model statistics

The following table lists common models from the Ollama official model library. Users can find the models they need on the Ollama official website.

Model | Parameters | Raspberry Pi 5B (2GB RAM) | Raspberry Pi 5B (4GB RAM) | Raspberry Pi 5B (8GB RAM) | Run Command
----- | ---------- | ------------------------- | ------------------------- | ------------------------- | -----------
Llama 3 | 8B | Not supported | Not supported | Supported | ollama run llama3:8b
Llama 2 Chinese | 7B | Not supported | Not supported | Supported | ollama run llama2-chinese:7b
Qwen2 | 0.5B | Supported | Supported | Supported | ollama run qwen2:0.5b
Qwen2 | 1.5B | Not supported | Supported | Supported | ollama run qwen2:1.5b
Qwen2 | 7B | Not supported | Not supported | Supported | ollama run qwen2:7b
Phi-3 | 3.8B | Not supported | Not supported | Supported | ollama run phi3:3.8b
Phi-2 | 2.7B | Not supported | Supported | Supported | ollama run phi:2.7b
Gemma | 2B | Not supported | Supported | Supported | ollama run gemma:2b
Gemma | 7B | Not supported | Not supported | Supported | ollama run gemma:7b
LLaVA | 7B | Not supported | Not supported | Supported | ollama run llava:7b
WizardLM-2 | 7B | Not supported | Not supported | Supported | ollama run wizardlm2:7b
DeepSeek Coder | 1.3B | Not supported | Supported | Supported | ollama run deepseek-coder:1.3b
DeepSeek Coder | 6.7B | Not supported | Not supported | Supported | ollama run deepseek-coder:6.7b
Yi | 6B | Not supported | Not supported | Supported | ollama run yi:6b
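Besides the interactive `ollama run` commands in the table, Ollama also exposes a local REST API on port 11434 (`/api/generate`), which is useful for scripting. A minimal sketch, assuming Ollama is installed and running on the board; the model name used here is one of the small models from the table:

```python
import json
import urllib.request

# Ollama's default local REST endpoint for one-shot text generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for the local Ollama server."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )

def ask(model: str, prompt: str) -> str:
    """Send the request. Requires a running Ollama server and a pulled model."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]

# Example (uncomment with Ollama running and the model pulled):
# print(ask("qwen2:0.5b", "Say hello in one sentence."))
```

Non-streaming mode (`"stream": False`) returns a single JSON object with the full answer in its `response` field, which is simpler to parse on a small board than the default streamed chunks.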

References

Ollama

Official website: https://ollama.com/

GitHub: https://github.com/ollama/ollama