Phi-3


Demo Environment

Development board: Jetson Orin series

SSD: 128 GB

Tutorial applicability: Whether a board can run the model depends on the system's available memory; your own environment and any programs running in the background may prevent the model from running. (A quick way to check available memory is shown below the table.)

| Board model | Run directly with Ollama | Run with Open WebUI |
| --- | --- | --- |
| Jetson Orin NX 16GB | | |
| Jetson Orin NX 8GB | | |
| Jetson Orin Nano 8GB | | |
| Jetson Orin Nano 4GB | | |
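
As noted above, whether a board can run the model mainly depends on how much memory is actually free. A quick way to check this on a standard JetPack (Ubuntu-based) system before starting the model:

```bash
# Show total, used, and available system memory in human-readable units
free -h
```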

Phi-3 is a powerful, cost-effective small language model (SLM) from Microsoft that outperforms models of the same size and the next size up across a variety of language, reasoning, coding, and math benchmarks.

1. Model size

| Model | Parameters |
| --- | --- |
| Phi-3 (Mini) | 3.8B |
| Phi-3 (Medium) | 14B |
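
On the Ollama library, the default `phi3` tag corresponds to the Mini (3.8B) model, and the Medium (14B) model is published under its own tag. The tag names below follow the Ollama library page and may change over time, so check https://ollama.com/library/phi3 if a pull fails:

```bash
# Pull the 3.8B Mini model (the default "phi3" tag)
ollama pull phi3:mini

# Pull the 14B Medium model (requires considerably more memory)
ollama pull phi3:medium
```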

2. Performance

(Figure: Phi-3 benchmark performance comparison)

3. Pull Phi-3

Use the pull command to download the model from the Ollama model library:
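
For example, assuming Ollama is already installed and you want the default `phi3` tag (the 3.8B Mini model):

```bash
# Download the Phi-3 Mini (3.8B) model from the Ollama library
ollama pull phi3
```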

(Screenshot: pulling the phi3 model with Ollama)

4. Use Phi-3

4.1. Run Phi-3

If the model has not been downloaded yet, the run command will automatically pull the Phi-3 3.8B model and then start it:
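
For example, with the default `phi3` tag:

```bash
# Pull the model if it is not present locally, then start an interactive chat session
ollama run phi3
```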

4.2. Have a conversation

The time needed to answer a question depends on your hardware configuration; please be patient!

(Screenshot: chatting with Phi-3 in the terminal)
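
You can also pass a prompt directly on the command line instead of typing it in the interactive session; the question below is just an example:

```bash
# Run a single prompt and print the reply without entering the interactive session
ollama run phi3 "Why is the sky blue?"
```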

4.3. End the conversation

Use the Ctrl+D shortcut or type /bye to end the conversation!

(Screenshot: ending the conversation)

References

Ollama

Official website: https://ollama.com/

GitHub: https://github.com/ollama/ollama

Phi-3

Model on Ollama: https://ollama.com/library/phi3