AI on Raspberry Pi: Testing DeepSeek R1 1.5B

 


Clarifying the Issue

Running AI models locally on the Raspberry Pi 5 presents unique challenges. With its 8GB RAM and improved processing power, it seems like a capable edge computing device. However, can it realistically handle a small-scale language model like DeepSeek R1 1.5B? Using Ollama as our AI runtime, we set out to evaluate DeepSeek’s performance, reasoning ability, and overall usability on the Pi 5.

This test was conducted on a Raspberry Pi 5 (8GB RAM) running Raspberry Pi OS Desktop Bookworm. Throughout this series, we will be using Ollama as our AI runtime to test different models. Additionally, all models will be downloaded from Ollama’s official library (ollama/library).

Why It Matters

Most AI workloads run on cloud infrastructure, but the ability to perform local inference has major implications. Running an LLM directly on a Raspberry Pi could enable privacy-focused AI assistants, low-latency automation tools, and offline AI applications in cost-sensitive environments. Our goal is to determine whether DeepSeek R1 1.5B can serve as a viable lightweight AI model for such scenarios.

Key Terms

  • Local Inference – Running AI models on local devices instead of cloud servers.
  • Ollama – An AI runtime for running language models on various platforms, including Raspberry Pi.
  • Swap Space – A reserved portion of storage used as additional memory when RAM is fully utilized.

Steps at a Glance

  1. Create the Test Environment – Setting up Raspberry Pi 5 with Ollama.
  2. Testing the Model – Running standardized AI evaluation prompts.
  3. Grading DeepSeek’s Performance – Evaluating results for accuracy and reliability.

Detailed Steps

1. Create the Test Environment

Start by updating your Raspberry Pi 5 and installing Ollama:

Bash
sudo apt update && sudo apt upgrade -y
curl -fsSL https://ollama.ai/install.sh | sh

Verify Ollama installation:

Bash
ollama --version

Download the DeepSeek R1 1.5B model:

Bash
ollama pull deepseek-r1:1.5b

Check available models in your Ollama environment:

Bash
ollama list

Run DeepSeek R1 1.5B:

Bash
ollama run deepseek-r1:1.5b

DeepSeek R1 1.5B is approximately 1.5GB, making it more demanding than TinyLlama but still within the limits of the Raspberry Pi 5.

To prevent memory issues, you may need additional swap space, depending on what else is running on the Pi. The changes below are temporary: without an entry in /etc/fstab, the swap file will not be re-enabled after a reboot, so your system configuration is not permanently modified. You can check whether swap is already enabled using:

Bash
swapon --show

If needed, enable a 4GB swap file:

Bash
sudo fallocate -l 4G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
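Once enabled, the new swap space should appear in the kernel's memory totals. A quick way to confirm, assuming a standard Linux /proc filesystem:

```shell
# Verify the swap file is active and counted in the kernel's totals
swapon --show                      # should now list /swapfile
grep -i swaptotal /proc/meminfo    # total swap known to the kernel, in kB
```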

2. Testing the Model

To ensure structured evaluation, we used standardized prompts covering general knowledge, math reasoning, and coding ability. Below is the prompt list:

| Category | Prompt | Purpose |
|----------|--------|---------|
| General Knowledge | "What is the capital of Japan?" | Tests factual recall and knowledge retrieval. |
| Math Reasoning | "What is 12 × 8?" | Evaluates arithmetic ability and correctness. |
| Code Generation (Easy) | "Write a Python program that prints the numbers 1 to 10." | Assesses basic programming capabilities. |
| Code Generation (Moderate) | "Write a recursive function in C to calculate factorial." | Tests understanding of recursion and logic. |
| Logical & Common Sense Reasoning | "If a car travels at 60 mph for 2.5 hours, how far does it go?" | Measures reasoning skills and applied logic. |
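The arithmetic and recursion prompts have fixed correct answers, which makes grading straightforward. As a sanity check, the expected results can be verified independently in Python:

```python
# Expected answers used when grading the model's responses

# Math reasoning prompt: 12 x 8
assert 12 * 8 == 96

# Logical reasoning prompt: distance = speed x time
assert 60 * 2.5 == 150.0

# Recursion check, mirroring the factorial prompt
def factorial(n: int) -> int:
    return 1 if n == 0 else n * factorial(n - 1)

assert factorial(5) == 120  # matches the C example's output for num = 5
```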

Test Session Output

Below is the unedited session output from running DeepSeek R1 1.5B on Ollama with the Raspberry Pi 5:

pi@raspberrypi51:~ $ ollama run deepseek-r1:1.5b
>>> "What is the capital of Japan?"
The capital of Japan is Tokyo.

>>> "What is 12 × 8?"
12 multiplied by 8 equals 96.

>>> "Write a Python program that prints the numbers 1 to 10."
Here is a Python program to print numbers 1 to 10:

for num in range(1, 11):
    print(num)

>>> "Write a recursive function in C to calculate factorial."
Sure! Here's a recursive function in C that calculates the factorial 
of an integer n:

#include <stdio.h>

int factorial(int n) {
    if (n == 0) {
        return 1;
    }
    return n * factorial(n - 1);
}

int main() {
    int num = 5;
    printf("Factorial of %d is %d", num, factorial(num));
    return 0;
}

>>> "If a car travels at 60 mph for 2.5 hours, how far does it go?"
Distance = Speed × Time
60 mph × 2.5 hours = 150 miles.

3. Grading DeepSeek’s Performance

| Category | Grade | Description |
|----------|-------|-------------|
| General Knowledge | ✅ A | Correct answer. |
| Math Reasoning | ✅ A | Correct calculation. |
| Code Generation (Easy) | ✅ A | Accurate Python code without errors. |
| Code Generation (Moderate) | ✅ A | Correct recursive C function with no errors. |
| Logical & Common Sense Reasoning | ✅ A | Correct distance calculation using basic physics. |

Closing Thoughts

DeepSeek R1 1.5B on Raspberry Pi 5 performs exceptionally well across all categories, providing accurate responses for general knowledge, math, and coding tasks. Unlike TinyLlama, which struggled with logic and programming accuracy, DeepSeek R1 1.5B demonstrates strong reasoning capabilities and precise code generation.

For AI applications requiring on-device inference, DeepSeek R1 1.5B proves to be a powerful choice, offering reliable performance without cloud dependency. This makes it a viable model for edge AI, offline assistants, and privacy-focused AI applications.

Need Raspberry Pi or AI Expertise?

If you're looking for guidance on Raspberry Pi or AI challenges, or you'd like to collaborate, feel free to reach out! We'd love to help you tackle your projects. 🚀

Email us at: info@pacificw.com

Image: Gemini
