Introduction
In recent times, large language models (LLMs) like Deepseek have gained significant attention due to their
open-source nature and versatility. This blog post chronicles my journey of installing Deepseek locally on a
standard PC, exploring performance metrics, and integrating it into a web application.
NOTE: The majority of this post was written by Deepseek based on my prompts and the refinements I requested. I did change the Installation Guide section because Deepseek provided a different method than I used, and I didn't want anyone running code I haven't tested.
Why Install Deepseek Locally?
- Open Source Appeal: Deepseek's open-source availability made it an attractive choice for experimentation.
- Data Privacy: Running the model locally allows me to keep my data private, avoiding third-party servers.
- Web App Integration: As I develop a web app with LLM features, benchmarking Deepseek on average hardware was crucial.
Performance Overview
Here are the key performance metrics from testing different models:
- 7B Model:
- GPU Usage: 94%
- VRAM Utilization: 6.5 GB
- Tokens per Second: 79.23 (Blazing Fast)
- 14B Model:
- GPU Usage: 30-40%
- VRAM Utilization: 7.8 GB
- Tokens per Second: 8.21 (Comparable to Online LLMs)
- 32B Model:
- GPU Usage: 30-40%
- VRAM Utilization: 8 GB
- Tokens per Second: 2.4 (Slower but Usable for Specific Tasks)
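For anyone who wants to gather similar numbers on their own hardware, here is a rough sketch (assuming an NVIDIA GPU and the standard Ollama CLI; this is not necessarily how the figures above were captured):
# The --verbose flag prints timing stats, including an eval rate in tokens per second, after each response
ollama run deepseek-r1:7b --verbose
# In a second terminal, watch GPU utilization and VRAM use (refreshes every second)
nvidia-smi -l 1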
Installation Guide on Windows
Before installing Deepseek, ensure that your system supports WSL and has it enabled.
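If you are not sure whether WSL is already set up, a quick check from PowerShell (assuming a reasonably recent Windows 10/11 build) is:
# Prints the WSL version and default distribution, or an error if WSL is not installed yet
wsl --status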
Step 1: Open PowerShell as Administrator
- Press Win + S, type PowerShell, right-click, and select Run as administrator.
Step 2: Install Windows Subsystem for Linux (WSL)
- Run the following command; this will install WSL and Ubuntu:
wsl
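Once the installation finishes (a reboot may be required), you can confirm that Ubuntu was registered:
# Lists installed Linux distributions and the WSL version each one uses
wsl -l -v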
Step 3: Start Ubuntu
- After installation, launch Ubuntu from the Start menu (or search for it), or run the "wsl" command again.
Step 4: Update Package Lists and Upgrade Software
Inside the Ubuntu terminal, run:
sudo apt update && sudo apt upgrade
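If you would rather not confirm the upgrade interactively, the -y flag answers yes automatically:
sudo apt update && sudo apt upgrade -y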
Step 5: Install Ollama
Ollama is a tool that simplifies the process of running Deepseek. The following command downloads and runs an install script. If you are uncomfortable with that, remove the pipe to sh so you can review the script before running it.
curl -L https://ollama.ai/install.sh | sh
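If you prefer the review-first approach, a minimal sketch looks like this:
# Download the script without executing it
curl -L https://ollama.ai/install.sh -o install.sh
# Inspect it, then run it once you are comfortable
less install.sh
sh install.sh
# Confirm the CLI is available
ollama --version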
Step 6: Download Deepseek Model
Run one of the commands below to download Deepseek. These are the three versions I ran for this test; other, larger versions are available if you have a GPU with more VRAM. You can also skip directly to the run command, and Ollama will pull the model and then run it.
ollama pull deepseek-r1:7b
ollama pull deepseek-r1:14b
ollama pull deepseek-r1:32b
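After the downloads finish, you can confirm which models are available locally:
# Lists downloaded models along with their size and tag
ollama list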
Step 7: Run Deepseek Locally
Now that everything is set up, you can start Deepseek.
ollama run deepseek-r1:7b
ollama run deepseek-r1:14b
ollama run deepseek-r1:32b
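Inside the interactive session, type /bye to exit. From another terminal, you can also check what is currently loaded:
# Shows which models are running and how much memory they are using
ollama ps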
Step 8: Working with the Ollama API
Ollama includes an API running on port 11434, so you can easily integrate it into a program. For details on the API, see https://github.com/ollama/ollama/blob/main/docs/api.md. This is the API request JSON format:
curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-r1:7b",
  "prompt": "Why is the sky blue?"
}'
The API response JSON format:
{
  "model": "deepseek-r1:7b",
  "created_at": "2023-08-04T08:52:19.385406455-07:00",
  "response": "The",
  "done": false
}
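By default the API streams the response as a series of JSON objects, which is why the example above shows "done": false. For a simple web app integration it can be easier to request a single complete reply; the stream parameter is documented in the Ollama API docs linked above:
curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-r1:7b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'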
Lessons Learned
- Ease of Setup: Surprisingly straightforward process given initial concerns.
- Model Performance: Smaller models like 7B delivered impressive results, exceeding expectations.
Future Directions
Exploring techniques such as Retrieval-Augmented Generation (RAG) to enhance performance and functionality is on the horizon.
Conclusion: Benefits of Running Deepseek Locally
Running Deepseek locally offers control over data privacy, enhances learning through hands-on experience, and
provides flexibility in integrating with personal projects. The journey was both enlightening and rewarding,
setting a solid foundation for further explorations in LLMs.
Poems Generated by Deepseek Models
7B Model:
Spring arrives, a fresh embrace,
With tender arms and healing grace.
The world unfolds anew, so bright,
A dance of life, unceasing song.
Blossoms unfurl their fragrant garlands,
Birds return to nest in old lands.
Old walls let go of winter’s chill,
And hearts, within this vibrant spell,
Release the grip of cold and strife,
To welcome spring’s invigorated life.
So let us clasp the tender hand,
Of nature’s healing hand that grows.
For in this season of renewal,
A new beginning takes its course,
A tender dance of life anew,
In every leaf, every hue.
14B Model:
Seashells scattered, fragile and light,
Echoes of time in their ancient white,
Left by the waves on the sandy shore,
Secrets of the deep they restore.
Schools of fish dart, a vibrant dance,
In the ocean’s embrace, they advance.
Dolphins leap, their joy so free,
In the blue expanse, harmony.
Coral reefs thrive in sunlight’s glow,
Colorful gardens beneath the flow.
Sunbeams filter, a jeweled display,
A world beneath, come to stay.
Tides carry memories far and wide,
Emotions and moments, side by side.
In every wave, a story told,
By the ocean’s heart, our souls are mollified.
32B Model:
In grains of sand, time flows its course,
Each particle a moment brief and short.
They drift away with gentle grace,
Yet leave behind an echo in the void.
Once warm and bright, now cooled by years,
The sun’s soft glow remains but faintly near.
Though tides may cleanse the shore anew,
Some memories linger, eternally dear.
Within each grain, a memory rests,
Whispers of laughter, joy, and love.
As time moves on with steadfast pace,
These moments remain in every trace.




