Claude Code Local is the open-source project that finally lets you run Claude Code completely free with local models.
Gemma 4, Qwen 3.5, Llama 3.3 – all supported.
Run offline.
Run without an API key.
Run without a subscription.
And the experience is properly good.
Let me show you how to set it up.
Video notes + links to the tools 👇
What Is Claude Code Local?
Claude Code Local is an open-source project that makes Claude Code work with local AI models.
Normally, Claude Code uses Anthropic's Claude models (paid API).
Claude Code Local swaps that out for free local models via Ollama.
Why this matters:
- £0/month instead of subscription costs
- Offline capable – no internet needed
- Privacy – nothing leaves your machine
- Unlimited usage – no rate limits
This is genuinely one of the most useful AI tool releases of 2026.
Supported Local Models
Claude Code Local works with any Ollama-compatible model:
Best Options
- Gemma 4 – fast, lightweight, good quality
- Qwen 3.5 – most capable, best quality
- Llama 3.3 – reliable, well-tested
My Recommendation
Qwen 3.5 for best output quality.
Gemma 4 for fastest local performance.
Llama 3.3 as a balanced middle ground.
You can switch between models any time.
Quick Start Setup
Step 1: Install Ollama
# macOS
curl -fsSL https://ollama.com/install.sh | sh
# Or download from ollama.com
Step 2: Download a Model
ollama pull qwen3.5
# or
ollama pull gemma4
# or
ollama pull llama3.3
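If you want the pull step scripted, a small helper can map this guide's recommendations to a model tag. The `pick_model` function and its priority labels are my own illustration, not part of any tool; the tags are the ones used above, so check `ollama list` against what your registry actually serves.

```shell
# Map a priority to the recommended model from this guide.
# pick_model and its labels are illustrative, not part of Claude Code Local.
pick_model() {
  case "$1" in
    quality) echo "qwen3.5" ;;    # most capable
    speed)   echo "gemma4" ;;     # fastest
    *)       echo "llama3.3" ;;   # balanced default
  esac
}

model="$(pick_model quality)"
echo "ollama pull $model"   # prints the pull command; drop 'echo' to run it
```

Swap the argument to `speed` or anything else to get the fast or balanced option instead.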
Step 3: Install Claude Code Local
Copy the quickstart commands from the project's GitHub.
Paste into your terminal.
Run.
Step 4: Launch
Run the Claude Code Local command.
Select your model.
Start using Claude Code locally.
If you want the basics of Ollama first, my Ollama + Hermes setup guide is a gentle introduction.
Why Run Claude Code Locally?
1. Cost Savings
A Claude Code subscription costs £20–200/month.
Claude Code Local: £0 forever.
Over a year: £240–£2,400 saved.
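The annual figure is just the monthly range multiplied out:

```shell
# Annual saving at the subscription bounds quoted above (£/month x 12).
low=$((20 * 12))
high=$((200 * 12))
echo "Annual saving: £${low}-£${high}"
```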
2. Privacy
Your code never leaves your machine.
Critical for:
- Proprietary codebases
- Client work under NDA
- Regulated industries
- Personal projects
3. Offline Work
Travelling without internet?
Still have full Claude Code functionality.
4. No Rate Limits
Managed Claude Code has token limits.
Local has whatever your hardware can deliver.
Unlimited usage for the cost of electricity.
🔥 Want the complete Claude Code Local tutorial?
Inside the AI Profit Boardroom, I've added a full setup video with step-by-step configuration, model comparisons, and performance optimisation, plus ready-made configurations for the best models. 2,800+ members are running Claude Code Local productively.
Performance Comparison
Claude Code (Cloud)
- Speed: Very fast
- Quality: Highest
- Cost: £20–200/month
- Offline: No
Claude Code Local + Qwen 3.5
- Speed: Moderate (depends on hardware)
- Quality: Very good (80-90% of Claude Opus)
- Cost: £0
- Offline: Yes
Claude Code Local + Gemma 4
- Speed: Fast
- Quality: Good for simpler tasks
- Cost: £0
- Offline: Yes
For most day-to-day work, local models are surprisingly capable.
For complex long-context work, cloud Claude is still better.
Hardware Requirements
Minimum
- 16GB RAM
- Any modern CPU (M1/M2/M3 Mac ideal)
- 50GB disk space per model
Recommended
- 32GB+ RAM for larger models
- Apple Silicon Mac or NVIDIA GPU
- 100GB+ disk space for multiple models
Apple Silicon Optimisation
M-series Macs run Ollama particularly well.
The unified memory architecture helps with larger models.
The M4 Max is especially strong for Qwen 3.5.
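To check whether a machine clears the 16GB minimum, convert the byte count your OS reports to gigabytes. The helper below is a sketch with a hard-coded example value; on macOS you would feed it `sysctl -n hw.memsize`, and on Linux you would read `MemTotal` from /proc/meminfo instead.

```shell
# Convert a raw byte count to whole gigabytes (1 GB = 1073741824 bytes).
bytes_to_gb() { echo $(( $1 / 1073741824 )); }

# Example value for a 32GB machine.
# On macOS, use: mem=$(sysctl -n hw.memsize)
mem=34359738368
gb=$(bytes_to_gb "$mem")
if [ "$gb" -ge 16 ]; then
  echo "${gb}GB RAM: meets the 16GB minimum"
else
  echo "${gb}GB RAM: below the 16GB minimum"
fi
```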
Use Cases for Claude Code Local
Private Client Work
NDA-protected projects.
Local inference keeps everything confidential.
Travel/Flight Work
No internet required.
Full functionality preserved.
Cost-Sensitive Operations
High-volume coding tasks.
Cloud would be prohibitively expensive.
Experimentation
Try approaches without burning API credits.
Unlimited iteration for free.
Learn how I make these videos 👇
Combining Local and Cloud
Best setup: both options available.
Switch based on the task:
- Simple tasks → local (free, fast)
- Complex reasoning → cloud (better quality)
- Private code → local (privacy)
- Time-sensitive → cloud (faster)
You get maximum flexibility + minimum cost.
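The switching logic above can be sketched as a tiny dispatcher. The `route` function and its task labels are purely illustrative; the project doesn't define a switching command, so wire the output into whichever launcher you actually use.

```shell
# Route a task to local or cloud per the guidance above.
# Hypothetical helper - not part of Claude Code Local itself.
route() {
  case "$1" in
    simple|private)         echo "local" ;;   # free / confidential
    complex|time-sensitive) echo "cloud" ;;   # quality / speed
    *)                      echo "local" ;;   # default to the free option
  esac
}

route private   # -> local
route complex   # -> cloud
```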
Claude Code Local Speaking Feature
One cool feature: Claude Code Local can speak to you.
Text-to-speech integration lets you have your code read back.
Useful for:
- Accessibility
- Listening while working
- Code review walkthroughs
- Learning unfamiliar codebases
🔥 Master Claude Code Local for your workflow
Inside the AI Profit Boardroom, I share performance-optimised configurations for different hardware. Model selection strategies, quantisation choices, context management. Get maximum capability from your local setup.
Integrating With Other Tools
Claude Code Local works with:
- Your existing codebase (any language/framework)
- Git workflows (same as cloud Claude Code)
- Ollama ecosystem (all Ollama models)
- Your IDE (VS Code, JetBrains, etc.)
For AI agent context, see my Claude Code AI SEO breakdown on using Claude Code for content automation.
Claude Code Local: Frequently Asked Questions
Does Claude Code Local have the same features as the cloud version?
Most core features work. Some features tied to cloud services may be limited.
Which model should I pick for coding specifically?
Qwen 3.5 for general coding. Gemma 4 if speed matters more than quality.
Can I run Claude Code Local on Windows?
Yes, via Ollama Windows support. Performance varies with hardware.
How does offline Claude Code Local work?
Ollama runs models locally. Claude Code Local interfaces with Ollama. Everything on your machine.
Can I switch between local and cloud within one project?
Yes. Use local for some tasks, cloud for others. No conflict.
Is the code secure with Claude Code Local?
More secure than the cloud version: nothing leaves your machine. Good for sensitive work.
Related Reading
- Ollama + Hermes: Free AI agent setup
- Claude Code AI SEO: Cloud-based content workflows
- Hermes VS OpenClaw: Agent ecosystem comparison
- Claude Opus 4.7 AI SEO: The cloud model side
- Best AI Agent Community: Advanced training
Claude Code Local is the free alternative that makes Claude Code accessible to everyone. If you want powerful coding AI without subscription costs, Claude Code Local is the answer in 2026.