DeepClaude Cuts Claude Code Costs by 95% Using DeepSeek V4 Pro Backend


Startups Reporter
3 min read

A new open-source proxy lets developers run Claude Code's autonomous agent loop with DeepSeek V4 Pro or other Anthropic-compatible backends, reducing costs from $200/month to as low as $20/month while preserving the full CLI and VS Code experience.

The viral popularity of Claude Code as an autonomous coding agent has hit a practical wall: its $200 monthly subscription with usage caps puts continuous AI-assisted development out of reach for many individual developers and small teams. A new GitHub project called deepclaude aims to solve this by acting as a drop-in proxy that routes Claude Code's API calls to cheaper backends like DeepSeek V4 Pro—claiming 17x lower costs without changing the user experience.

The core insight is simple but effective: Claude Code's power comes from its agent loop (tool use, file editing, bash execution, subagent spawning), not specifically from Anthropic's models. By intercepting API calls and redirecting them to Anthropic-compatible endpoints, deepclaude lets users swap the "brain" while keeping the "body" of Claude Code intact. The setup requires only three steps: obtaining a DeepSeek API key, setting environment variables, and installing a shell script or VS Code terminal profile.
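
Those three steps can be sketched in a few shell lines. This is a minimal sketch, not the project's actual install script: it assumes the proxy listens on port 3200 (as stated above), that the key lives in a variable named `DEEPSEEK_API_KEY`, and that Claude Code picks up the standard `ANTHROPIC_BASE_URL` override; check the project's README for the exact names.

```shell
# Step 1: provide your DeepSeek API key (placeholder value shown).
export DEEPSEEK_API_KEY="sk-..."

# Step 2: point Claude Code at the local deepclaude proxy instead of
# Anthropic's API (port 3200 per the article; variable name assumed).
export ANTHROPIC_BASE_URL="http://localhost:3200"

# Step 3: install/launch via the project's shell script or VS Code
# terminal profile, then run Claude Code as usual.
echo "Claude Code will now talk to: $ANTHROPIC_BASE_URL"
```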

Cost savings are immediate and substantial. According to the project's benchmarks, DeepSeek V4 Pro charges $0.87 per million output tokens versus Anthropic's $15 per million, roughly a 17x difference. DeepSeek's automatic context caching, which cuts the price of cached repeat-turn tokens to $0.004 per million, makes agent loops exceptionally cheap, since each turn resends largely the same context. The project provides concrete usage scenarios: light users (10 days/month) see costs drop from $200 to $20 (90% savings), while heavy users (25 days/month) pay about $50 instead of $200 (75% savings). Even with aggressive autonomous looping, costs remain around $80/month versus the capped $200.
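
The headline figures can be checked with a few lines of arithmetic, using only the prices and monthly totals quoted above:

```shell
# Verify the article's ratios from its own numbers:
# $15 vs $0.87 per million output tokens, $20 and $50 vs the $200 cap.
awk 'BEGIN {
  printf "output-token price ratio: %.1fx\n", 15 / 0.87
  printf "light user savings: %.0f%%\n", (1 - 20 / 200) * 100
  printf "heavy user savings: %.0f%%\n", (1 - 50 / 200) * 100
}'
```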

Technically, deepclaude runs a local proxy on port 3200 that splits traffic: WebSocket connections for remote control still go to Anthropic's bridge (required for claude.ai/code sessions), but all model API calls route to the selected backend. Users can switch backends mid-session without restarting—either via slash commands (/deepseek, /anthropic) added to Claude Code's command directory, CLI flags (deepclaude --switch ds), or VS Code keyboard shortcuts. The proxy also tracks token usage and displays real-time savings compared to Anthropic pricing.
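
Mid-session switching can also be scripted. A minimal sketch using the CLI flag quoted above, guarded so it degrades gracefully on machines where the deepclaude CLI is not installed:

```shell
# Flip the active backend from a script using the flag cited in the
# article (deepclaude --switch ds). The guard makes this a safe no-op
# when the deepclaude CLI is not on the PATH.
if command -v deepclaude >/dev/null 2>&1; then
  deepclaude --switch ds   # subsequent model calls route to DeepSeek
else
  echo "deepclaude not installed; skipping switch"
fi
```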

Supported backends include:

  • DeepSeek (default): $0.44 input / $0.87 output per million tokens, with automatic context caching
  • OpenRouter: Same pricing as DeepSeek but routed through US/EU servers for lower latency
  • Fireworks AI: Faster US-based inference at $1.74 input / $3.48 output per million
  • Anthropic: Original pricing ($3 input / $15 output) for fallback to Opus when needed
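
To make the listed prices concrete, here is a worked per-turn cost comparison. The token counts (100K input, 10K output per agent turn) are illustrative assumptions, not figures from the project, and cache discounts are ignored:

```shell
# Cost of one agent turn (100K input + 10K output tokens, assumed)
# under each backend's listed per-million-token prices.
awk 'BEGIN {
  split("DeepSeek OpenRouter Fireworks Anthropic", name, " ")
  split("0.44 0.44 1.74 3", inp, " ")
  split("0.87 0.87 3.48 15", out, " ")
  for (i = 1; i <= 4; i++)
    printf "%-10s $%.4f per turn\n", name[i],
           (100000 * inp[i] + 10000 * out[i]) / 1e6
}'
```

At these assumed token counts, a DeepSeek turn costs roughly a twelfth of an Anthropic one, which is where the multi-thousand-turn autonomous loops described above get their savings.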

The project acknowledges limitations: DeepSeek's Anthropic-compatible endpoint doesn't support image input, and MCP server tools aren't functional through the compatibility layer. However, core agent capabilities—file operations, bash execution, glob/grep search, subagent spawning, and git operations—work identically to standard Claude Code.

Remote control functionality (deepclaude --remote) deserves special note. This feature launches a browser-accessible Claude Code session (via https://claude.ai/code/session_...) where the WebSocket connection still uses Anthropic's infrastructure, but model calls go through the local proxy to DeepSeek or other backends. This enables using the agent loop from phones or tablets while maintaining cost savings—a significant advantage for developers who need to prompt coding changes away from their primary workstation.

As AI coding agents move from novelty to essential tool, cost becomes the primary barrier to adoption. Projects like deepclaude highlight a growing trend: decoupling agent orchestration layers from specific model providers. By making the Claude Code experience accessible at a fraction of the price, it could expand autonomous coding from enterprise teams to indie developers, students, and open-source maintainers who previously couldn't justify the expense. The MIT-licensed project is available now on GitHub, with setup instructions covering Windows PowerShell, macOS/Linux shells, and VS Code integration.
