DeepSeek + MCP Tools: Open Source AI with External Tools
DeepSeek produces some of the most capable open-source AI models available. By connecting DeepSeek models to MCP (Model Context Protocol) servers, you get a powerful, cost-effective AI system with access to external tools. This guide covers the setup.
What Is DeepSeek?
DeepSeek is a Chinese AI company that produces open-weight LLMs competitive with frontier models. Key models include:
- DeepSeek-V3: General-purpose model with strong reasoning
- DeepSeek-Coder: Optimized for programming tasks
- DeepSeek-R1: Reasoning-focused model with chain-of-thought
Models are available through the DeepSeek API or can be run locally via Ollama.
Step 1: Access DeepSeek Models
Option A: DeepSeek API
pip install openai
# DeepSeek API is OpenAI-compatible
export DEEPSEEK_API_KEY=your_key_here
import os
from openai import OpenAI

# Read the key from the DEEPSEEK_API_KEY variable exported above
client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],
    base_url="https://api.deepseek.com/v1"
)

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Hello"}]
)
print(response.choices[0].message.content)
Option B: Local via Ollama
ollama pull deepseek-v2:16b
ollama pull deepseek-coder-v2:16b
Step 2: Install MCP Servers
# Filesystem access
npm install -g @modelcontextprotocol/server-filesystem
# Database access
pip install mcp-server-sqlite
# Web search
npm install -g @modelcontextprotocol/server-brave-search
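Once the servers are installed, most MCP clients launch them from a JSON configuration file. A typical entry might look like the sketch below; the `/workspace` path, database path, and Brave API key are placeholders, and the `sqlite` command name assumes the pip install above put an `mcp-server-sqlite` executable on your PATH:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/workspace"]
    },
    "sqlite": {
      "command": "mcp-server-sqlite",
      "args": ["--db-path", "/workspace/data.db"]
    },
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": { "BRAVE_API_KEY": "your_key_here" }
    }
  }
}
```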
Step 3: Connect DeepSeek to MCP Tools
Use a framework like LangChain to bridge DeepSeek and MCP:
import asyncio

from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain_core.prompts import ChatPromptTemplate
from langchain_mcp_adapters.tools import load_mcp_tools  # pip install langchain-mcp-adapters
from langchain_openai import ChatOpenAI
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# DeepSeek as a LangChain LLM (OpenAI-compatible API)
llm = ChatOpenAI(
    model="deepseek-chat",
    api_key="your_key",
    base_url="https://api.deepseek.com/v1"
)

async def main():
    # Launch the MCP filesystem server over stdio and load its tools
    server_params = StdioServerParameters(
        command="npx",
        args=["-y", "@modelcontextprotocol/server-filesystem", "/workspace"]
    )
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await load_mcp_tools(session)

            # Build a tool-calling agent around the MCP tools
            prompt = ChatPromptTemplate.from_messages([
                ("system", "You are a helpful assistant with file access."),
                ("human", "{input}"),
                ("placeholder", "{agent_scratchpad}")
            ])
            agent = create_tool_calling_agent(llm, tools, prompt)
            executor = AgentExecutor(agent=agent, tools=tools)
            result = await executor.ainvoke({"input": "List Python files in the workspace"})
            print(result["output"])

asyncio.run(main())
Step 4: Local Deployment with Full Tool Access
For complete privacy, run DeepSeek locally via Ollama and connect to local MCP servers:
# Start Ollama with DeepSeek
ollama serve &
# All data stays on your machine:
# - Model runs locally
# - MCP servers run locally
# - No external API calls needed
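Because Ollama exposes an OpenAI-compatible endpoint at http://localhost:11434/v1, the same client code from Step 1 works locally; only the base URL and model name change. A minimal sketch using only the standard library (it assumes `ollama serve` is running and the model has been pulled):

```python
import json
import urllib.request

# Ollama's OpenAI-compatible chat endpoint on the local machine
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_request(model: str, prompt: str) -> dict:
    # Standard OpenAI-style chat payload; Ollama accepts it unchanged
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(model: str, prompt: str) -> str:
    body = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]

# Example usage (requires a running Ollama instance):
# chat("deepseek-v2:16b", "Summarize MCP in one sentence.")
```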
Step 5: Find the Right Tools with XLUXX
pip install xluxx
# Discover MCP servers by capability
curl "https://api.xluxx.net/v1/tools?q=code+analysis&sort=trust_score"
# Check a specific server
curl https://api.xluxx.net/v1/tools/mcp-server-filesystem
XLUXX trust scores help you choose between competing MCP servers based on security, maintenance, and community validation.
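The same queries can be issued from Python. A small illustrative helper that builds the request URLs; the endpoint is taken from the curl examples above, and everything else (function name, parameter handling) is an assumption, not a documented client:

```python
from urllib.parse import urlencode

# Endpoint from the curl examples above
BASE_URL = "https://api.xluxx.net/v1/tools"

def search_url(query: str, sort: str = "trust_score") -> str:
    # Reproduces the curl query string: ?q=code+analysis&sort=trust_score
    return f"{BASE_URL}?{urlencode({'q': query, 'sort': sort})}"
```

The returned URL can be passed to any HTTP client (curl, requests, urllib) to fetch the results.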
DeepSeek + MCP Use Cases
- Code assistant: DeepSeek-Coder + filesystem MCP for reading and modifying code
- Data analysis: DeepSeek-V3 + SQLite MCP for querying local databases
- Research: DeepSeek-R1 + web search MCP for deep research tasks
- DevOps: DeepSeek-Coder + GitHub MCP for repository management
Cost Comparison
- DeepSeek API: Significantly cheaper than GPT-4o or Claude for similar capability
- Local via Ollama: Zero ongoing cost after hardware investment
- MCP servers: Free and open source
Use XLUXX Trust Layer to find reliable MCP tools: api.xluxx.net
