A free, unofficial client for the DeepSeek API with a Google Gemini-like syntax.
OpenDeep 🚀
OpenDeep is an elegant, free, and unofficial Python client for the DeepSeek API. It provides a clean, Google Gemini-like syntax while seamlessly handling DeepSeek's underlying security mechanisms (Cloudflare bypass, Proof of Work challenges, and Server-Sent Events).
Whether you need quick text generation or complex reasoning from the R1 model, OpenDeep abstracts away the messy backend logic so you can focus on building.
✨ Key Features
- 💎 Gemini-Like Syntax: Familiar and highly readable API design (`model.generate_content`).
- 🛡️ Cloudflare Bypass: Uses `curl_cffi` to impersonate browser TLS fingerprints, preventing blocking.
- 🧩 Native POW Solver: WebAssembly-powered challenge solver to effortlessly bypass DeepSeek's rate-limiting/bot protection.
- 📡 Real-Time Streaming: Full support for Server-Sent Events (SSE) parsing, including the newest DeepSeek patch formats.
- 🧠 Reasoner Support (DeepSeek-R1): Seamlessly handles the "thinking" process of the reasoner model.
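For intuition about the streaming feature, a Server-Sent Events stream is a sequence of `data:` lines separated by blank lines. The sketch below is a generic, minimal SSE line parser, not OpenDeep's internal decoder (the real DeepSeek stream carries JSON chunks and newer patch formats):

```python
def parse_sse(lines):
    """Yield the payload of each `data:` line from an SSE stream.

    Generic illustration only; assumes a `[DONE]` sentinel terminates
    the stream, as many OpenAI-style APIs do.
    """
    for raw in lines:
        line = raw.strip()
        if not line or line.startswith(":"):  # blank lines separate events; ":" starts a comment
            continue
        if line.startswith("data:"):
            payload = line[len("data:"):].strip()
            if payload == "[DONE]":
                break
            yield payload

# Example over a synthetic stream:
chunks = list(parse_sse([
    'data: {"content": "Hel"}',
    '',
    'data: {"content": "lo"}',
    'data: [DONE]',
]))
print(chunks)
```

Each yielded payload would then be JSON-decoded and its text delta appended to the running response.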
📦 Installation
Ensure you have Python 3.8+ installed. You can install the required dependencies using pip:
```
pip install -r requirements.txt
```
🚀 Quick Start
1. Get your API Key (User Token)
To use OpenDeep, you need your active session token from the browser:
- Log into chat.deepseek.com.
- Open your browser's Developer Tools (F12) -> Application (or Storage) -> Local Storage.
- Find the key named `userToken` and copy its value.
2. Basic Usage (Without Streaming)
If you just want the final answer without real-time console output or "thinking" logs, set `stream=False` (the default).
```python
import opendeep as genai

genai.configure(api_key="your_userToken_here")

model = genai.GenerativeModel("deepseek-chat")  # V3 model

# Waits for the full generation to complete
response = model.generate_content("Explain the theory of relativity in simple terms.")

print("Final Output:")
print(response.text)
```
3. Real-Time Streaming & Reasoner Output
When using the `deepseek-reasoner` (R1) model with `stream=True`, OpenDeep streams the model's internal "thinking" process directly to your console in a subtle gray color, followed by the actual answer.
```python
import opendeep as genai

genai.configure(api_key="your_userToken_here")

# Select the reasoning model (R1)
model = genai.GenerativeModel("deepseek-reasoner")

# stream=True enables real-time console output:
# the reasoning process prints (in gray), then the final answer
response = model.generate_content("How many R's are in the word strawberry?", stream=True)

# The response object still captures the final text (excluding the thinking process)
print(f"\nCaptured text length: {len(response.text)} characters")
```
💡 Note on Reasoner and Streams:
- If you use `deepseek-reasoner` with `stream=True`, you will see the thoughts live in the console.
- If you use `deepseek-reasoner` with `stream=False`, the thoughts are silently discarded during processing, and you only get the final, clean answer in `response.text`.
🏗️ Architecture Under the Hood
- `models.py`: Handles the HTTP session, impersonation, header generation, stream decoding, and payload construction.
- `pow.py`: A highly optimized WebAssembly (WASM) bridge that calculates custom SHA3 hashes for DeepSeek's Proof of Work challenge.
- `config.py`: Global state management for authentication and endpoints.
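For a feel of what a proof-of-work solver does, here is a generic SHA3-based solve loop. This is an illustrative sketch only: DeepSeek's challenge uses its own custom SHA3 variant and parameters, solved by the bundled WASM module in `pow.py`, and the difficulty target below is invented for the example:

```python
import hashlib

def solve_pow(challenge: str, difficulty_bits: int = 12) -> int:
    """Find a nonce such that SHA3-256(challenge + nonce) has
    `difficulty_bits` leading zero bits.

    Generic proof-of-work illustration, not DeepSeek's actual scheme.
    """
    target = 1 << (256 - difficulty_bits)  # digests below this value qualify
    nonce = 0
    while True:
        digest = hashlib.sha3_256(f"{challenge}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

nonce = solve_pow("example-challenge")
print(f"nonce {nonce} satisfies the toy difficulty")
```

Raising `difficulty_bits` roughly doubles the expected work per extra bit, which is why offloading the hash loop to WASM pays off.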
🛠️ Troubleshooting
- 422 Unprocessable Entity: This usually means your `userToken` is invalid or expired. Get a fresh one from your browser.
- Cloudflare / Empty Response: Ensure `curl_cffi` is properly installed. Standard `requests` will get blocked by Cloudflare.
- WASM errors: Make sure both `wasmtime` and `numpy` are installed to solve the Proof of Work challenges.
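To rule out the last two issues quickly, you can check that the key dependencies are importable. A small hypothetical helper (the module names come from the troubleshooting list above; `check_deps` itself is not part of OpenDeep):

```python
import importlib.util

def check_deps(names=("curl_cffi", "wasmtime", "numpy")):
    """Report which of the listed modules are importable in this environment."""
    return {name: importlib.util.find_spec(name) is not None for name in names}

for name, ok in check_deps().items():
    print(f"{name}: {'OK' if ok else 'MISSING -- try: pip install ' + name}")
```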
📜 Disclaimer
This is an unofficial, reverse-engineered client intended for educational purposes and personal use. DeepSeek may update their internal APIs or protection mechanisms at any time. Use responsibly!
made with hate to corps by @cmpdchtr
File details
Details for the file opendeep-0.1.0.tar.gz.
File metadata
- Download URL: opendeep-0.1.0.tar.gz
- Upload date:
- Size: 24.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.4
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | c2ee955f695fe4a815a17d9091d2c19b1b66bcade6679245892c6794ddd9b4c3 |
| MD5 | 8bc4ff573657d9e4940ecce1a2a58af9 |
| BLAKE2b-256 | ea7008605743116e00bd7a37d5cbf7a0457cf75ffca9a412455512deb953d5b2 |
File details
Details for the file opendeep-0.1.0-py3-none-any.whl.
File metadata
- Download URL: opendeep-0.1.0-py3-none-any.whl
- Upload date:
- Size: 22.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.4
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 07445d275aac39d535df4d2676b445b8e0891ea7289d2aaa575e9eb9fc355167 |
| MD5 | 239859a521839a34bbad1af9f22e922c |
| BLAKE2b-256 | 100b900c16aaa3c5ab09ec8ec8a81f69af50a7f4b648bd4085bb7b9bab3c7bad |