DeepSeek API Library
DeepSeek-V3 delivers groundbreaking improvements in inference speed compared to earlier models. It leads the performance charts among open-source models and competes closely with the most advanced proprietary models available globally.
This Python library provides a lightweight client for seamless communication with the DeepSeek server.
For a DeepSeek GUI, check out DeskPai.
📥 Installation
pip install deepseek
🥔 Preparation - DEEPSEEK_API_KEY
You need a DeepSeek API key. If you don't have one, generate it on the DeepSeek open platform.
You can configure your API key as an environment variable.
On macOS or Linux:
export DEEPSEEK_API_KEY=<YOUR_API_KEY>
On Windows (PowerShell; note that setx only takes effect in newly opened sessions):
setx DEEPSEEK_API_KEY <YOUR_API_KEY>
If DEEPSEEK_API_KEY is not set, pass the key explicitly when constructing the client: DeepSeekAPI(api_key).
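As a sketch of that lookup order (resolve_api_key is a hypothetical helper, not part of the library; the client is assumed to do something equivalent internally):

```python
import os

def resolve_api_key(explicit_key=None):
    """Return an explicitly passed key, falling back to DEEPSEEK_API_KEY.

    Hypothetical helper illustrating the fallback; the library client is
    assumed to behave equivalently when api_key is omitted.
    """
    key = explicit_key or os.environ.get("DEEPSEEK_API_KEY")
    if not key:
        raise RuntimeError(
            "No API key found: set DEEPSEEK_API_KEY or pass api_key explicitly"
        )
    return key
```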
💎 Usage
Initialize the API Client
from deepseek import DeepSeekAPI
api_client = DeepSeekAPI()
Retrieve Account Balance
api_client.user_balance()
List Available Models
api_client.get_models()
Chat (Streaming Disabled)
response = api_client.chat_completion(prompt='Hi')
print(response)
Chat (Streaming Enabled)
for chunk in api_client.chat_completion(prompt='Hi', stream=True):
print(chunk, end='', flush=True)
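Since the streaming call yields plain text chunks, assembling the full reply is just a join. A minimal sketch that works on any iterable of string chunks, such as the generator above:

```python
def collect_stream(chunks):
    """Concatenate streamed text chunks into the complete reply."""
    return "".join(chunks)

# With a real client you would pass the generator directly:
# full_reply = collect_stream(api_client.chat_completion(prompt='Hi', stream=True))
```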
Chat (Multi-turn Mode)
For multi-turn mode, construct the prompt as a list of messages carrying the chat history, for example:
prompt = [
{"role": "system", "content": "You are a helpful assistant"},
{"role": "user", "content": "What is the capital of China?"},
{"role": "assistant", "content": "The capital of China is Beijing."},
{"role": "user", "content": "What is the capital of the United States?"}
]
for chunk in api_client.chat_completion(prompt=prompt, stream=True):
print(chunk, end='', flush=True)
Another multi-turn chat example can be found in Deskpai Image Chat.
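To keep a conversation going across requests, append each user question and assistant reply to the history before the next call. A minimal sketch, assuming chat_completion accepts the message list as shown above (add_exchange is a hypothetical helper, not part of the library):

```python
def add_exchange(history, user_msg, assistant_msg):
    """Return a new history list extended with one user/assistant turn."""
    return history + [
        {"role": "user", "content": user_msg},
        {"role": "assistant", "content": assistant_msg},
    ]

history = [{"role": "system", "content": "You are a helpful assistant"}]
history = add_exchange(
    history,
    "What is the capital of China?",
    "The capital of China is Beijing.",
)
# The next request carries the whole history plus the new question:
prompt = history + [
    {"role": "user", "content": "What is the capital of the United States?"}
]
# response = api_client.chat_completion(prompt=prompt)
```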
Fill-In-the-Middle (Streaming Disabled)
response = api_client.fim_completion(prompt='Hi', max_tokens=64)
print(response)
Fill-In-the-Middle (Streaming Enabled)
for chunk in api_client.fim_completion(prompt='Once upon a time, ', stream=True):
print(chunk, end='', flush=True)
Customized Model Inference Parameters
use_case = 'Creative Writing'
kwargs = {'max_tokens': 7680, 'temperature': TEMPERATURE_MAP[use_case]}
api_client.chat_completion(prompt='Hi', **kwargs)
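Note that TEMPERATURE_MAP is not part of the library; it is a user-defined lookup table. One possible version, with values modeled on DeepSeek's published per-use-case temperature recommendations (verify against the current API documentation before relying on them):

```python
# User-defined mapping from use case to sampling temperature.
# Values follow DeepSeek's recommendations at the time of writing.
TEMPERATURE_MAP = {
    "Coding / Math": 0.0,
    "Data Analysis": 1.0,
    "General Conversation": 1.3,
    "Translation": 1.3,
    "Creative Writing": 1.5,
}

use_case = "Creative Writing"
kwargs = {"max_tokens": 7680, "temperature": TEMPERATURE_MAP[use_case]}
# api_client.chat_completion(prompt="Hi", **kwargs)
```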
🔗 Contact
Maintained by deskpai.com, 2025.
Contact: dev@deskpai.com
File details
Details for the file deepseek-1.0.0-py3-none-any.whl.
File metadata
- Download URL: deepseek-1.0.0-py3-none-any.whl
- Upload date:
- Size: 4.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.4.2 requests/2.31.0 setuptools/45.2.0 requests-toolbelt/0.8.0 tqdm/4.66.5 CPython/3.8.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ee4175bfcb7ac1154369dbd86a4d8bc1809f6fa20e3e7baa362544567197cb3f |
| MD5 | ae45ea3c92224ba3c5d7fed78dbd6175 |
| BLAKE2b-256 | 047bbede06edf1a25a6ab06553b15f6abf8e912848dfa5f68514720d3e388550 |