DeepSeek Python Client
A feature-rich Python client for interacting with DeepSeek's powerful language models, supporting both synchronous and asynchronous operations.
Installation
```bash
pip install deepseek-sdk
```
Quick Start
```python
from deepseek import DeepSeekClient

# Initialize client
client = DeepSeekClient(api_key="your-api-key")

# Basic chat completion
response = client.chat_completion(
    messages=[{"role": "user", "content": "Hello, how are you?"}]
)
print(response.choices[0].message.content)
```
Features
✅ Synchronous & Async Support
🚀 Streaming Responses
🔧 Customizable Parameters
🛠 Error Handling
⚡️ Context Manager Support
🔁 Retry Mechanisms
Complete Usage Guide
1. Basic Chat Completion
```python
response = client.chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Explain quantum computing in simple terms"}
    ],
    model="deepseek-chat",  # default model
    temperature=0.8,        # control randomness (0-2)
    max_tokens=500          # limit response length
)
```
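The `messages` parameter is an ordered list of role/content dicts. The small stand-alone helper below (not part of the SDK) sketches the shape the API expects; the set of valid roles here is an assumption based on the examples in this guide:

```python
from typing import Dict, List

VALID_ROLES = {"system", "user", "assistant"}

def validate_messages(messages: List[Dict[str, str]]) -> None:
    """Raise ValueError if any message lacks a valid role or string content."""
    if not messages:
        raise ValueError("messages must not be empty")
    for i, msg in enumerate(messages):
        if msg.get("role") not in VALID_ROLES:
            raise ValueError(f"message {i}: invalid role {msg.get('role')!r}")
        if not isinstance(msg.get("content"), str):
            raise ValueError(f"message {i}: content must be a string")

validate_messages([
    {"role": "system", "content": "You are a helpful assistant"},
    {"role": "user", "content": "Explain quantum computing in simple terms"},
])  # passes silently
```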
2. Streaming Responses
Synchronous Streaming:
```python
for chunk in client.stream_response(
    messages=[{"role": "user", "content": "Tell me a story about AI"}]
):
    content = chunk.choices[0].delta.content or ""
    print(content, end="", flush=True)
```
Async Streaming:
```python
import asyncio

async def stream_response():
    async for chunk in client.async_stream_response(
        messages=[{"role": "user", "content": "Explain blockchain technology"}]
    ):
        content = chunk.choices[0].delta.content or ""
        print(content, end="", flush=True)

# Run in the event loop
asyncio.run(stream_response())
```
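In both streaming modes, each chunk carries only a small delta of text, and the full reply is the concatenation of those deltas. A minimal self-contained sketch of that reassembly, using stand-in dataclasses for the SDK's real chunk type (which exposes `choices[0].delta.content`):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Delta:
    content: Optional[str]

@dataclass
class Choice:
    delta: Delta

@dataclass
class Chunk:
    choices: List[Choice]

def collect_stream(chunks) -> str:
    """Join the delta content of each chunk, treating None deltas as empty."""
    return "".join(chunk.choices[0].delta.content or "" for chunk in chunks)

chunks = [
    Chunk([Choice(Delta("Hel"))]),
    Chunk([Choice(Delta("lo"))]),
    Chunk([Choice(Delta(None))]),  # final chunks often carry no content
]
print(collect_stream(chunks))  # Hello
```

This is why the examples above write `chunk.choices[0].delta.content or ""`: the final chunk's delta can be `None`.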
3. Advanced Configuration
Custom Client Initialization:
```python
client = DeepSeekClient(
    api_key="your-api-key",
    base_url="https://api.deepseek.com",  # Custom endpoint
    default_model="deepseek-reasoner"     # Set default model
)
```
Context Manager for Streams:
```python
with client.stream_response(
    messages=[{"role": "user", "content": "Generate Python code for quicksort"}]
) as stream:
    for chunk in stream:
        print(chunk.choices[0].delta.content or "", end="")
```
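The point of the context manager is that the underlying connection is released even if you stop iterating early or an error occurs. This illustrative sketch (not the SDK's implementation) shows the pattern with a fake connection:

```python
from contextlib import contextmanager

class FakeConnection:
    """Stand-in for an HTTP streaming connection."""
    def __init__(self):
        self.closed = False
    def chunks(self):
        yield from ["def quicksort", "(arr): ..."]
    def close(self):
        self.closed = True

@contextmanager
def stream_response(conn):
    try:
        yield conn.chunks()
    finally:
        conn.close()  # always runs, even on early exit or error

conn = FakeConnection()
with stream_response(conn) as stream:
    first = next(stream)  # consumer stops after the first chunk
print(conn.closed)  # True
```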
4. Error Handling
```python
from deepseek import DeepSeekAPIError, DeepSeekError  # import path may vary by version

try:
    response = client.chat_completion(messages=[{"role": "user", "content": "Hello"}])
except DeepSeekAPIError as e:
    print(f"API Error: {e}")
except DeepSeekError as e:
    print(f"Client Error: {e}")
```
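The order of the `except` clauses matters: the specific API error must be caught before the base class, or the first handler would swallow everything. A hypothetical sketch of the hierarchy such a client might use (the class names come from the snippet above; the structure is an assumption):

```python
class DeepSeekError(Exception):
    """Base class for all client errors."""

class DeepSeekAPIError(DeepSeekError):
    """Raised when the API returns an error response."""

def classify(exc):
    # Catch the more specific subclass first, as in the snippet above.
    try:
        raise exc
    except DeepSeekAPIError:
        return "api"
    except DeepSeekError:
        return "client"

print(classify(DeepSeekAPIError("bad request")))  # api
print(classify(DeepSeekError("timeout")))         # client
```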
5. Advanced Parameters
Using All Available Options:
```python
response = client.chat_completion(
    messages=[{"role": "user", "content": "Compare Python and JavaScript"}],
    model="deepseek-chat",
    temperature=1.2,
    top_p=0.9,
    max_tokens=1000,
    presence_penalty=0.5,
    frequency_penalty=0.5,
    stream=False
)
```
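It can help to sanity-check sampling parameters before sending a request. The ranges below are assumptions following common chat-API conventions (temperature 0-2 matches the comment earlier in this guide; the penalty ranges of -2 to 2 are borrowed from OpenAI-style APIs and may differ here):

```python
def check_sampling_params(temperature=1.0, top_p=1.0,
                          presence_penalty=0.0, frequency_penalty=0.0):
    """Raise ValueError for any value outside its assumed documented range."""
    ranges = {
        "temperature": (temperature, 0.0, 2.0),
        "top_p": (top_p, 0.0, 1.0),
        "presence_penalty": (presence_penalty, -2.0, 2.0),
        "frequency_penalty": (frequency_penalty, -2.0, 2.0),
    }
    for name, (value, low, high) in ranges.items():
        if not low <= value <= high:
            raise ValueError(f"{name}={value} is outside [{low}, {high}]")

check_sampling_params(temperature=1.2, top_p=0.9,
                      presence_penalty=0.5, frequency_penalty=0.5)  # all valid
```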
6. Retry Mechanism
```python
from tenacity import retry, stop_after_attempt, wait_exponential

class RobustClient(DeepSeekClient):
    @retry(stop=stop_after_attempt(3), wait=wait_exponential(multiplier=1, min=4, max=10))
    def chat_completion(self, *args, **kwargs):
        return super().chat_completion(*args, **kwargs)

# Usage
client = RobustClient(api_key="your-api-key")
```
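The `wait_exponential(multiplier=1, min=4, max=10)` policy doubles the wait on each attempt and clamps it to the `[min, max]` window. A stdlib-only sketch of that schedule (modeled approximately on tenacity's behavior, where the attempt counter starts at 1):

```python
def backoff_seconds(attempt, multiplier=1, min_wait=4, max_wait=10):
    """Exponential wait, clamped: multiplier * 2**attempt,
    never below min_wait or above max_wait."""
    return max(min_wait, min(max_wait, multiplier * 2 ** attempt))

print([backoff_seconds(n) for n in range(1, 5)])  # [4, 4, 8, 10]
```

Clamping keeps the first retries from firing too quickly while capping the total delay once the exponential curve takes off.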
Inspiration: What Can You Build?
🤖 Intelligent Chatbot
```python
history = []
while True:
    user_input = input("You: ")
    if user_input.lower() == 'exit':
        break
    history.append({"role": "user", "content": user_input})
    response = client.chat_completion(messages=history, temperature=0.9)
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})  # keep context across turns
    print(f"AI: {reply}")
```
📝 Content Generation System
```python
def generate_blog_post(topic: str) -> str:
    prompt = f"Write a 500-word blog post about {topic} with markdown formatting:"
    response = client.chat_completion(
        messages=[{"role": "user", "content": prompt}],
        temperature=0.7,
        max_tokens=1500
    )
    return response.choices[0].message.content
```
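Why `max_tokens=1500` for a 500-word post? A common rule of thumb (an approximation, not an exact count) is about 1.3 tokens per English word, so the budget leaves generous headroom:

```python
def estimated_tokens(word_count, tokens_per_word=1.3):
    """Rough token estimate; ~1.3 tokens per English word is a common rule of thumb."""
    return int(word_count * tokens_per_word)

print(estimated_tokens(500))  # 650, well under the max_tokens=1500 budget above
```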
💻 CLI Interface (using Click)
```python
import click

from deepseek import DeepSeekClient

@click.command()
@click.option('--api-key', required=True, help='Your DeepSeek API key')
@click.option('--prompt', required=True, help='Your query/prompt')
@click.option('--stream', is_flag=True, help='Enable streaming')
def deepseek_cli(api_key, prompt, stream):
    client = DeepSeekClient(api_key=api_key)
    if stream:
        for chunk in client.stream_response([{"role": "user", "content": prompt}]):
            click.echo(chunk.choices[0].delta.content or "", nl=False)
    else:
        response = client.chat_completion([{"role": "user", "content": prompt}])
        click.echo(response.choices[0].message.content)

if __name__ == '__main__':
    deepseek_cli()
```
Contributing
- Fork the repository
- Create your feature branch (`git checkout -b feature/awesome-feature`)
- Commit your changes (`git commit -am 'Add awesome feature'`)
- Push to the branch (`git push origin feature/awesome-feature`)
- Open a Pull Request
License
MIT License - See LICENSE for details
Support
For issues and feature requests, please open an issue
🚀 Pro Tip: Combine with other libraries like rich for beautiful console output, or fastapi to create AI-powered web services!
File details
Details for the file deepseek_sdk-0.1.1.tar.gz.
File metadata
- Download URL: deepseek_sdk-0.1.1.tar.gz
- Upload date:
- Size: 5.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.9
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `0a6a2c29967de619b5856b7c41fe202b13351307d82e1aef72e53567ad8f964a` |
| MD5 | `690228f27f02fcdcda9e7226f29f76b9` |
| BLAKE2b-256 | `a7d50193445c7859f922593e2830f8434036608fe3aab60e2dfd3af10e57eb42` |
File details
Details for the file deepseek_sdk-0.1.1-py3-none-any.whl.
File metadata
- Download URL: deepseek_sdk-0.1.1-py3-none-any.whl
- Upload date:
- Size: 5.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.9
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `6743194aa9d995c1be5ddfedbc47f0f349379a65885e538b9ca06cdd65364983` |
| MD5 | `4175da3561cb8f8c2139f4f1e4e85e5e` |
| BLAKE2b-256 | `b37ebcf01e9087273275276d39b90bd6cb1a2a822b8dc2aefee8c56ce88a3b62` |