Lumo SDK
A Python SDK for the Lumo API by Starlight. This SDK provides a simple and intuitive interface for executing tasks using AI models and tools.
Installation
```bash
pip install lumo-sdk
```
Quick Start
```python
from lumo_sdk import LumoClient

# Initialize the client with your API key
client = LumoClient(api_key="your-api-key-here")

# Run a task
response = client.run_task(
    task="What is the weather in Berlin?",
    model="gpt-4.1-mini",
    base_url="https://api.openai.com/v1/chat/completions",
    tools=["ExaSearchTool", "VisitWebsite"]
)

# Access the final answer
print(response.final_answer)

# Access the steps taken
for step in response.steps:
    print(f"LLM Output: {step.llm_output}")
    print(f"Tool Calls: {step.tool_calls}")
```
Features
- ✅ Simple and intuitive API
- ✅ Full type hints for better IDE support
- ✅ Pydantic models for request/response validation
- ✅ Comprehensive error handling
- ✅ Support for conversation history
- ✅ Support for custom tools and agent types
API Reference
LumoClient
The main client class for interacting with the Lumo API.
Initialization
```python
client = LumoClient(api_key="your-api-key", base_url=None)
```

- `api_key` (required): Your Lumo API key
- `base_url` (optional): Custom base URL (defaults to `https://api.starlight-search.com`)
Methods
stream_task()
Stream task execution and receive events in real-time.
```python
for event in client.stream_task(
    task: str,
    model: str,
    base_url: str,
    tools: Optional[List[str]] = None,
    max_steps: Optional[int] = None,
    agent_type: Optional[str] = None,
    history: Optional[List[Message]] = None,
) -> Iterator[Union[StreamTokenEvent, StreamStepEvent, StreamDoneEvent]]
```
Parameters: Same as run_task()
Yields:
- `StreamTokenEvent`: Individual tokens from the LLM response (has a `content` field)
- `StreamStepEvent`: Step information with tool calls and output (has a `step` dict)
- `StreamDoneEvent`: Signals completion of the task
Example:
```python
from lumo_sdk import StreamTokenEvent, StreamStepEvent

for event in client.stream_task(
    task="What is the weather?",
    model="gpt-4.1-mini",
    base_url="https://api.openai.com/v1/chat/completions",
    tools=["ExaSearchTool"]
):
    if isinstance(event, StreamTokenEvent):
        print(event.content, end="", flush=True)
    elif isinstance(event, StreamStepEvent):
        print(f"\nStep: {event.step}")
```
run_task()
Execute a task using the specified model and tools.
```python
response = client.run_task(
    task: str,
    model: str,
    base_url: str,
    tools: Optional[List[str]] = None,
    max_steps: Optional[int] = None,
    agent_type: Optional[str] = None,
    history: Optional[List[Message]] = None,
) -> RunTaskResponse
```
Parameters:
- `task` (required): The task to execute
- `model` (required): Model ID (e.g., `gpt-4`, `qwen2.5`, `gemini-2.0-flash`)
- `base_url` (required): Base URL for the upstream API
- `tools` (optional): List of tool names to make available
- `max_steps` (optional): Maximum number of steps to take
- `agent_type` (optional): Type of agent to use (`function-calling` or `mcp`)
- `history` (optional): List of `Message` objects containing prior conversation context
Returns: RunTaskResponse object with:
- `final_answer`: The final answer from the task execution
- `steps`: List of steps taken during task execution
Streaming
The SDK also supports streaming responses for real-time token delivery:
```python
from lumo_sdk import LumoClient, StreamTokenEvent, StreamStepEvent, StreamDoneEvent

client = LumoClient(api_key="your-api-key")

for event in client.stream_task(
    task="What is the weather in Berlin?",
    model="gpt-4.1-mini",
    base_url="https://api.openai.com/v1/chat/completions",
    tools=["ExaSearchTool", "VisitWebsite"]
):
    if isinstance(event, StreamTokenEvent):
        # Print tokens as they arrive
        print(event.content, end="", flush=True)
    elif isinstance(event, StreamStepEvent):
        # Handle step with tool calls
        step = event.step
        tool_calls = step.get("tool_calls", [])
        for tool_call in tool_calls:
            print(f"\nTool: {tool_call.get('name')}")
    elif isinstance(event, StreamDoneEvent):
        print("\n✅ Done!")
        break
```
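When you want the complete answer as well as live output, the token events can be accumulated as they arrive. The sketch below uses stand-in event classes (hypothetical, mirroring the shapes of `StreamTokenEvent` and `StreamDoneEvent` above) to show the accumulation pattern independent of the client:

```python
class StreamTokenEvent:
    # Stand-in mirroring the SDK event: carries one token of LLM output.
    def __init__(self, content):
        self.content = content

class StreamDoneEvent:
    # Stand-in mirroring the SDK event: signals that the task has finished.
    pass

def collect_answer(events):
    """Accumulate token events into the full answer string."""
    parts = []
    for event in events:
        if isinstance(event, StreamTokenEvent):
            parts.append(event.content)
        elif isinstance(event, StreamDoneEvent):
            break
    return "".join(parts)

# With the real SDK, `events` would be client.stream_task(...).
events = [StreamTokenEvent("It is "), StreamTokenEvent("sunny."), StreamDoneEvent()]
print(collect_answer(events))  # It is sunny.
```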
Examples
Basic Usage
```python
from lumo_sdk import LumoClient

client = LumoClient(api_key="your-api-key")

response = client.run_task(
    task="What is the weather in Berlin?",
    model="gpt-4.1-mini",
    base_url="https://api.openai.com/v1/chat/completions",
    tools=["ExaSearchTool", "VisitWebsite"]
)

print(response.final_answer)
```
With Conversation History
```python
from lumo_sdk import LumoClient, Message, MessageRole

client = LumoClient(api_key="your-api-key")

# Create conversation history
history = [
    Message(role=MessageRole.USER, content="Hello!"),
    Message(role=MessageRole.ASSISTANT, content="Hi! How can I help you?"),
]

response = client.run_task(
    task="What did I just say?",
    model="gpt-4.1-mini",
    base_url="https://api.openai.com/v1/chat/completions",
    history=history
)

print(response.final_answer)
```
With Max Steps
```python
from lumo_sdk import LumoClient

client = LumoClient(api_key="your-api-key")

response = client.run_task(
    task="Research the latest AI developments",
    model="gpt-4.1-mini",
    base_url="https://api.openai.com/v1/chat/completions",
    tools=["ExaSearchTool"],
    max_steps=5
)

print(f"Final answer: {response.final_answer}")
print(f"Number of steps: {len(response.steps)}")
```
Error Handling
```python
from lumo_sdk import LumoClient
from lumo_sdk.exceptions import (
    LumoAuthenticationError,
    LumoRateLimitError,
    LumoServerError,
    LumoAPIError,
)

client = LumoClient(api_key="your-api-key")

try:
    response = client.run_task(
        task="What is the weather?",
        model="gpt-4.1-mini",
        base_url="https://api.openai.com/v1/chat/completions"
    )
except LumoAuthenticationError:
    print("Invalid API key")
except LumoRateLimitError:
    print("Rate limit exceeded. Please try again later.")
except LumoServerError:
    print("Server error. Please try again later.")
except LumoAPIError as e:
    print(f"API error: {e}")
```
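Transient failures such as rate limits are often worth retrying with backoff. The helper below is a generic sketch, not part of the SDK; in real use you would pass something like `lambda: client.run_task(...)` as `fn` and `LumoRateLimitError` as the retryable exception. The demonstration uses a fake flaky call so the pattern is self-contained:

```python
import time

def call_with_retry(fn, retryable, attempts=3, base_delay=1.0):
    """Call fn(), retrying on `retryable` with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except retryable:
            if attempt == attempts - 1:
                raise  # Out of attempts: propagate the last error
            time.sleep(base_delay * (2 ** attempt))

# Demonstration with a hypothetical stand-in for a rate-limited call.
class FakeRateLimit(Exception):
    pass

calls = {"n": 0}

def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise FakeRateLimit()
    return "ok"

result = call_with_retry(flaky, FakeRateLimit, base_delay=0.01)
print(result)  # ok
```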
Models
Message
Represents a message in the conversation history.
```python
from lumo_sdk import Message, MessageRole

message = Message(
    role=MessageRole.USER,
    content="Hello!",
    tool_call_id=None,  # Optional
    tool_calls=None     # Optional
)
```
MessageRole
Enum for message roles:
- `MessageRole.USER`
- `MessageRole.ASSISTANT`
- `MessageRole.SYSTEM`
- `MessageRole.TOOL`
RunTaskResponse
Response from the run_task() method.
```python
response = client.run_task(...)
print(response.final_answer)  # str
print(response.steps)         # List[Step]
```
Step
Represents a step in the task execution.
```python
for step in response.steps:
    print(step.llm_output)  # Optional[str]
    print(step.tool_calls)  # List[ToolCall]
```
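Since each step exposes its tool calls, it is easy to summarize which tools were used across a run. The sketch below uses hypothetical stand-in classes mirroring the `Step`/`ToolCall` shapes described above, so the counting logic runs on its own; with the real SDK you would pass `response.steps` instead:

```python
from collections import Counter
from dataclasses import dataclass, field
from typing import List, Optional

# Stand-in classes (hypothetical) mirroring the Step/ToolCall shapes.
@dataclass
class ToolCall:
    name: str

@dataclass
class Step:
    llm_output: Optional[str] = None
    tool_calls: List[ToolCall] = field(default_factory=list)

def tool_usage(steps):
    """Count how many times each tool was invoked across all steps."""
    return Counter(call.name for step in steps for call in step.tool_calls)

steps = [
    Step(llm_output="searching...", tool_calls=[ToolCall("ExaSearchTool")]),
    Step(llm_output="visiting...", tool_calls=[ToolCall("VisitWebsite"), ToolCall("ExaSearchTool")]),
]
usage = tool_usage(steps)
print(usage)  # Counter({'ExaSearchTool': 2, 'VisitWebsite': 1})
```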
Error Handling
The SDK provides specific exception types for different error scenarios:
- `LumoError`: Base exception for all SDK errors
- `LumoAPIError`: General API errors (400, etc.)
- `LumoAuthenticationError`: Authentication errors (401)
- `LumoRateLimitError`: Rate limit errors (429)
- `LumoServerError`: Server errors (500+)
Requirements
- Python 3.8+
- requests >= 2.28.0
- pydantic >= 2.0.0
Development
Setup
```bash
# Clone the repository
git clone https://github.com/starlight-search/lumo-sdk.git
cd lumo-sdk

# Install in development mode
pip install -e ".[dev]"
```
Running Tests
```bash
pytest
```
Code Formatting
```bash
black lumo_sdk tests
ruff check lumo_sdk tests
```
License
MIT License - see LICENSE file for details.
Building and Publishing to PyPI
Prerequisites
```bash
pip install build twine
```
Build the Package
```bash
python -m build
```
This creates distribution files in the `dist/` directory.
Upload to PyPI
Test PyPI (for testing)
```bash
python -m twine upload --repository testpypi dist/*
```
Production PyPI
```bash
python -m twine upload dist/*
```
You'll need to have PyPI credentials configured. You can use:
- A `~/.pypirc` file
- Environment variables
- Interactive prompts
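For the `~/.pypirc` route, one common layout uses API tokens for both indexes (the token values below are placeholders; substitute your own):

```ini
[distutils]
index-servers =
    pypi
    testpypi

[pypi]
username = __token__
password = pypi-<your-pypi-api-token>

[testpypi]
repository = https://test.pypi.org/legacy/
username = __token__
password = pypi-<your-testpypi-api-token>
```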
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.