Stardog LangChain Integration
LangChain integration for Stardog Voicebox - enabling natural language querying over your enterprise data using LangChain runnables and tools.
Features
- LangChain Tools: Ready-to-use tools for LangChain agents
- LCEL Runnables: Composable runnables for building chains
- Async & Sync: Full support for both async and synchronous operations
Table of Contents
- Requirements
- Installation
- Quick Start
- Class Reference
- Examples
- Development
- Contributing
- General Support
Requirements
- Python 3.10+
- A Stardog Cloud account with a Voicebox application
- A Voicebox API token (see below for how to obtain one)
- uv: a Python package manager, required for development and contributions
Installation
pip install langchain-stardog
Quick Start
Setup Environment Variables
The simplest way to get started is to set your API token as an environment variable:
export SD_VOICEBOX_API_TOKEN="your-voicebox-api-token"
Getting Your API Token:
- Log in to Stardog Cloud
- Click on your profile icon and select Manage API Keys.
- Create a new application and generate a secret.
- Copy the API token and keep it secure.
- For more details, see Stardog Voicebox API access.
Optional Environment Variables:
export SD_VOICEBOX_CLIENT_ID="my-app" # Client identifier (default: VBX-LANGCHAIN)
export SD_CLOUD_ENDPOINT="https://cloud.stardog.com/api" # Custom endpoint (optional)
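As a quick sanity check, the documented variables can be read from the environment with their stated defaults. The helper below is illustrative only (it is not part of langchain-stardog); the variable names and defaults come from this README:

```python
import os

def load_voicebox_config(env=os.environ):
    """Collect the Voicebox settings documented above.

    Variable names and defaults follow the README; this helper itself
    is a sketch, not the library's actual configuration code.
    """
    return {
        "api_token": env.get("SD_VOICEBOX_API_TOKEN"),                 # required
        "client_id": env.get("SD_VOICEBOX_CLIENT_ID", "VBX-LANGCHAIN"),
        "endpoint": env.get("SD_CLOUD_ENDPOINT", "https://cloud.stardog.com/api"),
    }

# Pass an explicit dict for testing; defaults fill in the optional values.
config = load_voicebox_config({"SD_VOICEBOX_API_TOKEN": "example-token"})
```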
Basic Usage with Tools
Tools are designed for agent workflows and automatically load credentials from environment variables:
from langchain_stardog.voicebox import VoiceboxAskTool
# Tools automatically load credentials from SD_VOICEBOX_API_TOKEN
ask_tool = VoiceboxAskTool()
# Ask a question
result = await ask_tool._arun(question="What flights are delayed?")
print(result["answer"])
Note: Tools only support environment variable initialization to ensure consistent, secure configuration in agent workflows.
Using Runnables in LCEL Chains
Runnables support two initialization patterns:
Pattern 1: Auto-load from Environment (Simple)
from langchain_core.runnables import RunnablePassthrough
from langchain_stardog.voicebox import VoiceboxAskRunnable
# Automatically loads from SD_VOICEBOX_API_TOKEN
chain = (
RunnablePassthrough()
| VoiceboxAskRunnable()
| (lambda x: f"Answer: {x['answer']}")
)
result = await chain.ainvoke({"question": "Show me airports in Texas"})
Pattern 2: Explicit Client (Advanced)
from langchain_core.runnables import RunnablePassthrough
from langchain_stardog.voicebox import VoiceboxClient, VoiceboxAskRunnable
# Create client for custom configuration
client = VoiceboxClient(
api_token="your-token",
client_id="my-app"
)
chain = (
RunnablePassthrough()
| VoiceboxAskRunnable(client=client)
| (lambda x: f"Answer: {x['answer']}")
)
result = await chain.ainvoke({"question": "Show me airports in Texas"})
Class Reference
The library provides the following main classes:
Runnables (for LCEL chains):
- VoiceboxSettingsRunnable - Retrieve app settings
- VoiceboxAskRunnable - Ask questions and get answers
- VoiceboxGenerateQueryRunnable - Generate SPARQL queries
Tools (for agent integration):
- VoiceboxSettingsTool
- VoiceboxAskTool
- VoiceboxGenerateQueryTool
Client:
- VoiceboxClient - Core client for the Stardog Voicebox API
VoiceboxAskRunnable
Initialization:
# From environment variables (simple)
runnable = VoiceboxAskRunnable()
# With explicit client (advanced)
client = VoiceboxClient.from_env()
runnable = VoiceboxAskRunnable(client=client)
Usage:
# Async
result = await runnable.ainvoke({
"question": "Your question here",
"conversation_id": "optional-conv-id"
})
# Sync
result = runnable.invoke({
"question": "Your question here",
"conversation_id": "optional-conv-id"
})
Output:
{
"answer": "Natural language answer",
"sparql_query": "SELECT ...",
"interpreted_question": "How Voicebox understood it",
"conversation_id": "conv-123",
"message_id": "msg-456"
}
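For illustration, a result with the shape above can be post-processed like any dictionary. The sample values and the summarize helper below are hypothetical; only the keys come from the documented output:

```python
# A result matching the documented output shape (values are made up).
result = {
    "answer": "Three flights are delayed.",
    "sparql_query": "SELECT ?flight WHERE { ... }",
    "interpreted_question": "Which flights are currently delayed?",
    "conversation_id": "conv-123",
    "message_id": "msg-456",
}

def summarize(result):
    """Format the documented fields for display (illustrative helper)."""
    return (
        f"Q (as understood): {result['interpreted_question']}\n"
        f"A: {result['answer']}\n"
        f"SPARQL: {result['sparql_query']}"
    )

summary = summarize(result)
```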
VoiceboxAskTool
Initialization:
Tools only support environment variable initialization for consistent, secure agent workflows:
# Automatically loads from SD_VOICEBOX_API_TOKEN
tool = VoiceboxAskTool()
Usage:
# Async
result = await tool._arun(
question="Your question",
conversation_id="optional" # Optional: for multi-turn conversations
)
# Sync
result = tool._run(
question="Your question",
conversation_id="optional"
)
VoiceboxClient
Initialization:
# From environment variables (recommended)
client = VoiceboxClient.from_env()
# With custom configuration
client = VoiceboxClient.from_env(
client_id="my-app", # Optional: override default client ID
endpoint="custom-endpoint" # Optional: custom API endpoint
)
# Direct initialization (advanced)
client = VoiceboxClient(
api_token="your-token", # Required: Voicebox API token
client_id="my-app", # Optional: Client identifier (default: VBX-LANGCHAIN)
endpoint="https://...", # Optional: API endpoint
auth_token_override="sso-token" # Optional: SSO auth token
)
Methods:
- async_get_settings() / get_settings() - Get Voicebox app settings
- async_ask(question, conversation_id=None) / ask(...) - Ask a question
- async_generate_query(question, conversation_id=None) / generate_query(...) - Generate a SPARQL query
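The paired naming suggests each synchronous method mirrors an async counterpart. A minimal sketch of that pattern, using a stub in place of VoiceboxClient (the real implementation may differ):

```python
import asyncio

class StubClient:
    """Stand-in for VoiceboxClient, showing one way the documented
    sync/async method pairs could relate. Illustrative only."""

    async def async_ask(self, question, conversation_id=None):
        # A real client would call the Voicebox API here.
        return {
            "answer": f"stub answer to: {question}",
            "conversation_id": conversation_id or "conv-new",
        }

    def ask(self, question, conversation_id=None):
        # The sync variant simply drives the async one to completion.
        return asyncio.run(self.async_ask(question, conversation_id))

client = StubClient()
result = client.ask("What flights are delayed?")
```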
Examples
Check out the examples/ directory for basic examples of how to use the library:
- direct_tool_usage.py - Direct tool usage and multi-turn conversations
- agent_integration.py - Agent integration using AWS Bedrock
- runnable_chains.py - LCEL chains with runnables
Development
Setup
Clone the repository and install dependencies:
git clone https://github.com/stardog-union/voicebox-langchain-integration.git
cd voicebox-langchain-integration
make install-dev
Common development commands:
# Run tests
make test
# Run tests with coverage
make test-cov
# Format code
make format
# Type checking
make type-check
# Linting
make lint
# Run all CI checks (format, type-check, lint, test)
make ci
Contributing
Contributions are welcome! Please:
- Fork the repository
- Create a feature branch
- Make your changes, add tests, and run make test to verify
- Run make ci to verify all static code quality checks pass
- Submit a pull request
General Support
- Documentation: Stardog Documentation
- Community: Stardog Community