Stardog LangChain Integration

LangChain integration for Stardog Voicebox - enabling natural language querying over your enterprise data using LangChain runnables and tools.

Features

  • LangChain Tools: Ready-to-use tools for LangChain agents
  • LCEL Runnables: Composable runnables for building chains
  • Async & Sync: Full support for both async and synchronous operations


Requirements

  • Python 3.10+
  • A Stardog Cloud account with a Voicebox application
  • Voicebox API token (see below for how to obtain one)
  • uv, a Python package manager (used for development and contributions)

Installation

pip install langchain-stardog

Quick Start

Setup Environment Variables

The simplest way to get started is to set your API token as an environment variable:

export SD_VOICEBOX_API_TOKEN="your-voicebox-api-token"

Getting Your API Token:

  1. Log in to Stardog Cloud
  2. Click on your profile icon and select Manage API Keys.
  3. Create a new application and generate a secret.
  4. Copy the API token and keep it secure.
  5. For more details, see Stardog Voicebox API access.

Optional Environment Variables:

export SD_VOICEBOX_CLIENT_ID="my-app"                      # Client identifier (default: VBX-LANGCHAIN)
export SD_CLOUD_ENDPOINT="https://cloud.stardog.com/api"  # Custom endpoint (optional)
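Presumably the client resolves these variables with the usual pattern: the token is required, the others fall back to the documented defaults. A minimal sketch of that resolution logic (the resolver function itself is hypothetical, not part of the library; variable names and defaults are taken from the list above):

```python
import os

# Hypothetical sketch of how configuration could be resolved from the
# environment variables documented above.
def resolve_voicebox_config() -> dict:
    token = os.environ.get("SD_VOICEBOX_API_TOKEN")
    if not token:
        raise RuntimeError("SD_VOICEBOX_API_TOKEN must be set")
    return {
        "api_token": token,
        # Optional overrides fall back to the documented defaults.
        "client_id": os.environ.get("SD_VOICEBOX_CLIENT_ID", "VBX-LANGCHAIN"),
        "endpoint": os.environ.get("SD_CLOUD_ENDPOINT", "https://cloud.stardog.com/api"),
    }
```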

Basic Usage with Tools

Tools are designed for agent workflows and automatically load credentials from environment variables:

from langchain_stardog.voicebox import VoiceboxAskTool

# Tools automatically load credentials from SD_VOICEBOX_API_TOKEN
ask_tool = VoiceboxAskTool()

# Ask a question (await requires an async context, e.g. a notebook or coroutine)
result = await ask_tool._arun(question="What flights are delayed?")
print(result["answer"])

Note: Tools only support environment variable initialization to ensure consistent, secure configuration in agent workflows.
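Because the snippet above uses await, it must run inside an async context; in a plain script you can drive it with asyncio.run. A runnable sketch with a stand-in coroutine in place of ask_tool._arun (so it executes without credentials):

```python
import asyncio

# Stand-in coroutine; in real use this would be ask_tool._arun(question=...).
async def ask(question: str) -> dict:
    return {"answer": f"(stub answer for: {question})"}

async def main() -> str:
    result = await ask("What flights are delayed?")
    return result["answer"]

if __name__ == "__main__":
    print(asyncio.run(main()))
```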

Using Runnables in LCEL Chains

Runnables support two initialization patterns:

Pattern 1: Auto-load from Environment (Simple)

from langchain_core.runnables import RunnablePassthrough
from langchain_stardog.voicebox import VoiceboxAskRunnable

# Automatically loads from SD_VOICEBOX_API_TOKEN
chain = (
    RunnablePassthrough()
    | VoiceboxAskRunnable()
    | (lambda x: f"Answer: {x['answer']}")
)

result = await chain.ainvoke({"question": "Show me airports in Texas"})

Pattern 2: Explicit Client (Advanced)

from langchain_core.runnables import RunnablePassthrough
from langchain_stardog.voicebox import VoiceboxClient, VoiceboxAskRunnable

# Create client for custom configuration
client = VoiceboxClient(
    api_token="your-token",
    client_id="my-app"
)

chain = (
    RunnablePassthrough()
    | VoiceboxAskRunnable(client=client)
    | (lambda x: f"Answer: {x['answer']}")
)

result = await chain.ainvoke({"question": "Show me airports in Texas"})
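If LCEL's pipe operator is unfamiliar: `|` composes runnables left to right, each step receiving the previous step's output. A self-contained sketch of that behavior using plain functions (Pipe is a toy stand-in to illustrate the composition, not part of LangChain or this library):

```python
class Pipe:
    """Toy left-to-right composition mimicking LCEL's `|` operator."""
    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):
        # Accept either another Pipe or a bare callable on the right.
        other_fn = other.fn if isinstance(other, Pipe) else other
        return Pipe(lambda x: other_fn(self.fn(x)))

    def invoke(self, x):
        return self.fn(x)

# Mirrors the chain above: passthrough | "ask" step | format the answer.
chain = (
    Pipe(lambda x: x)
    | (lambda x: {"answer": f"(stub answer to: {x['question']})"})
    | (lambda x: f"Answer: {x['answer']}")
)
result = chain.invoke({"question": "Show me airports in Texas"})
```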

Class Reference

The library provides the following main classes:

Runnables (for LCEL chains):

  • VoiceboxSettingsRunnable - Retrieve app settings
  • VoiceboxAskRunnable - Ask questions and get answers
  • VoiceboxGenerateQueryRunnable - Generate SPARQL queries

Tools (for agent integration):

  • VoiceboxSettingsTool
  • VoiceboxAskTool
  • VoiceboxGenerateQueryTool

Client:

  • VoiceboxClient - Core client for Stardog Voicebox API

VoiceboxAskRunnable

Initialization:

# From environment variables (simple)
runnable = VoiceboxAskRunnable()

# With explicit client (advanced)
client = VoiceboxClient.from_env()
runnable = VoiceboxAskRunnable(client=client)

Usage:

# Async
result = await runnable.ainvoke({
    "question": "Your question here",
    "conversation_id": "optional-conv-id"
})

# Sync
result = runnable.invoke({
    "question": "Your question here",
    "conversation_id": "optional-conv-id"
})

Output:

{
    "answer": "Natural language answer",
    "sparql_query": "SELECT ...",
    "interpreted_question": "How Voicebox understood it",
    "conversation_id": "conv-123",
    "message_id": "msg-456"
}
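The result is a plain dict, so downstream chain steps can pick out whichever fields they need. A small formatter over the fields listed above (the helper is illustrative, not part of the library; the sample dict repeats the schema shown):

```python
def format_result(result: dict) -> str:
    """Render a Voicebox answer dict for display, including SPARQL if present."""
    lines = [
        f"Interpreted as: {result['interpreted_question']}",
        f"Answer: {result['answer']}",
    ]
    if result.get("sparql_query"):
        lines.append(f"SPARQL: {result['sparql_query']}")
    return "\n".join(lines)

sample = {
    "answer": "Natural language answer",
    "sparql_query": "SELECT ...",
    "interpreted_question": "How Voicebox understood it",
    "conversation_id": "conv-123",
    "message_id": "msg-456",
}
print(format_result(sample))
```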

VoiceboxAskTool

Initialization:

Tools only support environment variable initialization for consistent, secure agent workflows:

# Automatically loads from SD_VOICEBOX_API_TOKEN
tool = VoiceboxAskTool()

Usage:

# Async
result = await tool._arun(
    question="Your question",
    conversation_id="optional"  # Optional: for multi-turn conversations
)

# Sync
result = tool._run(
    question="Your question",
    conversation_id="optional"
)

VoiceboxClient

Initialization:

# From environment variables (recommended)
client = VoiceboxClient.from_env()

# With custom configuration
client = VoiceboxClient.from_env(
    client_id="my-app",           # Optional: override default client ID
    endpoint="custom-endpoint"     # Optional: custom API endpoint
)

# Direct initialization (advanced)
client = VoiceboxClient(
    api_token="your-token",           # Required: Voicebox API token
    client_id="my-app",                # Optional: Client identifier (default: VBX-LANGCHAIN)
    endpoint="https://...",            # Optional: API endpoint
    auth_token_override="sso-token"    # Optional: SSO auth token
)

Methods:

  • async_get_settings() / get_settings() - Get Voicebox app settings
  • async_ask(question, conversation_id=None) / ask(...) - Ask a question
  • async_generate_query(question, conversation_id=None) / generate_query(...) - Generate SPARQL query

Examples

See the examples/ directory in the repository for basic usage examples.

Development

Setup

Clone the repository and install dependencies:

git clone https://github.com/stardog-union/voicebox-langchain-integration.git
cd voicebox-langchain-integration
make install-dev

Common development commands:

# Run tests
make test

# Run tests with coverage
make test-cov

# Format code
make format

# Type checking
make type-check

# Linting
make lint

# Run all CI checks (format, type-check, lint, test)
make ci

Contributing

Contributions are welcome! Please:

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes, add tests, and run make test to verify them
  4. Run make ci to verify all static code quality checks pass
  5. Submit a pull request

Download files

Source Distribution

  • langchain_stardog-0.1.0.tar.gz (178.5 kB)

Built Distribution

  • langchain_stardog-0.1.0-py3-none-any.whl (11.7 kB)

File details

langchain_stardog-0.1.0.tar.gz

  • Size: 178.5 kB
  • Type: Source
  • Uploaded via: twine/6.2.0 CPython/3.12.9

Hashes:

  • SHA256: 641638030719b6f9b3d83f6fb61b45b19b68bf5ee84e4926a11de6eaf71a8fe2
  • MD5: 8a3e8cc61f066a2032e2bc7fd39f0517
  • BLAKE2b-256: 1f9bf7a61c17378f57682cbc0264a1e1f23ef12d5ea5a2ae1a2e0be695bd2513

langchain_stardog-0.1.0-py3-none-any.whl

  • Size: 11.7 kB

Hashes:

  • SHA256: 9cd3c92c7d51428a1c4094a94351823b445ab04c424488dc5777419cda399b2d
  • MD5: 9eb049700317e12a8f40f47ecfaf2ad6
  • BLAKE2b-256: bbed40dbe61f7697206d8cb1b9fa249a818a921926312027cdbd6aaa2fba9fcd
