
Stardog LangChain Integration


LangChain integration for Stardog Voicebox, enabling natural language querying over your enterprise data using LangChain runnables and tools.

Features

  • LangChain Tools: Ready-to-use tools for LangChain agents
  • LCEL Runnables: Composable runnables for building chains
  • Async & Sync: Full support for both async and synchronous operations

Requirements

  • Python 3.10+
  • A Stardog Cloud account with a Voicebox application
  • A Voicebox API token (how to obtain one is explained below)
  • uv, a Python package manager, for development and contributions

Installation

pip install langchain-stardog

Quick Start

Setup Environment Variables

The simplest way to get started is to set your API token as an environment variable:

export SD_VOICEBOX_API_TOKEN="your-voicebox-api-token"

Getting Your API Token:

  1. Log in to Stardog Cloud.
  2. Click on your profile icon and select Manage API Keys.
  3. Create a new application and generate a secret.
  4. Copy the API token and keep it secure.

For more details, see Stardog Voicebox API access.

Optional Environment Variables:

export SD_VOICEBOX_CLIENT_ID="my-app"                      # Client identifier (default: VBX-LANGCHAIN)
export SD_CLOUD_ENDPOINT="https://cloud.stardog.com/api"  # Custom endpoint (optional)
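As an illustration of how these variables resolve, here is a small sketch. The helper name `resolve_voicebox_config` is hypothetical, not part of the library; only the variable names and the `VBX-LANGCHAIN` default come from this README.

```python
import os

def resolve_voicebox_config() -> dict:
    """Hypothetical helper: resolve the documented environment variables
    with the defaults stated in this README."""
    token = os.environ.get("SD_VOICEBOX_API_TOKEN")
    if not token:
        raise RuntimeError("SD_VOICEBOX_API_TOKEN is required")
    return {
        "api_token": token,
        "client_id": os.environ.get("SD_VOICEBOX_CLIENT_ID", "VBX-LANGCHAIN"),
        # None unless explicitly overridden; the library's own default applies
        "endpoint": os.environ.get("SD_CLOUD_ENDPOINT"),
    }
```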

Basic Usage with Tools

Tools are designed for agent workflows and automatically load credentials from environment variables:

import asyncio

from langchain_stardog.voicebox import VoiceboxAskTool

# Tools automatically load credentials from SD_VOICEBOX_API_TOKEN
ask_tool = VoiceboxAskTool()

# Ask a question
result = asyncio.run(ask_tool._arun(question="What flights are delayed?"))
print(result["answer"])

Note: Tools only support environment variable initialization to ensure consistent, secure configuration in agent workflows.

Using Runnables in LCEL Chains

Runnables support two initialization patterns:

Pattern 1: Auto-load from Environment (Simple)

import asyncio

from langchain_core.runnables import RunnablePassthrough
from langchain_stardog.voicebox import VoiceboxAskRunnable

# Automatically loads from SD_VOICEBOX_API_TOKEN
chain = (
    RunnablePassthrough()
    | VoiceboxAskRunnable()
    | (lambda x: f"Answer: {x['answer']}")
)

result = asyncio.run(chain.ainvoke({"question": "Show me airports in Texas"}))

Pattern 2: Explicit Client (Advanced)

import asyncio

from langchain_core.runnables import RunnablePassthrough
from langchain_stardog.voicebox import VoiceboxClient, VoiceboxAskRunnable

# Create a client for custom configuration
client = VoiceboxClient(
    api_token="your-token",
    client_id="my-app"
)

chain = (
    RunnablePassthrough()
    | VoiceboxAskRunnable(client=client)
    | (lambda x: f"Answer: {x['answer']}")
)

result = asyncio.run(chain.ainvoke({"question": "Show me airports in Texas"}))
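The `|` operator in these chains comes from LCEL's Runnable protocol. As a library-free sketch of the underlying idea, a minimal pipeline of callables chained with `__or__` might look like this; `Step` and `fake_ask` are invented stand-ins, not LangChain or Stardog APIs:

```python
class Step:
    """Minimal stand-in for an LCEL runnable: wraps a callable and
    supports `|` to build a left-to-right pipeline."""

    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):
        # Accept plain callables on the right-hand side, like LCEL does
        other = other if isinstance(other, Step) else Step(other)
        return Step(lambda x: other.fn(self.fn(x)))

    def invoke(self, x):
        return self.fn(x)

# A fake "ask" step standing in for VoiceboxAskRunnable
fake_ask = Step(lambda d: {"answer": f"42 airports match {d['question']!r}"})

chain = Step(lambda x: x) | fake_ask | (lambda x: f"Answer: {x['answer']}")
print(chain.invoke({"question": "airports in Texas"}))
# prints: Answer: 42 airports match 'airports in Texas'
```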

Class Reference

The library provides the following main classes:

Runnables (for LCEL chains):

  • VoiceboxSettingsRunnable - Retrieve app settings
  • VoiceboxAskRunnable - Ask questions and get answers
  • VoiceboxGenerateQueryRunnable - Generate SPARQL queries

Tools (for agent integration):

  • VoiceboxSettingsTool
  • VoiceboxAskTool
  • VoiceboxGenerateQueryTool

Client:

  • VoiceboxClient - Core client for Stardog Voicebox API

VoiceboxAskRunnable

Initialization:

# From environment variables (simple)
runnable = VoiceboxAskRunnable()

# With explicit client (advanced)
client = VoiceboxClient.from_env()
runnable = VoiceboxAskRunnable(client=client)

Usage:

# Async
result = await runnable.ainvoke({
    "question": "Your question here",
    "conversation_id": "optional-conv-id"
})

# Sync
result = runnable.invoke({
    "question": "Your question here",
    "conversation_id": "optional-conv-id"
})

Output:

{
    "answer": "Natural language answer",
    "sparql_query": "SELECT ...",
    "interpreted_question": "How Voicebox understood it",
    "conversation_id": "conv-123",
    "message_id": "msg-456"
}
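Given the documented output shape, a small formatting helper can pull out the fields you typically display. The helper itself is illustrative, not part of langchain-stardog; only the key names come from the output above.

```python
def summarize_result(result: dict) -> str:
    """Render the documented Voicebox result fields for display.
    Illustrative helper, not part of langchain-stardog."""
    lines = [f"Answer: {result['answer']}"]
    if result.get("sparql_query"):
        lines.append(f"SPARQL: {result['sparql_query']}")
    if result.get("conversation_id"):
        lines.append(f"Conversation: {result['conversation_id']}")
    return "\n".join(lines)

# Sample result shaped like the documented output
sample = {
    "answer": "There are 27 delayed flights.",
    "sparql_query": "SELECT ?flight WHERE { ... }",
    "conversation_id": "conv-123",
}
print(summarize_result(sample))
```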

VoiceboxAskTool

Initialization:

Tools only support environment variable initialization for consistent, secure agent workflows:

# Automatically loads from SD_VOICEBOX_API_TOKEN
tool = VoiceboxAskTool()

Usage:

# Async
result = await tool._arun(
    question="Your question",
    conversation_id="optional"  # Optional: for multi-turn conversations
)

# Sync
result = tool._run(
    question="Your question",
    conversation_id="optional"
)

VoiceboxClient

Initialization:

# From environment variables (recommended)
client = VoiceboxClient.from_env()

# With custom configuration
client = VoiceboxClient.from_env(
    client_id="my-app",           # Optional: override default client ID
    endpoint="custom-endpoint"     # Optional: custom API endpoint
)

# Direct initialization (advanced)
client = VoiceboxClient(
    api_token="your-token",           # Required: Voicebox API token
    client_id="my-app",                # Optional: Client identifier (default: VBX-LANGCHAIN)
    endpoint="https://...",            # Optional: API endpoint
    auth_token_override="sso-token"    # Optional: SSO auth token
)

Methods:

  • async_get_settings() / get_settings() - Get Voicebox app settings
  • async_ask(question, conversation_id=None) / ask(...) - Ask a question
  • async_generate_query(question, conversation_id=None) / generate_query(...) - Generate SPARQL query

Examples

Check out the examples/ directory for basic examples of how to use the library.

Development

Setup

Clone the repository and install dependencies:

git clone https://github.com/stardog-union/stardog-langchain.git
cd stardog-langchain
make install-dev

Common development commands:

# Run tests
make test

# Run tests with coverage
make test-cov

# Format code
make format

# Type checking
make type-check

# Linting
make lint

# Run all CI checks (format, type-check, lint, test)
make ci

Contributing

Contributions are welcome! Please:

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes, add tests, and run make test to verify
  4. Run make ci to verify all static code quality checks pass
  5. Submit a pull request

General Support
