

docklee

AI context infrastructure SDK for Python. Company knowledge + persistent memory for any AI agent.

Install

pip install docklee

Quick Start

import asyncio
from docklee import Docklee

async def main():
    async with Docklee(api_key="dk_live_xxxx") as client:

        # Query a knowledge engine
        answer = await client.knowledge.query("eng_xxxx", "What is our refund policy?")
        print(answer.answer)
        print(answer.confidence)

        # Retrieve chunks for your own LLM
        chunks = await client.knowledge.retrieve("eng_xxxx", "pricing tiers")
        for chunk in chunks:
            print(chunk.content)

        # Write to memory
        await client.memory.write("space_xxxx", "User prefers dark mode")

        # Search memory
        results = await client.memory.search("space_xxxx", "user preferences")
        for r in results:
            print(r.content)

        # Unified context: knowledge engine + memory in one call
        context = await client.context.assemble(
            "eng_xxxx",
            "What is the pricing for 50 seats?",
            memory_space_id="space_xxxx",
        )
        print(context.answer)

asyncio.run(main())
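The `retrieve` call above hands back raw chunks for your own prompt assembly. A minimal sketch of stitching them into a context block, assuming each chunk exposes a `.content` attribute as in the loop above (the helper name `format_chunks` is ours, not part of the SDK):

```python
def format_chunks(chunks, max_chars=4000):
    """Join retrieved chunk texts into one context block for an LLM prompt,
    stopping once the character budget would be exceeded."""
    parts, used = [], 0
    for chunk in chunks:
        text = chunk.content.strip()
        if used + len(text) > max_chars:
            break
        parts.append(text)
        used += len(text)
    return "\n\n---\n\n".join(parts)
```

The budget guard matters in practice: retrieval can return more text than fits your model's context window, and truncating at chunk boundaries keeps each passage intact.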

OpenAI Integration

from openai import AsyncOpenAI
from docklee.providers import withDocklee

client = withDocklee(
    AsyncOpenAI(api_key="sk-xxxx"),
    docklee_key="dk_live_xxxx",
    engine_id="eng_xxxx",          # company knowledge engine
    memory_space_id="space_xxxx",  # user memory space
    mode="precise",                # "precise" | "guide" | "explore"
    write_to_memory=True,          # auto-save conversations to memory
)

# Use normally (inside an async function); knowledge and memory are injected automatically
response = await client.chat.completions.create(
    model="gpt-4o",
    messages=[
        { "role": "system", "content": "You are a helpful assistant." },
        { "role": "user", "content": "What is our refund policy?" }
    ]
)

Universal Tool Support

from docklee.providers import DockleeTools

tools = DockleeTools(
    docklee_key="dk_live_xxxx",
    engine_id="eng_xxxx",          # knowledge engine to search
    memory_space_id="space_xxxx",  # memory space to recall from
)

# Returns tools in the format each LLM expects
tools.for_openai()     # OpenAI function calling format
tools.for_anthropic()  # Anthropic tool use format
tools.for_gemini()     # Gemini function calling format
tools.for_any()        # generic format

# Handle tool calls from the LLM
result = await tools.handle_tool_call(
    "docklee_search_knowledge",
    {"query": "refund policy"}
)

LangChain

pip install "docklee[langchain]"

from docklee.integrations.langchain import DockleeRetriever, DockleeMemory

retriever = DockleeRetriever(
    api_key="dk_live_xxxx",
    engine_id="eng_xxxx",  # replaces your existing vector store retriever
)

memory = DockleeMemory(
    api_key="dk_live_xxxx",
    space_id="space_xxxx",  # replaces your existing conversation memory
)

docs = await retriever.ainvoke("What is our pricing?")
history = await memory.aload_memory_variables({"input": "pricing"})

LangGraph

pip install "docklee[langgraph]"

from docklee.integrations.langgraph import docklee_knowledge_node, docklee_memory_node

# Add as nodes in your graph
graph.add_node("knowledge", docklee_knowledge_node(
    api_key="dk_live_xxxx",
    engine_id="eng_xxxx",     # queries the knowledge engine at this node
    output_key="knowledge",   # key written to graph state
))

graph.add_node("memory", docklee_memory_node(
    api_key="dk_live_xxxx",
    space_id="space_xxxx",    # reads and writes the memory space at this node
    write_response=True,      # save agent response to memory after each turn
))
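Under LangGraph's conventions a node is a callable that takes the graph state and returns a partial state update, and the Docklee nodes above presumably follow that contract. A pure-Python sketch of the shape (no LangGraph or Docklee imports; `fake_lookup` stands in for the knowledge query):

```python
def make_knowledge_node(lookup, output_key="knowledge"):
    """Build a graph node: read the latest user input from state,
    run the lookup, and write the result under `output_key`."""
    def node(state: dict) -> dict:
        answer = lookup(state["input"])
        return {output_key: answer}  # partial update, merged into state
    return node

def fake_lookup(query):
    return f"docs about {query}"

node = make_knowledge_node(fake_lookup)
```

Because a node only returns the keys it changed, the `output_key` parameter above mirrors how the real node presumably avoids clobbering the rest of your graph state.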

Voice Agents

pip install "docklee[voice]"

from docklee.integrations.pipecat import DockleeContextProcessor

processor = DockleeContextProcessor(
    api_key="dk_live_xxxx",
    engine_id="eng_xxxx",          # fetch knowledge before each turn
    memory_space_id="space_xxxx",  # inject user memory context
    mode="precise",                # answer mode
)

# Call before passing transcript to your LLM
context = await processor.get_context(transcript)
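What you do with the returned context is up to you; a common pattern is to prepend it as a system message before the transcript goes to the LLM. A sketch, assuming the context exposes an `.answer` string as in the Quick Start (the `build_messages` helper is ours, not part of the SDK):

```python
def build_messages(context_text, transcript,
                   base_prompt="You are a helpful voice assistant."):
    """Prepend retrieved context to the system prompt, then add the
    user's transcribed turn."""
    system = base_prompt
    if context_text:
        system += "\n\nRelevant context:\n" + context_text
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": transcript},
    ]
```

Keeping context in the system message rather than the user turn keeps the transcript clean for conversation history and memory writes.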

Download files

Download the file for your platform.

Source Distribution

docklee-1.0.1.tar.gz (7.0 kB)

Uploaded Source

Built Distribution


docklee-1.0.1-py3-none-any.whl (12.1 kB)

Uploaded Python 3

File details

Details for the file docklee-1.0.1.tar.gz.

File metadata

  • Download URL: docklee-1.0.1.tar.gz
  • Upload date:
  • Size: 7.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.7

File hashes

Hashes for docklee-1.0.1.tar.gz

  • SHA256: 15663af4cea38affc3bfcc0a1e80be8c03e25968666d0d4f9e4e97d99c89c015
  • MD5: ef57c8768715fac940f70a28b97078fb
  • BLAKE2b-256: 9b6c793ea6db1b2c07faef7653622fbcae3236908eac0019ebf4cbefc9a2dfd1
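To check a downloaded file against the SHA256 digest above, the standard library's hashlib is enough; streaming in blocks keeps memory flat for large archives:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            h.update(block)
    return h.hexdigest()

# Compare against the published digest, e.g.:
# sha256_of("docklee-1.0.1.tar.gz") == "15663af4..."
```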


File details

Details for the file docklee-1.0.1-py3-none-any.whl.

File metadata

  • Download URL: docklee-1.0.1-py3-none-any.whl
  • Upload date:
  • Size: 12.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.7

File hashes

Hashes for docklee-1.0.1-py3-none-any.whl

  • SHA256: 5c2aaa2bd9881941316685e7c00406be7641a020225e672c0b81565b644c62d3
  • MD5: d22c62f337631b2c657417d5a71e438a
  • BLAKE2b-256: 9fc7b01b76a436f1489ba4bcd18e231e4d292cd0f569116f64a3bc0e71770990

