

docklee

AI context infrastructure SDK for Python. Company knowledge + persistent memory for any AI agent.

Install

pip install docklee

Quick Start

import asyncio
from docklee import Docklee

async def main():
    async with Docklee(api_key="dk_live_xxxx") as client:

        # Query a knowledge engine — grounded answers with citations
        answer = await client.knowledge.query("eng_xxxx", "What is our refund policy?")
        print(answer.answer)
        print(answer.confidence)

        # Retrieve chunks for your own LLM
        chunks = await client.knowledge.retrieve("eng_xxxx", "pricing tiers")
        for chunk in chunks:
            print(chunk.content)

        # Write to memory
        await client.memory.write("space_xxxx", "User prefers dark mode")

        # Search memory
        results = await client.memory.search("space_xxxx", "user preferences")
        for r in results:
            print(r.content)

        # Unified context — knowledge engine (KE) + memory (DUM) in one call
        context = await client.context.assemble(
            "eng_xxxx",
            "What is the pricing for 50 seats?",
            memory_space_id="space_xxxx",
        )
        print(context.answer)
        print(f"Memory used: {len(context.memory_context)} records")

asyncio.run(main())
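The retrieve call hands back raw chunks for your own LLM, so stitching them into a prompt is up to you. A minimal sketch of that assembly, using a stand-in Chunk dataclass (the real objects come from client.knowledge.retrieve; the prompt wording is an assumption):

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    content: str

def build_prompt(chunks, question):
    """Join retrieved chunk text into a single grounded prompt for any LLM."""
    context = "\n\n".join(c.content for c in chunks)
    return f"Answer using only this context:\n\n{context}\n\nQuestion: {question}"

prompt = build_prompt(
    [Chunk("Pro tier: $20/seat."), Chunk("Refunds within 30 days.")],
    "What is the pricing?",
)
```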

OpenAI Wrapper — 2 lines to add Docklee to any existing app

from openai import AsyncOpenAI
from docklee.providers import withDocklee

client = withDocklee(
    AsyncOpenAI(api_key="sk-xxxx"),
    docklee_key="dk_live_xxxx",
    engine_id="eng_xxxx",         # company knowledge
    memory_space_id="space_xxxx", # user memory
)

# All existing OpenAI code works unchanged (run inside an async context)
response = await client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What is our refund policy?"}],
)
# Answer is grounded in your company knowledge + user memory injected automatically
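Conceptually, such a wrapper intercepts create() and prepends grounded context as a system message before forwarding the call. A simplified, dependency-free sketch of that pattern (not Docklee's actual implementation):

```python
class WithContext:
    """Sketch: intercept create() and prepend fetched context as a system message."""

    def __init__(self, chat, fetch_context):
        self._chat = chat            # e.g. client.chat.completions
        self._fetch = fetch_context  # e.g. a knowledge + memory lookup

    def create(self, *, model, messages, **kwargs):
        context = self._fetch(messages[-1]["content"])
        grounded = [{"role": "system", "content": context}] + messages
        return self._chat.create(model=model, messages=grounded, **kwargs)

# Demonstrate with a stub in place of the real OpenAI client:
class StubCompletions:
    def create(self, *, model, messages, **kwargs):
        return {"model": model, "messages": messages}

wrapped = WithContext(StubCompletions(), lambda q: f"[context for: {q}]")
resp = wrapped.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What is our refund policy?"}],
)
```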

Universal Tool Definition

from docklee.providers import DockleeTools

tools = DockleeTools(
    docklee_key="dk_live_xxxx",
    engine_id="eng_xxxx",
    memory_space_id="space_xxxx",
)

# Works with any LLM
tools.for_openai()     # OpenAI function calling format
tools.for_anthropic()  # Anthropic tool use format
tools.for_gemini()     # Gemini function calling format
tools.for_any()        # Generic format

# Handle tool calls
result = await tools.handle_tool_call("docklee_search_knowledge", {"query": "refund policy"})
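A typical dispatch loop routes every tool call the model emits back through handle_tool_call. Sketched here with a stub in place of the real DockleeTools instance and a hand-built tool-call payload (the payload shape follows the OpenAI function-calling format):

```python
import asyncio
import json

# Stub standing in for tools.handle_tool_call from the snippet above.
async def handle_tool_call(name, arguments):
    return {"tool": name, "result": f"answer for: {arguments['query']}"}

async def dispatch(tool_calls):
    """Route each tool call the model emitted through the Docklee handler."""
    results = []
    for call in tool_calls:
        args = json.loads(call["function"]["arguments"])
        results.append(await handle_tool_call(call["function"]["name"], args))
    return results

calls = [{"function": {"name": "docklee_search_knowledge",
                       "arguments": json.dumps({"query": "refund policy"})}}]
results = asyncio.run(dispatch(calls))
```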

LangChain Integration

pip install docklee[langchain]

from docklee.integrations.langchain import DockleeRetriever, DockleeMemory

retriever = DockleeRetriever(api_key="dk_live_xxxx", engine_id="eng_xxxx")
memory = DockleeMemory(api_key="dk_live_xxxx", space_id="space_xxxx")

docs = await retriever.ainvoke("What is our pricing?")
history = await memory.aload_memory_variables({"input": "pricing"})
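The retriever and memory outputs are usually merged into a single prompt-template input. A minimal sketch of that step (the key names are assumptions; match them to your own prompt template):

```python
def assemble_chain_inputs(docs, memory_vars, question):
    """Merge retrieved document text and loaded memory into one template input."""
    return {
        "context": "\n\n".join(docs),
        "history": memory_vars.get("history", ""),
        "question": question,
    }

inputs = assemble_chain_inputs(
    ["Refunds within 30 days."],
    {"history": "User asked about seats."},
    "What is our pricing?",
)
```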

LangGraph Integration

pip install docklee[langgraph]

from docklee.integrations.langgraph import docklee_knowledge_node, docklee_memory_node

# graph is an existing LangGraph StateGraph
graph.add_node("knowledge", docklee_knowledge_node(
    api_key="dk_live_xxxx",
    engine_id="eng_xxxx",
))
graph.add_node("memory", docklee_memory_node(
    api_key="dk_live_xxxx",
    space_id="space_xxxx",
))
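A LangGraph node is just a callable over the graph state that returns a partial state update, so the factories above presumably return something like this pure-Python pattern (the state keys here are assumptions):

```python
def make_knowledge_node(query_fn):
    """Return a LangGraph-style node: reads state['question'], writes state['context']."""
    def node(state: dict) -> dict:
        answer = query_fn(state["question"])
        return {"context": answer}
    return node

node = make_knowledge_node(lambda q: f"grounded answer to: {q}")
update = node({"question": "What is the pricing for 50 seats?"})
```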

Voice Agent (Pipecat)

pip install docklee[voice]

from docklee.integrations.pipecat import DockleeContextProcessor

processor = DockleeContextProcessor(
    api_key="dk_live_xxxx",
    engine_id="eng_xxxx",
    memory_space_id="space_xxxx",
)

# transcript: the current user utterance from your voice pipeline
context = await processor.get_context(transcript)
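In a voice loop you would typically fetch context once per finished user turn and fold it into the LLM prompt. A dependency-free sketch with stubs standing in for processor.get_context and the LLM call:

```python
import asyncio

async def on_user_turn(transcript, get_context, send_to_llm):
    """Fetch grounded context for the finished user turn, then call the LLM."""
    context = await get_context(transcript)
    return await send_to_llm(f"{context}\n\nUser: {transcript}")

async def demo():
    async def get_context(t):      # stands in for processor.get_context
        return f"[context for: {t}]"
    async def send_to_llm(prompt): # stands in for your LLM call
        return prompt
    return await on_user_turn("what is the refund policy", get_context, send_to_llm)

reply = asyncio.run(demo())
```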
