# NocturnusAI Python SDK

Python SDK for NocturnusAI — a logic-based inference engine and knowledge database for AI agents.
## Install

```shell
pip install nocturnusai
pip install "nocturnusai[langchain]"  # optional LangChain tools
```
## Start the server

```shell
# Recommended — installs CLI, configures LLM provider, starts server
curl -fsSL https://raw.githubusercontent.com/Auctalis/nocturnusai/main/install.sh | bash
nocturnusai setup

# Or bare Docker (logic engine works; context extraction requires LLM config — see below)
docker run -p 9300:9300 ghcr.io/auctalis/nocturnusai:latest
```

Verify it's running:

```shell
curl http://localhost:9300/health
```
## Quick start — facts, rules, and inference

This works immediately with a plain server — no LLM configuration needed.

```python
from nocturnusai import SyncNocturnusAIClient

with SyncNocturnusAIClient("http://localhost:9300") as client:
    # Assert facts
    client.assert_fact("parent", ["alice", "bob"])
    client.assert_fact("parent", ["bob", "charlie"])

    # Teach a rule: grandparent(?x, ?z) :- parent(?x, ?y), parent(?y, ?z)
    client.assert_rule(
        head={"predicate": "grandparent", "args": ["?x", "?z"]},
        body=[
            {"predicate": "parent", "args": ["?x", "?y"]},
            {"predicate": "parent", "args": ["?y", "?z"]},
        ],
    )

    # Infer — the engine finds alice is grandparent of charlie
    results = client.infer("grandparent", ["?who", "charlie"])
    for atom in results:
        print(f"{atom.predicate}({', '.join(atom.args)})")
    # grandparent(alice, charlie)

    # Query — exact pattern match without inference
    parents = client.query("parent", ["?x", "?y"])
    print(f"{len(parents)} parent facts")  # 2 parent facts

    # Retract a fact
    client.retract("parent", ["bob", "charlie"])
```
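To build intuition for what the engine derives from that rule, here is a minimal pure-Python forward-chaining sketch. This is an illustration only — not the SDK and not necessarily the engine's actual algorithm:

```python
# Derive grandparent facts by joining parent facts on the shared variable ?y,
# mirroring the rule grandparent(?x, ?z) :- parent(?x, ?y), parent(?y, ?z).
facts = {("parent", "alice", "bob"), ("parent", "bob", "charlie")}

def apply_grandparent_rule(facts):
    """One forward-chaining pass: return grandparent facts derivable from parent facts."""
    parents = [(x, y) for (pred, x, y) in facts if pred == "parent"]
    derived = set()
    for (x, y1) in parents:
        for (y2, z) in parents:
            if y1 == y2:  # the join on ?y
                derived.add(("grandparent", x, z))
    return derived

print(apply_grandparent_rule(facts))
# {('grandparent', 'alice', 'charlie')}
```

The real engine generalizes this idea to arbitrary rules and variables; the sketch only shows why asserting two `parent` facts makes `grandparent(alice, charlie)` inferable.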
## Context window — salience-ranked retrieval

Use the context methods to retrieve the most relevant facts for an agent's working memory. This works with any facts you've asserted — no LLM needed.

```python
from nocturnusai import SyncNocturnusAIClient

with SyncNocturnusAIClient("http://localhost:9300") as client:
    client.assert_fact("customer_tier", ["acme_corp", "enterprise"])
    client.assert_fact("contract_value", ["acme_corp", "2000000"])
    client.assert_fact("issue", ["acme_corp", "sla_credits_blocked"])

    # Simple salience-ranked window
    ctx = client.context(max_facts=10)
    for f in ctx.facts:
        print(f"{f.predicate}({', '.join(f.args)})")

    # Goal-driven optimization — backward chaining narrows to relevant facts
    ctx = client.context(
        max_facts=10,
        goals=[{"predicate": "customer_tier", "args": ["acme_corp", "?tier"]}],
        session_id="ticket-42",
    )
    print(f"Window has {ctx.window_size} facts out of {ctx.total_available} available")
```
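Conceptually, a salience-ranked window just keeps the top-N facts by a relevance score. The actual scoring lives server-side; the toy scores below are invented purely to illustrate the selection step:

```python
# Illustration only: pick the max_facts highest-salience facts, best first.
# The salience values here are made up; the real model is computed by the server.
def context_window(scored_facts, max_facts):
    """Return the max_facts facts with the highest salience, descending."""
    ranked = sorted(scored_facts, key=lambda item: item[1], reverse=True)
    return [fact for fact, salience in ranked[:max_facts]]

scored = [
    ("customer_tier(acme_corp, enterprise)", 0.9),
    ("contract_value(acme_corp, 2000000)", 0.7),
    ("issue(acme_corp, sla_credits_blocked)", 0.8),
]
print(context_window(scored, max_facts=2))
# ['customer_tier(acme_corp, enterprise)', 'issue(acme_corp, sla_credits_blocked)']
```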
## Context reduction from raw text

**Requires LLM.** The context extraction workflow uses an LLM to convert raw text into structured facts. You must configure an LLM provider, or this will time out or return empty results.

The simplest path: run `nocturnusai setup` — it auto-configures Ollama. Or set env vars manually:

- Ollama (local): `LLM_BASE_URL=http://host.docker.internal:11434/v1` + `LLM_MODEL=granite3.3:8b`
- OpenAI: `OPENAI_API_KEY=sk-...`
- Anthropic: `ANTHROPIC_API_KEY=sk-ant-...`

```shell
# Recommended: use the setup wizard (auto-detects Ollama)
nocturnusai setup

# Or manual Docker with Ollama:
docker run -p 9300:9300 \
  -e LLM_BASE_URL=http://host.docker.internal:11434/v1 \
  -e LLM_MODEL=granite3.3:8b \
  ghcr.io/auctalis/nocturnusai:latest
```
```python
from nocturnusai import SyncNocturnusAIClient

with SyncNocturnusAIClient("http://localhost:9300") as client:
    # Extract facts from raw text, assert them, and get an optimized context window
    ctx = client.ingest_and_optimize(
        text="""
        user: Customer says they are enterprise and blocked on SLA credits.
        tool: CRM says account is Acme Corp with a 2M ARR contract.
        tool: Billing note says renewal is due next month.
        """,
        max_facts=12,
        session_id="ticket-42",
    )
    print(f"Extracted {ctx.total_facts_included} facts from raw text")
    for entry in ctx.entries:
        print(f"  {entry.predicate}({', '.join(entry.args)})")

    # On the next turn, get only what changed (incremental diff)
    diff = client.diff_context(session_id="ticket-42", max_facts=12)
    print(f"Diff: +{len(diff.added)} / -{len(diff.removed)}")

    # Clean up the session when the conversation ends
    client.clear_context_session("ticket-42")
```
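The incremental diff boils down to a set difference between the previous context window and the new one. A plain-Python sketch of that idea (not the SDK, and the server may track sessions differently):

```python
# Illustration: what an incremental context diff computes between two turns.
def diff_windows(previous, current):
    """Return (added, removed) facts between two context windows, sorted."""
    prev, curr = set(previous), set(current)
    return sorted(curr - prev), sorted(prev - curr)

turn1 = ["issue(acme, sla_credits_blocked)", "customer_tier(acme, enterprise)"]
turn2 = ["customer_tier(acme, enterprise)", "renewal_due(acme, next_month)"]

added, removed = diff_windows(turn1, turn2)
print(f"Diff: +{len(added)} / -{len(removed)}")  # Diff: +1 / -1
```

Sending only the diff on each turn keeps an agent's prompt small: unchanged facts are already in its context, so only `added` and `removed` need to be communicated.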
## Key context methods

| Method | Endpoint | LLM required? |
|---|---|---|
| `context()` | `POST /memory/context` | No |
| `diff_context()` | `POST /context/diff` | No |
| `summarize_context()` | `POST /context/summary` | No |
| `clear_context_session()` | `POST /context/session/clear` | No |
| `ingest_and_optimize()` | extract + optimize | Yes |
## Named databases

Use the `database` parameter to isolate data. Call `ensure_database()` to create the database and its default tenant on the server.

```python
from nocturnusai import SyncNocturnusAIClient

with SyncNocturnusAIClient("http://localhost:9300", database="my-project") as client:
    client.ensure_database()
    client.assert_fact("status", ["ready"])
```
## LangChain integration

```python
from nocturnusai import SyncNocturnusAIClient
from nocturnusai.langchain import get_nocturnusai_tools

client = SyncNocturnusAIClient("http://localhost:9300")
tools = get_nocturnusai_tools(client)
```
## MCP helper

```python
import asyncio

from nocturnusai.mcp import NocturnusAIMCPClient

async def main():
    async with NocturnusAIMCPClient("http://localhost:9300") as mcp:
        await mcp.initialize()
        tools = await mcp.list_tools()
        result = await mcp.call_tool("context", {"maxFacts": 10, "minSalience": 0.1})
        print(result.text)

asyncio.run(main())
```