
EmergentDB Python SDK

Official Python SDK for EmergentDB — a managed vector database for embeddings.

Install

pip install emergentdb

Quick Start

from emergentdb import EmergentDB

db = EmergentDB("emdb_your_api_key")

# Insert a vector
db.insert(1, [0.1, 0.2, ...], metadata={"title": "My document"})

# Search
results = db.search([0.1, 0.2, ...], k=5, include_metadata=True)

# Delete
db.delete(1)

API

EmergentDB(api_key, base_url?, timeout?)

Create a client. API key must start with emdb_.

Param     Type    Default
base_url  str     https://api.emergentdb.com
timeout   float   30.0
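Since the client rejects keys that don't start with emdb_, a quick client-side sanity check can catch a misconfigured key before any request is made. The helper below is hypothetical, not part of the SDK:

```python
# Hypothetical sanity check mirroring the emdb_ prefix rule; not SDK code.
def is_valid_key_format(api_key: str) -> bool:
    """Return True if the key has the emdb_ prefix plus a non-empty body."""
    return api_key.startswith("emdb_") and len(api_key) > len("emdb_")
```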

Supports context manager:

with EmergentDB("emdb_your_key") as db:
    db.insert(1, vector)

db.insert(id, vector, metadata?, namespace?)

Insert a single vector. Re-inserting the same ID in the same namespace upserts it.

result = db.insert(1, embedding, metadata={"title": "Doc"}, namespace="production")
# InsertResult(success=True, id=1, namespace="production", upserted=False)

db.batch_insert(vectors, namespace?)

Insert up to 1,000 vectors in one call.

result = db.batch_insert([
    {"id": 1, "vector": [...], "metadata": {"title": "Doc 1"}},
    {"id": 2, "vector": [...], "metadata": {"title": "Doc 2"}},
], namespace="production")
# BatchInsertResult(success=True, ids=[1, 2], count=2, new_count=2, upserted_count=0)

db.batch_insert_all(vectors, namespace?)

Insert any number of vectors — auto-chunks into batches of 1,000.

result = db.batch_insert_all(large_vector_list, namespace="production")
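The auto-chunking presumably works like the sketch below; chunked() is an illustrative helper, not part of the SDK:

```python
# Illustrative client-side chunking: split any number of vectors into
# slices of at most 1,000, matching the batch_insert limit.
def chunked(items, size=1000):
    """Yield successive slices of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

vectors = [{"id": i, "vector": [0.0, 0.1]} for i in range(2500)]
batches = list(chunked(vectors))
# 2,500 vectors split into batches of 1000, 1000, and 500
```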

db.search(vector, k?, include_metadata?, namespace?)

Search for similar vectors.

Param             Type   Default
k                 int    10
include_metadata  bool   False
namespace         str    "default"

results = db.search(query_vector, k=10, include_metadata=True, namespace="production")

for r in results.results:
    print(f"{r.id}: {r.score} {r.metadata.get('title')}")

Scores are distances — lower = more similar.
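Because lower distance means a closer match, rank ascending and threshold from below. The (id, score) pairs here are made up for illustration:

```python
# Hypothetical (id, score) pairs; scores are distances, so lower is better.
hits = [(7, 0.48), (3, 0.12), (1, 0.95)]

# Rank by similarity: ascending distance puts the best match first.
ranked = sorted(hits, key=lambda pair: pair[1])

# Keep only reasonably close matches with a distance cutoff.
close = [(i, s) for i, s in ranked if s < 0.5]
```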

db.delete(id, namespace?)

Delete a vector by ID.

result = db.delete(1, namespace="production")
# DeleteResult(deleted=True, id=1, namespace="production")

db.list_namespaces()

List all namespaces that have vectors.

namespaces = db.list_namespaces()
# ["default", "production", "staging"]

Namespaces

Namespaces partition your vectors into isolated groups. Created automatically on first insert.

# Insert into different namespaces
db.insert(1, vec, metadata={"title": "Prod doc"}, namespace="production")
db.insert(1, vec, metadata={"title": "Dev doc"}, namespace="development")

# Search is scoped to one namespace
prod = db.search(q, namespace="production")
dev = db.search(q, namespace="development")

Vector IDs are unique per namespace — ID 1 in "production" and ID 1 in "development" are completely separate vectors.

With OpenAI Embeddings

import openai
from emergentdb import EmergentDB

client = openai.OpenAI()
db = EmergentDB("emdb_your_key")

# Generate embedding
resp = client.embeddings.create(
    model="text-embedding-3-small",
    input="How do neural networks learn?"
)

# Store it
db.insert(1, resp.data[0].embedding, metadata={
    "title": "Neural Networks 101",
    "tags": ["ml", "neural-networks"],
})

# Search later
query_resp = client.embeddings.create(
    model="text-embedding-3-small",
    input="What is backpropagation?"
)
results = db.search(query_resp.data[0].embedding, k=5, include_metadata=True)

for r in results.results:
    print(f"{r.score:.4f} {r.metadata.get('title', 'untitled')}")

Error Handling

from emergentdb import EmergentDB, EmergentDBError

try:
    db.insert(1, vector)
except EmergentDBError as e:
    print(e.status_code)  # 400, 401, 402, etc.
    print(e.body)         # Full error response

Status  Meaning
400     Invalid request
401     Bad or missing API key
402     Vector capacity exceeded
404     Vector not found
500     Server error
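A common pattern is to retry only on 5xx server errors while failing fast on 4xx, since a bad key or invalid request won't fix itself. The sketch below uses a stand-in exception class (per the docs, EmergentDBError exposes .status_code the same way), so treat it as an outline rather than SDK code:

```python
import time

class StubDBError(Exception):
    """Stand-in for EmergentDBError; exposes .status_code like the SDK."""
    def __init__(self, status_code):
        super().__init__(f"HTTP {status_code}")
        self.status_code = status_code

def with_retries(op, attempts=3, delay=0.0):
    """Call op(); retry on 5xx server errors, re-raise 4xx immediately."""
    for attempt in range(attempts):
        try:
            return op()
        except StubDBError as e:
            # A 4xx means the request itself is wrong; retrying won't help.
            if e.status_code < 500 or attempt == attempts - 1:
                raise
            time.sleep(delay)
```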

Response Models

All response types are dhi BaseModel classes (Pydantic v2-compatible):

from emergentdb import (
    InsertResult,
    BatchInsertResult,
    SearchResult,
    SearchResponse,
    DeleteResult,
)

# Use like Pydantic models
result = db.insert(1, vector)
print(result.model_dump())
print(result.model_dump_json())

Requirements

  • Python >= 3.8
  • httpx >= 0.24.0
  • dhi >= 1.1.3

QDKV — Metadata Cache

Every EmergentDB account includes 10K QDKV keys free — same emdb_ API key, no new signup.

QDKV is a SIMD-accelerated key-value cache (Redis alternative) for session state, feature flags, rate counters, and anything that needs sub-millisecond reads at the edge.

HTTP API

# SET
curl -X POST https://api.emergentdb.com/qdkv/set \
  -H "Authorization: Bearer emdb_YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{ "key": "session:abc", "value": "{\"userId\":42}", "ttlMs": 3600000 }'

# GET
curl https://api.emergentdb.com/qdkv/get/session:abc \
  -H "Authorization: Bearer emdb_YOUR_API_KEY"
# → { "value": "{\"userId\":42}", "found": true }

# DEL
curl -X DELETE https://api.emergentdb.com/qdkv/del/session:abc \
  -H "Authorization: Bearer emdb_YOUR_API_KEY"

# MGET (batch)
curl -X POST https://api.emergentdb.com/qdkv/mget \
  -H "Authorization: Bearer emdb_YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{ "keys": ["session:abc", "session:xyz"] }'
# → { "values": { "session:abc": "{...}", "session:xyz": null } }

# Stats
curl https://api.emergentdb.com/qdkv/stats \
  -H "Authorization: Bearer emdb_YOUR_API_KEY"
# → { "keyCount": 42, "maxKeys": 10000, "plan": "free", "percentUsed": 0 }
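Judging from the sample response, percentUsed appears to be keyCount over maxKeys as a rounded integer percentage (an assumption, not documented behavior):

```python
# Assumed derivation of percentUsed from the stats sample above.
key_count, max_keys = 42, 10_000
percent_used = round(key_count / max_keys * 100)
# 42 of 10,000 keys is 0.42%, which rounds to 0, matching the sample
```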

In Python (httpx)

import httpx
import json

KEY = "emdb_YOUR_API_KEY"
BASE = "https://api.emergentdb.com"
headers = {"Authorization": f"Bearer {KEY}", "Content-Type": "application/json"}

with httpx.Client() as client:
    # SET
    client.post(f"{BASE}/qdkv/set", headers=headers, json={
        "key": "user:42:prefs",
        "value": json.dumps({"theme": "dark"}),
        "ttlMs": 86_400_000,  # 24 hours
    })

    # GET
    r = client.get(f"{BASE}/qdkv/get/user:42:prefs", headers=headers).json()
    if r["found"]:
        prefs = json.loads(r["value"])

    # MGET
    r = client.post(f"{BASE}/qdkv/mget", headers=headers,
                    json={"keys": ["user:42:prefs", "user:99:prefs"]}).json()
    # r["values"] → { "user:42:prefs": "{...}", "user:99:prefs": None }

    # DEL
    client.delete(f"{BASE}/qdkv/del/user:42:prefs", headers=headers)

Pricing

Plan                 Max Keys    Price
Free (all accounts)  10,000      $0/mo
Launch               1,000,000   $29/mo
Scale                10,000,000  $99/mo

License

MIT


