
Python SDK for InfuseAI - Build AI applications with your own knowledge bases


infuseai

The official Python client for InfuseAI. Use it to integrate your custom RAG apps and Knowledge Bases into any Python application, including Django, Flask, and FastAPI projects.

Installation

pip install infuseai

Usage

1. Import and Initialize

Initialize the InfuseClient with your application credentials. You can retrieve these from your App's dashboard in InfuseAI.

from infuseai import InfuseClient

client = InfuseClient(
    client_id="YOUR_CLIENT_ID",  # Your User ID
    app_id="YOUR_APP_ID",        # The specific AI App ID
    api_key="YOUR_API_KEY",      # The API Key for this App
)
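Hard-coding credentials is fine for a quick experiment, but in production you would typically read them from the environment. A minimal sketch, assuming hypothetical variable names like `INFUSEAI_CLIENT_ID` (use whatever naming your deployment prefers):

```python
import os

# Hypothetical environment variable names -- not defined by the SDK itself.
credentials = {
    "client_id": os.environ.get("INFUSEAI_CLIENT_ID", "YOUR_CLIENT_ID"),
    "app_id": os.environ.get("INFUSEAI_APP_ID", "YOUR_APP_ID"),
    "api_key": os.environ.get("INFUSEAI_API_KEY", "YOUR_API_KEY"),
}

# client = InfuseClient(**credentials)
```

This keeps secrets out of source control and lets each environment (dev, staging, production) point at a different app.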

2. Query Your App

Send prompts to your AI using the query method. The SDK manages context retrieval (RAG) and LLM inference behind the scenes.

def ask_my_assistant():
    try:
        response = client.query("What does the documentation say about deployment?")
        
        print("Answer:", response.response)
        
        # If your app uses a Knowledge Base, you can inspect the sources used:
        if response.sources:
            print("Sources Used:", [s.name for s in response.sources])
            
    except Exception as e:
        print(f"Failed to query InfuseAI: {e}")

ask_my_assistant()

3. Using with Django

# views.py
from django.http import JsonResponse
from django.views.decorators.http import require_POST
from infuseai import InfuseClient

client = InfuseClient(
    client_id="YOUR_CLIENT_ID",
    app_id="YOUR_APP_ID",
    api_key="YOUR_API_KEY",
)

@require_POST  # the view reads form data, so only accept POST requests
def chat_view(request):
    query = request.POST.get("query", "")
    response = client.query(query)
    return JsonResponse({
        "response": response.response,
        "sources": [{"name": s.name, "content": s.content} for s in response.sources],
    })

4. Using with FastAPI

from fastapi import FastAPI
from infuseai import InfuseClient

app = FastAPI()
client = InfuseClient(
    client_id="YOUR_CLIENT_ID",
    app_id="YOUR_APP_ID",
    api_key="YOUR_API_KEY",
)

@app.post("/chat")
async def chat(query: str):
    response = client.query(query)
    return {"response": response.response, "sources_count": len(response.sources)}
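Note that `query` is a synchronous call, so invoking it directly inside an `async def` endpoint blocks the event loop for the duration of the request. One way around this (a sketch, using a stand-in for the SDK call) is to offload the blocking work to a thread with `asyncio.to_thread`:

```python
import asyncio

def blocking_query(text: str) -> str:
    # Stand-in for client.query(text); any blocking call behaves the same way.
    return f"answer to: {text}"

async def handler(query: str) -> dict:
    # Run the blocking SDK call in a worker thread so the event loop stays free.
    answer = await asyncio.to_thread(blocking_query, query)
    return {"response": answer}

print(asyncio.run(handler("hello")))
```

In the FastAPI example above, you could equivalently declare the endpoint with plain `def`, in which case FastAPI runs it in its own threadpool.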

5. Context Manager Support

from infuseai import InfuseClient

with InfuseClient(
    client_id="YOUR_CLIENT_ID",
    app_id="YOUR_APP_ID",
    api_key="YOUR_API_KEY",
) as client:
    response = client.query("Tell me about your product")
    print(response.response)

API Reference

InfuseClient

Constructor

InfuseClient(client_id, app_id, api_key, base_url=None)

Parameters:

| Parameter   | Type  | Required | Description                                                   |
|-------------|-------|----------|---------------------------------------------------------------|
| `client_id` | `str` | Yes      | Your unique user identifier from the dashboard.               |
| `app_id`    | `str` | Yes      | The ID of the specific app you want to interact with.         |
| `api_key`   | `str` | Yes      | The secret API key for authentication.                        |
| `base_url`  | `str` | No       | Optional override for the API endpoint (default: Production). |

client.query(text)

Sends a prompt to the InfuseAI backend.

Signature

query(text: str) -> QueryResponse

Returns

A QueryResponse dataclass:

@dataclass
class QueryResponse:
    response: str           # The AI's generated answer
    sources: List[Source]   # List of sources used for RAG
    credits_left: float     # Remaining credits
    credits_used: float     # Credits consumed by this query
    raw: Dict[str, Any]     # Raw API response
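The `credits_used` and `credits_left` fields make it straightforward to track spend across a session. A sketch using a locally defined stand-in that mirrors the SDK's dataclass:

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class QueryResponse:  # local stand-in mirroring the SDK's dataclass
    response: str
    sources: List[Any] = field(default_factory=list)
    credits_left: float = 0.0
    credits_used: float = 0.0
    raw: Dict[str, Any] = field(default_factory=dict)

# Pretend these came back from two client.query(...) calls.
responses = [
    QueryResponse("a", credits_used=1.5, credits_left=98.5),
    QueryResponse("b", credits_used=2.0, credits_left=96.5),
]
total_spent = sum(r.credits_used for r in responses)
print(f"spent {total_spent} credits, {responses[-1].credits_left} remaining")
```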

Error Handling

The SDK provides specific exceptions for different error scenarios:

from infuseai import (
    InfuseClient,
    InfuseAuthError,
    InfuseCreditsError,
    InfuseAPIError,
    InfuseConfigError,
)

try:
    response = client.query("Hello")
except InfuseAuthError as e:
    print(f"Authentication failed: {e.message}")
except InfuseCreditsError as e:
    print(f"Insufficient credits: {e.message}")
except InfuseAPIError as e:
    print(f"API error ({e.status_code}): {e.message}")
except InfuseConfigError as e:
    print(f"Configuration error: {e.message}")
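Transient failures (timeouts, server-side errors surfaced as `InfuseAPIError`) are often worth retrying. The SDK does not document built-in retries, so here is a generic backoff sketch using a stand-in exception; in real code you would catch `InfuseAPIError` and check `e.status_code` before retrying:

```python
import time

class TransientError(Exception):
    # Stand-in for a retryable error, e.g. InfuseAPIError with a 5xx status.
    pass

def with_retries(fn, attempts=3, backoff=0.1):
    """Call fn(), retrying on TransientError with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except TransientError:
            if attempt == attempts - 1:
                raise  # out of attempts; let the caller handle it
            time.sleep(backoff * (2 ** attempt))

calls = []
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise TransientError()
    return "ok"

print(with_retries(flaky))  # succeeds on the third attempt
```

Avoid retrying `InfuseAuthError` or `InfuseCreditsError`: those indicate a problem with your credentials or account, and retrying will only consume requests.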

License

ISC
