Project description

ChatsAPI

The World's Fastest AI Agent Framework.
Based on SBERT and SpaCy Transforms, ChatsAPI is designed to enable seamless natural language processing for AI-powered conversational agents. With hybrid search capabilities and an extensible architecture, ChatsAPI offers blazing-fast performance and intuitive route management.


Features

  • SBERT & SpaCy-Based NLP: Combines the power of Sentence-BERT embeddings and SpaCy for intelligent semantic matching and entity extraction.
  • Hybrid Search: Combines HNSWlib-based approximate nearest-neighbor search with Okapi BM25 token matching for efficient query handling.
  • Dynamic Routing: Easily define conversational routes with decorators.
  • Parameter Extraction: Automatically extract parameters from user input with flexible type handling.
  • LLM Integration: Integrates with popular LLMs such as OpenAI, Gemini, and LlamaAPI for extended conversational capabilities.
  • Conversation Management: Supports multi-session conversation handling with unique session IDs.

Installation

Install the package via pip:

pip install chatsapi

Usage

Initializing the Framework

from chatsapi import ChatsAPI

chat = ChatsAPI(
    llm_type="gemini",  # Choose LLM type (e.g., gemini, openai, ollama)
    llm_model="models/gemini-pro",  # Specify model
    llm_api_key="YOUR_API_KEY"  # API key for the LLM
)
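Rather than hard-coding the key as above, a common pattern is to read it from an environment variable. A minimal sketch of that pattern (the GEMINI_API_KEY variable name and the get_api_key helper are illustrative assumptions, not part of ChatsAPI):

```python
import os

def get_api_key(var_name: str = "GEMINI_API_KEY") -> str:
    """Read an LLM API key from the environment, failing fast if it is unset."""
    key = os.getenv(var_name)
    if not key:
        raise RuntimeError(f"Environment variable {var_name} is not set")
    return key

# chat = ChatsAPI(llm_type="gemini", llm_model="models/gemini-pro",
#                 llm_api_key=get_api_key())
```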

Registering Routes

Define conversational routes using decorators. Routes map user inputs to specific handler functions.

@chat.trigger("Want to cancel a credit card.")
@chat.extract([("card_number", "Credit card number (a 12-digit number)", str, None)])
async def cancel_credit_card(chat_message: str, extracted: dict):
    return {"message": chat_message, "extracted": extracted}

Explanation:

  • @chat.trigger: Registers the route; the given sentence is used as the semantic match target for incoming messages.
  • @chat.extract: Declares the parameters to pull from the matched input; each tuple is (name, description, type, default).
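In general, decorator-based routing like this works by storing each handler in a registry keyed by its trigger sentence. A simplified, framework-agnostic sketch of the pattern (not ChatsAPI's actual internals):

```python
# Registry mapping trigger sentences to their handler functions.
routes = {}

def trigger(description: str):
    """Register the decorated handler under its trigger sentence."""
    def decorator(func):
        routes[description] = func
        return func
    return decorator

@trigger("Want to cancel a credit card.")
async def cancel_credit_card(chat_message, extracted):
    return {"message": chat_message, "extracted": extracted}
```

A framework like ChatsAPI would then embed each registered sentence and, at query time, dispatch to the handler whose trigger is semantically closest to the user input.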

Running the Chat API

Use the run method to handle user inputs.

import asyncio

async def main():
    response = await chat.run("I want to cancel my credit card.")
    print(response)

asyncio.run(main())

Full Example: FastAPI Integration

from fastapi import FastAPI
from pydantic import BaseModel
from chatsapi import ChatsAPI

app = FastAPI()
chat = ChatsAPI(
    llm_type="gemini",
    llm_model="models/gemini-pro",
    llm_api_key="YOUR_API_KEY",
)

@chat.trigger("Need help with account settings.")
@chat.extract([
    ("account_number", "Account number (a 9-digit number)", int, None),
    ("holder_name", "Account holder's name (a person name)", str, None)
])
async def account_help(chat_message: str, extracted: dict):
    return {"message": chat_message, "extracted": extracted}

class RequestModel(BaseModel):
    message: str

@app.post("/chat")
async def message(request: RequestModel):
    reply = await chat.run(request.message)
    return {"response": reply}

Advanced: Conversation Management

ChatsAPI supports multi-session conversations using unique session IDs:

import asyncio

async def main():
    session_id = chat.set_session()  # Start a new session
    response = await chat.conversation("Tell me about my account", session_id)
    print(response)
    chat.end_session(session_id)  # End the session

asyncio.run(main())
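Conceptually, a session store is a mapping from a generated ID to that session's conversation history. A minimal stdlib sketch of the pattern (the SessionStore class is hypothetical, not ChatsAPI's implementation):

```python
import uuid

class SessionStore:
    """Track per-session message history keyed by a unique session ID."""

    def __init__(self):
        self._sessions = {}

    def set_session(self) -> str:
        """Create a new session and return its unique ID."""
        session_id = uuid.uuid4().hex
        self._sessions[session_id] = []  # empty conversation history
        return session_id

    def append(self, session_id: str, message: str) -> None:
        """Record a message in the session's history."""
        self._sessions[session_id].append(message)

    def end_session(self, session_id: str) -> None:
        """Discard the session and its history."""
        self._sessions.pop(session_id, None)
```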

Supported LLMs

  • OpenAI (ChatGPT)
  • Gemini
  • LlamaAPI
  • Ollama

Technical Details

  • SBERT: Used for creating sentence embeddings.
  • HNSWlib: Provides fast approximate nearest neighbor search.
  • BM25: Implements Okapi BM25 for token-based matching.
  • SpaCy: Handles natural language parsing and entity recognition.
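To illustrate the BM25 component, here is a compact pure-Python Okapi BM25 scorer (a pedagogical sketch of the algorithm, not the implementation ChatsAPI ships):

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.5, b=0.75):
    """Score each tokenized doc in `docs` against the tokenized `query`."""
    n = len(docs)
    avgdl = sum(len(d) for d in docs) / n
    # Document frequency: how many docs contain each term.
    df = Counter()
    for d in docs:
        df.update(set(d))
    scores = []
    for d in docs:
        tf = Counter(d)
        score = 0.0
        for term in query:
            if term not in tf:
                continue
            idf = math.log(1 + (n - df[term] + 0.5) / (df[term] + 0.5))
            num = tf[term] * (k1 + 1)
            den = tf[term] + k1 * (1 - b + b * len(d) / avgdl)
            score += idf * num / den
        scores.append(score)
    return scores
```

In a hybrid setup, scores like these are typically combined with embedding similarity from the HNSW index to rank candidate routes.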

Logical flow diagram (logical_flow.png)

License

This project is licensed under the MIT License. See the LICENSE file for details.

Contribution

Contributions are welcome! Feel free to open issues or submit pull requests.
