
KISS AI Stack - Core

Effortless AI Agent Building

Welcome to the core of the KISS AI Stack! This module helps you build an AI agent effortlessly using a simple YAML configuration file. Say goodbye to boilerplate code and embrace minimalism with the KISS principle (Keep It Simple, Stupid).


Features

  • Centralized Agent Management: Manage multiple session-based AI agents with lifecycle support.
  • Minimal Dependencies: Built directly on vendor client libraries, with no heavy framework layers.
  • Tool Classification: Configure tools for your agent to handle specific tasks easily.
  • Supports RAG and Prompt-Based Models: Choose the model type that suits your needs.
  • Thread-Safe: Reliable operation in multi-threaded environments.

Installation

Install the core module using pip:

pip install kiss-ai-stack-core

Example Configuration

Here’s an example YAML configuration to set up an AI agent with different tools:

agent:
  classifier: # Required for tool classification
    name: decision_maker
    role: classify tools for given queries
    kind: prompt  # Choose from 'rag' or 'prompt'
    ai_client:
      provider: openai
      model: gpt-4
      api_key: <your-api-key>

  tools:
    - name: general_queries
      role: process other queries if no suitable tool is found.
      kind: prompt
      ai_client:
        provider: openai
        model: gpt-4
        api_key: <your-api-key>

    - name: document_tool
      role: process documents and provide answers based on them.
      kind: rag  # Retrieval-Augmented Generation
      embeddings: text-embedding-ada-002
      ai_client:
        provider: openai
        model: gpt-4
        api_key: <your-api-key>

  vector_db:
    provider: chroma
    kind: remote  # Choose from 'in-memory', 'storage', or 'remote'.
    host: 0.0.0.0
    port: 8000
    secure: false
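Because the stack is driven entirely by this file, a structural sanity check before boot-up can catch typos early. The sketch below mirrors the YAML above as a plain Python dict and validates the fields the example uses; the validator itself is illustrative and not part of kiss-ai-stack-core:

```python
# Illustrative validator for the agent configuration shown above.
# The key names come from the YAML example; the checks are a sketch,
# not library behavior.

REQUIRED_AI_CLIENT_KEYS = {"provider", "model", "api_key"}


def validate_tool(tool: dict) -> list[str]:
    """Return a list of problems found in a single tool or classifier entry."""
    errors = []
    for key in ("name", "role", "kind"):
        if key not in tool:
            errors.append(f"tool missing '{key}'")
    if tool.get("kind") not in ("prompt", "rag"):
        errors.append(f"tool '{tool.get('name')}': kind must be 'prompt' or 'rag'")
    if tool.get("kind") == "rag" and "embeddings" not in tool:
        errors.append(f"tool '{tool.get('name')}': rag tools need 'embeddings'")
    missing = REQUIRED_AI_CLIENT_KEYS - set(tool.get("ai_client", {}))
    if missing:
        errors.append(f"tool '{tool.get('name')}': ai_client missing {sorted(missing)}")
    return errors


def validate_agent_config(config: dict) -> list[str]:
    """Check the top-level 'agent' section: classifier plus each tool."""
    agent = config.get("agent", {})
    errors = []
    if "classifier" not in agent:
        errors.append("agent.classifier is required for tool classification")
    else:
        errors += validate_tool(agent["classifier"])
    for tool in agent.get("tools", []):
        errors += validate_tool(tool)
    return errors


# Python-dict equivalent of (part of) the YAML example above.
config = {
    "agent": {
        "classifier": {
            "name": "decision_maker",
            "role": "classify tools for given queries",
            "kind": "prompt",
            "ai_client": {"provider": "openai", "model": "gpt-4", "api_key": "<your-api-key>"},
        },
        "tools": [
            {
                "name": "document_tool",
                "role": "process documents and provide answers based on them.",
                "kind": "rag",
                "embeddings": "text-embedding-ada-002",
                "ai_client": {"provider": "openai", "model": "gpt-4", "api_key": "<your-api-key>"},
            }
        ],
    }
}

print(validate_agent_config(config))  # an empty list means the structure checks out
```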

Example Python Usage

Use the core module to build and interact with your AI agent:

import asyncio

from kiss_ai_stack import AgentStack


async def main():
    try:
        # Initialize an agent in the stack
        await AgentStack.bootstrap_agent(agent_id="my_agent", temporary=True)

        # Process a query
        response = await AgentStack.generate_answer(agent_id="my_agent", query="What is KISS AI Stack?")
        print(response.answer)

    except Exception as ex:
        print(f"An error occurred: {ex}")


# Run the example
asyncio.run(main())

How It Works

  1. Agent Initialization: Use AgentStack.bootstrap_agent to initialize agents with their configuration and resources.
  2. Query Processing: Process queries with AgentStack.generate_answer, leveraging tools and AI clients defined in the YAML configuration.
  3. Tool Management: Define tools to handle specific tasks like document processing or query classification.
  4. Vector Database: Use the vector_db section to define how document embeddings are stored and retrieved for RAG-based tasks. Currently, Chroma is supported.
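The classify-then-dispatch flow in steps 1–3 can be sketched in plain Python. Everything below is illustrative: the keyword matching stands in for the prompt-based classifier, which kiss-ai-stack-core actually delegates to the configured AI client.

```python
# Toy classify-then-dispatch loop illustrating steps 1-3 above.
# The keyword check stands in for the AI-backed 'decision_maker' classifier.

def classify(query: str) -> str:
    """Pick a tool name for the query (stand-in for the real classifier)."""
    if "document" in query.lower():
        return "document_tool"
    return "general_queries"


# Stand-ins for the tools defined in the YAML configuration.
TOOLS = {
    "document_tool": lambda q: f"[rag] answering from indexed documents: {q}",
    "general_queries": lambda q: f"[prompt] answering directly: {q}",
}


def generate_answer(query: str) -> str:
    tool_name = classify(query)      # step 2: classify the query
    return TOOLS[tool_name](query)   # step 3: dispatch to the matching tool


print(generate_answer("Summarize the uploaded document"))
print(generate_answer("What is KISS AI Stack?"))
```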

Documentation

Key Methods

  • bootstrap_agent(agent_id: str, temporary: bool): Initialize a new agent session.
  • generate_answer(agent_id: str, query: Union[str, Dict, List]): Process a query and return a response.
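Since generate_answer accepts a string, dict, or list, a caller-side helper that flattens mixed input into one prompt string can be convenient. This is an assumption about how a caller might normalize input before sending it; the library may well handle these types itself:

```python
# Hypothetical caller-side helper; not part of kiss-ai-stack-core.
import json
from typing import Union


def normalize_query(query: Union[str, dict, list]) -> str:
    """Flatten the query shapes accepted by generate_answer into one string."""
    if isinstance(query, str):
        return query
    if isinstance(query, dict):
        return json.dumps(query, sort_keys=True)
    if isinstance(query, list):
        return "\n".join(normalize_query(item) for item in query)
    raise TypeError(f"unsupported query type: {type(query).__name__}")


print(normalize_query(["What is KISS AI Stack?", {"context": "README"}]))
```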

Configuration Highlights

  • AI Client: Configure the provider, model, and API key for supported services like OpenAI.
  • Tools: Define tools such as general-purpose query handlers or document processors.
  • Vector Database: Set up in-memory or persistent storage for RAG-based tasks.
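Hard-coding api_key in the YAML is risky once the file lands in version control. A common pattern, not specific to this stack and shown here only as a sketch, is to keep a ${VAR} placeholder in the file and substitute environment variables before handing the config to the agent:

```python
# Sketch of environment-variable substitution for config files.
# The ${VAR} placeholder convention is an assumption, not a stack feature.
import os
import re


def expand_env_vars(text: str) -> str:
    """Replace ${VAR} placeholders with environment values; leave unset
    variables untouched so missing keys are easy to spot."""
    return re.sub(
        r"\$\{([A-Z_][A-Z0-9_]*)\}",
        lambda m: os.environ.get(m.group(1), m.group(0)),
        text,
    )


os.environ["OPENAI_API_KEY"] = "sk-demo"  # for demonstration only
print(expand_env_vars("api_key: ${OPENAI_API_KEY}"))
```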

Contributing

We welcome contributions! Submit pull requests or open issues to improve this stack.


License

This project is licensed under the MIT License. See the LICENSE file for details.

Project details


Download files

Download the file for your platform.

Source Distribution

kiss_ai_stack_core-0.1.0a11.tar.gz (24.3 kB)


Built Distribution


kiss_ai_stack_core-0.1.0a11-py3-none-any.whl (32.8 kB)


File details

Details for the file kiss_ai_stack_core-0.1.0a11.tar.gz.

File metadata

  • Download URL: kiss_ai_stack_core-0.1.0a11.tar.gz
  • Upload date:
  • Size: 24.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.0 CPython/3.12.7

File hashes

Hashes for kiss_ai_stack_core-0.1.0a11.tar.gz

  • SHA256: 01db3f667185848dd2808a0d2489bd1a3e64b1da29fe48b918f8d17c5a75d958
  • MD5: 0ae93dfcf50c195470a22b3668b2bbf0
  • BLAKE2b-256: 1e9106a377c3d6212c40bec8b45b4f66b6e6c0cfb595c266b0bad718c2a4886f


File details

Details for the file kiss_ai_stack_core-0.1.0a11-py3-none-any.whl.

File hashes

Hashes for kiss_ai_stack_core-0.1.0a11-py3-none-any.whl

  • SHA256: 129d7e250dc668bbeae0b4ff0e5bf593fc39b3a23c7989d1c5ce2514c5e7cde1
  • MD5: 4fd9c23e7ec1ad14951f4d6c618e389e
  • BLAKE2b-256: dd295a4acc49d1de0b4c5614b45614da33f1c2d6ab42ebc318455b5e9796993d

