
Rakam Systems - Modular AI framework with agents, vectorstore, and LLM gateway


🏴‍☠️ Overview 🏴‍☠️

rakam_systems is a Python library that provides a framework to easily build & deploy AI and Generative AI systems.

You can build any System by combining Components: use Components from our library, or customise them entirely. Either way, they come with a suite of production features such as integrated evaluation and automatic deployment on your preferred cloud solution. We like to think of it as the child of Haystack and Terraform/OpenTofu.

🥵 Problem Statement

Building custom AI and Gen AI systems can be challenging due to the need for flexibility, scalability, and production-readiness. Developers often face problems like:

  • Complexity: Creating AI systems from scratch is complex, especially when combining different technologies for model management, data processing, and integration.

  • Scalability: Ensuring that AI systems can handle large-scale data and provide efficient, real-time responses.

  • Integration: Standardizing data communication between the core components of an AI system and keeping it fluid, especially when those components are deployed on different servers.

  • Maintenance & Updates: The AI landscape evolves rapidly, and maintaining systems with the latest models and technologies is challenging, stressful and costly.

rakam_systems addresses these challenges by offering a flexible, lean framework that helps developers build AI systems efficiently, while minimizing code maintenance overhead and focusing on ease of deployment.

✨ Key Features

  • Modular Framework: rakam_systems is a framework for building AI and Gen AI systems, with Components serving as the core building blocks.

  • Customizability: Designed to provide robust tools for developing custom Gen AI solutions. Many classes are abstract, offering flexibility while keeping the codebase streamlined by limiting predefined functionality to common use cases.

  • Production-Ready: Built for scalability and ease of deployment:

    • Libraries are chosen for their efficiency and scalability.
    • Components exchange data in a structured way, facilitating API integration.
    • Includes Docker/Django API templates for easy deployment: Service Template.
  • Lean Design: Focused on minimizing breaking changes and keeping the codebase small and easy to follow.

  • Best-in-Class Supporting Tools & Approaches: We select the best libraries and technical approaches for each specific task to keep the codebase lean and manageable, addressing the challenge of evolving technologies. We welcome contributions to improve these approaches and are open to new ideas.

  • Selected Libraries:

    • Best LLM: OpenAI currently offers the strongest general-purpose models, so we've chosen it as the main LLM API.
    • EU LLM: Mistral AI is the leading European model provider and is well placed for lasting conformity with the EU AI Act.
    • Transformers & Models: Hugging Face was chosen for its extensive support for a wide range of pre-trained models and its active community.
    • Vector Stores: FAISS was selected for its efficiency and scalability in large-scale vector similarity search.
    • File storage: While you can work with local files, we also let users work with S3-compatible buckets.
  • Engine Update: We also deploy regular Engine Updates to ensure that the library stays current with the latest advancements in AI, minimizing maintenance challenges.
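
To make the "structured data exchange" idea above concrete, here is a minimal sketch of a serialisable node that one Component could hand to another over an API boundary. The `Node` class and its fields are hypothetical illustrations, not rakam_systems' actual API:

```python
import json
from dataclasses import dataclass, field, asdict

# Hypothetical sketch of structured data exchange between Components;
# the names here do not reflect rakam_systems' real classes.
@dataclass
class Node:
    content: str
    metadata: dict = field(default_factory=dict)

    def to_json(self) -> str:
        # Serialise so the node can cross an HTTP/API boundary
        return json.dumps(asdict(self))

    @classmethod
    def from_json(cls, payload: str) -> "Node":
        return cls(**json.loads(payload))

node = Node(content="FAISS is a vector search library.",
            metadata={"source": "docs.pdf", "page": 3})
wire = node.to_json()            # what one Component sends
received = Node.from_json(wire)  # what the next Component reconstructs
assert received == node
```

Keeping every inter-component payload JSON-serialisable like this is what makes it straightforward to split Components across servers behind REST endpoints.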

Use Cases

With rakam_systems, you can build:

  • Retrieval-Augmented Generation (RAG) Systems: Combine vector retrieval with LLM prompt generation for enriched responses. Learn more

  • Agent Systems: Create modular agents that perform specific tasks using LLMs. Link to come

  • Chained Gen AI Systems: Develop systems that chain multiple AI tasks together for complex workflows. Link to come

  • Search Engines: Develop search engines based on fine-tuned embedding models for any modality (text, audio, or video). Link to come

  • Any Custom AI System: Use components to create any AI solution tailored to your needs.

Installation

To install the latest release from PyPI:

pip install rakam_systems

Alternatively, clone the repository and install it in editable mode (useful for development):

git clone <repository_url> rakam_systems
cd rakam_systems
pip install -e .

Dependencies

  • faiss
  • sentence-transformers
  • pandas
  • openai
  • pymupdf
  • playwright
  • joblib
  • requests

Examples

Detailed examples of what you can build using rakam_systems are available in the documentation.

Core Components

rakam_systems provides several core components to facilitate building AI systems:

  • Vector Stores: Manage and query vector embeddings for fast retrieval.
  • Content Extraction: Extract data from PDFs, URLs, and JSON files.
  • Node Processing: Split content into smaller, manageable chunks.
  • Modular Agents: Implement custom tasks such as classification, prompt generation, and RAG.
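
The node-processing step above (splitting content into manageable chunks) can be sketched as a simple overlapping character splitter. This is a hypothetical helper for illustration, not rakam_systems' actual implementation, which may split on tokens or sentence boundaries instead:

```python
def split_into_chunks(text: str, chunk_size: int = 50, overlap: int = 10) -> list:
    """Split text into overlapping character chunks.

    Overlap preserves context across chunk boundaries so that a fact
    straddling two chunks is still retrievable from at least one of them.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

chunks = split_into_chunks("a" * 120, chunk_size=50, overlap=10)
# 120 chars with step 40 -> chunks start at 0, 40, 80 -> 3 chunks
assert len(chunks) == 3
```

Each chunk would then be embedded and stored in the vector store for retrieval.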

For more details on how to use each of these components, please refer to the documentation here.

Contributing

We welcome contributions! To contribute:

  1. Fork the Repository: Start by forking the rakam_systems repository to your GitHub account.
  2. Clone the Forked Repository: Clone the forked repository to your local machine:
    git clone <forked_repository_url> rakam_systems
    cd rakam_systems
    
  3. Install in Editable Mode: Install rakam_systems in editable mode to make development easier:
    pip install -e .
    
  4. Create a Branch: Create a feature branch (git checkout -b feature-branch).
  5. Make Changes: Implement your changes and commit them with a meaningful message (git commit -m 'Add new feature').
  6. Push the Branch: Push your changes to your feature branch (git push origin feature-branch).
  7. Submit a Pull Request: Go to the original rakam_systems repository on GitHub and submit a pull request for review.

For more details, refer to the Contribution Guide.

License

This project is licensed under the Apache-2.0 license.

Support

For any issues, questions, or suggestions, please contact mohammed@rakam.ai.
