Rakam Systems - Modular AI framework with agents, vectorstore, and LLM gateway

🏴‍☠️ Overview 🏴‍☠️

rakam_systems is a Python library that provides a framework to easily build & deploy AI and Generative AI systems.

You build a System by combining Components: take them from our library or customise them completely. Either way, they come with a suite of production features such as integrated evaluation and automatic deployment on your preferred cloud. We like to think of it as the child of Haystack and Terraform/OpenTofu.
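
To make the Component/System idea concrete, here is a minimal sketch. The class and method names (`Component`, `System`, `call`) mirror the concepts described above but are illustrative, not the actual rakam_systems API:

```python
# Hypothetical sketch of composing Components into a System.
# These names are illustrative, NOT the real rakam_systems API.
from abc import ABC, abstractmethod


class Component(ABC):
    """A building block with a single, well-defined call interface."""

    @abstractmethod
    def call(self, data):
        ...


class Uppercaser(Component):
    def call(self, data):
        return data.upper()


class Exclaimer(Component):
    def call(self, data):
        return data + "!"


class System:
    """Runs components in sequence, passing data between them."""

    def __init__(self, components):
        self.components = components

    def run(self, data):
        for component in self.components:
            data = component.call(data)
        return data


system = System([Uppercaser(), Exclaimer()])
print(system.run("hello"))  # HELLO!
```

Because every Component exposes the same interface, Systems can be rearranged or extended without touching the other building blocks.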

🥵 Problem Statement

Building custom AI and Gen AI systems can be challenging due to the need for flexibility, scalability, and production-readiness. Developers often face problems like:

  • Complexity: Creating AI systems from scratch is complex, especially when combining different technologies for model management, data processing, and integration.

  • Scalability: Ensuring that AI systems can handle large-scale data and provide efficient, real-time responses.

  • Integration: Standardizing data exchange between the core components of an AI system and keeping it fluid, especially when the components are deployed on different servers.

  • Maintenance & Updates: The AI landscape evolves rapidly, and keeping systems current with the latest models and technologies is challenging, stressful, and costly.

rakam_systems addresses these challenges by offering a flexible, lean framework that helps developers build AI systems efficiently, while minimizing code maintenance overhead and focusing on ease of deployment.

✨ Key Features

  • Modular Framework: rakam_systems is a framework for building AI and Gen AI systems, with Components serving as the core building blocks.

  • Customizability: Designed to provide robust tools for developing custom Gen AI solutions. Many classes are abstract, offering flexibility while keeping the codebase streamlined by limiting predefined functionality to common use cases.

  • Production-Ready: Built for scalability and ease of deployment:

    • Libraries are chosen for their efficiency and scalability.
    • Components exchange data in a structured way, facilitating API integration.
    • Includes Docker/Django API templates for easy deployment: Service Template.
  • Lean Design: Focused on minimizing breaking changes and ensuring code fluidity.

  • Best-in-Class Supporting Tools & Approaches: We select the best libraries and technical approaches for each specific task to keep the codebase lean and manageable, addressing the challenge of evolving technologies. We welcome contributions to improve these approaches and are open to new ideas.

  • Selected Libraries:

    • Best LLM: OpenAI currently offers the strongest general-purpose models, so we've chosen it as the main LLM API.
    • EU LLM: Mistral AI is the leading European model provider and is well positioned for lasting conformity with the EU AI Act.
    • Transformers & Models: Hugging Face was chosen for its extensive support for a wide range of pre-trained models and its active community.
    • Vector Stores: FAISS was selected for its efficiency and scalability in large-scale vector similarity search.
    • File storage: While you can work with local files, users can also work with buckets through the S3 interface.
  • Engine Update: We also deploy regular Engine Updates to ensure that the library stays current with the latest advancements in AI, minimizing maintenance challenges.
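
To illustrate the operation FAISS accelerates, here is a toy, pure-Python nearest-neighbour search. A real index replaces this brute-force scan with optimised approximate structures; the function names here are illustrative only:

```python
# Toy brute-force vector search: the operation FAISS speeds up at scale.
import math


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)


def search(query, vectors, k=1):
    """Return indices of the k vectors most similar to the query."""
    ranked = sorted(range(len(vectors)),
                    key=lambda i: cosine(query, vectors[i]),
                    reverse=True)
    return ranked[:k]


vectors = [[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]]
print(search([0.9, 0.1], vectors, k=2))  # [0, 2]
```

Brute force is O(n) per query; FAISS trades a small amount of accuracy for sub-linear lookups over millions of vectors.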

Use Cases

With rakam_systems, you can build:

  • Retrieval-Augmented Generation (RAG) Systems: Combine vector retrieval with LLM prompt generation for enriched responses. Learn more

  • Agent Systems: Create modular agents that perform specific tasks using LLMs. Link to come

  • Chained Gen AI Systems: Develop systems that chain multiple AI tasks together for complex workflows. Link to come

  • Search Engines: Develop search engines based on fine-tuned embedding models for any modality (text, sound, or video). Link to come

  • Any Custom AI System: Use components to create any AI solution tailored to your needs.
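
The RAG use case above boils down to two steps: retrieve relevant context, then assemble a prompt for the LLM. The sketch below shows that flow with a naive keyword retriever standing in for vector search; the function names are illustrative, not the rakam_systems API:

```python
# Minimal RAG flow sketch. Function names are hypothetical; a real
# system would retrieve via a vector store rather than keyword counts.
def retrieve(query, documents, k=1):
    """Rank documents by how many query words they contain."""
    words = query.lower().split()
    ranked = sorted(documents,
                    key=lambda d: sum(w in d.lower() for w in words),
                    reverse=True)
    return ranked[:k]


def build_prompt(query, context_docs):
    """Assemble the retrieved context and question into one prompt."""
    context = "\n".join(context_docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"


docs = ["FAISS indexes vectors for fast similarity search.",
        "Django is a Python web framework."]
query = "What does FAISS do?"
prompt = build_prompt(query, retrieve(query, docs))
# `prompt` would then be sent to an LLM, e.g. via the OpenAI API.
```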

Installation

To install rakam_systems, clone the repository and install it in editable mode to include all dependencies:

git clone <repository_url> rakam_systems
cd rakam_systems
pip install -e .

Dependencies

  • faiss
  • sentence-transformers
  • pandas
  • openai
  • pymupdf
  • playwright
  • joblib
  • requests

Examples

Check out the following links for detailed examples of what you can build using rakam_systems:

Core Components

rakam_systems provides several core components to facilitate building AI systems:

  • Vector Stores: Manage and query vector embeddings for fast retrieval.
  • Content Extraction: Extract data from PDFs, URLs, and JSON files.
  • Node Processing: Split content into smaller, manageable chunks.
  • Modular Agents: Implement custom tasks such as classification, prompt generation, and RAG.
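
Node processing, for example, typically means splitting text into overlapping windows before embedding. Here is a minimal sketch of that pattern; the `chunk` function is illustrative, not the library's exact API:

```python
# Illustrative node-processing step: overlapping character windows.
# The function name and parameters are hypothetical, not the real API.
def chunk(text, size=20, overlap=5):
    """Split text into `size`-char windows stepping by size - overlap."""
    step = size - overlap
    return [text[i:i + size]
            for i in range(0, max(len(text) - overlap, 1), step)]


pieces = chunk("abcdefghij" * 5, size=20, overlap=5)
print(len(pieces))  # 3 windows over a 50-char string
```

The overlap preserves context that would otherwise be cut at a chunk boundary, which generally improves retrieval quality.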

For more details on how to use each of these components, please refer to the documentation here.

Contributing

We welcome contributions! To contribute:

  1. Fork the Repository: Start by forking the rakam_systems repository to your GitHub account.
  2. Clone the Forked Repository: Clone the forked repository to your local machine:
    git clone <forked_repository_url> rakam_systems
    cd rakam_systems
    
  3. Install in Editable Mode: Install rakam_systems in editable mode to make development easier:
    pip install -e .
    
  4. Create a Branch: Create a feature branch (git checkout -b feature-branch).
  5. Make Changes: Implement your changes and commit them with a meaningful message (git commit -m 'Add new feature').
  6. Push the Branch: Push your changes to your feature branch (git push origin feature-branch).
  7. Submit a Pull Request: Go to the original rakam_systems repository on GitHub and submit a pull request for review.

For more details, refer to the Contribution Guide.

License

This project is licensed under the Apache-2.0 license.

Support

For any issues, questions, or suggestions, please contact mohammed@rakam.ai.
