
Empowering GenAI teams to maintain peak model accuracy in production environments.


Welcome to FutureAGI - Empowering GenAI Teams with Advanced Performance Management

Overview

FutureAGI provides a cutting-edge platform designed to help GenAI teams maintain peak model accuracy in production environments. Our solution is purpose-built, scalable, and delivers results 10x faster than traditional methods.

Key Features

  • Simplified GenAI Performance Management: Streamline your workflow and focus on developing cutting-edge AI models.
  • Instant Evaluation: Score outputs without human-in-the-loop or ground truth, increasing QA team efficiency by up to 10x.
  • Advanced Error Analytics: Gain ready-to-use insights with comprehensive error tagging and segmentation.
  • Configurable Metrics: Define custom metrics tailored to your specific use case for precise model evaluation.

Quickstart

Installation

To install the client, you can install the library from PyPI or clone the repository.

Install the library in an environment using Python >= 3.6:

$ pip3 install futureagi

Or clone the repo:

$ git clone https://github.com/future-agi/client
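
If you install from the cloned repository instead, a typical approach (assuming the repository ships a standard Python package layout) is:

$ cd client
$ pip3 install .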

Initialisation

To initialise the Future AGI client, you need to provide your api_key and secret_key, which are associated with your Future AGI account.

Get your service API key

When you create an account, we generate a service API key. You will need this API key and your secret key for logging authentication.

Instrument your code

If you are using the Future AGI Python client, add a few lines to your code to log your data. Logs are sent to us asynchronously.

import os

from fi.client import Client

api_key = os.environ["FI_API_KEY"]
secret_key = os.environ["FI_SECRET_KEY"]
base_url = os.environ["FI_API_URL"]

client = Client(
    api_key=api_key,
    secret_key=secret_key,
    uri=base_url,
    max_workers=8,
    max_queue_bound=5000,
    timeout=200,
    additional_headers=None,
)

This initialises the Future AGI client with the following parameters:

  • api_key: The API key associated with your account.
  • secret_key: The provided identifier that connects records to spaces.
  • uri: The URI to which your records are sent.
  • max_workers: Maximum number of concurrent requests to Future AGI. Defaults to 8.
  • max_queue_bound: Maximum number of concurrent future objects generated for publishing to Future AGI. Defaults to 5000.
  • timeout: How long to wait for the server to send data before giving up. Defaults to 200.
  • additional_headers: (Optional) Dictionary of additional headers to append to requests.

You can also set these keys as environment variables:

export FI_API_KEY=your_api_key
export FI_SECRET_KEY=your_secret_key

And then initialise the client without passing the keys directly:
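
A minimal sketch, assuming the client falls back to the FI_API_KEY and FI_SECRET_KEY environment variables when the key arguments are omitted:

import os

from fi.client import Client

# Assumes Client reads FI_API_KEY and FI_SECRET_KEY from the environment
# when api_key and secret_key are not passed explicitly.
client = Client(uri=os.environ["FI_API_URL"])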

Logging data

Once the client is initialised, use client.log to send conversation data:

from fi.utils.types import ModelTypes, Environments

client.log(
    model_id="your_model_id",
    model_type=ModelTypes.GENERATIVE_LLM,
    environment=Environments.PRODUCTION,
    model_version="1.0.0",
    prediction_timestamp=1625216400,
    conversation={
        "chat_history": [
            {"role": "user", "content": "How do I implement a neural network in Python?"}
        ]
    },
    tags={"project": "AI project"}
)

Parameters

  • model_id: The ID of the model. Must be a string.
  • model_type: The type of the model. Must be an instance of ModelTypes.
  • environment: The environment in which the model is running. Must be an instance of Environments.
  • model_version: The version of the model. Must be a string.
  • prediction_timestamp: (Optional) The timestamp of the prediction. Must be an integer.
  • conversation: The conversation data. Must be a dictionary containing either chat_history or chat_graph.
  • tags: (Optional) Additional tags for the event. Must be a dictionary.

For full details, see our docs.

Conversation Format

Chat History

The chat_history must be a list of dictionaries with the following keys (see the example below):

  • role: The role of the participant (e.g., “user”, “assistant”). Must be a string.
  • content: The content of the message. Must be a string.
  • context: (Optional) The context of the message. Must be a list of pairs of strings in the format [["", ""]...].
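
For illustration, a minimal chat_history entry that uses the optional context field; the pair values are placeholders, since the meaning of each string in a pair is not specified here:

{
    "chat_history": [
        {
            "role": "user",
            "content": "Summarise our refund policy.",
            "context": [
                ["refund_policy.md", "Refunds are accepted within 30 days of purchase."]
            ]
        }
    ]
}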

Chat History with conversation ID

The chat_history must be a list of dictionaries with the following keys (see the example below):

  • conversation_id: The ID of the conversation. Must be a string.
  • role: The role of the participant (e.g., “user”, “assistant”). Must be a string.
  • content: The content of the message. Must be a string.
  • context: (Optional) The context of the message. Must be a list of pairs of strings in the format [["", ""]...].
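
For illustration, the same shape with a conversation_id attached to each message; the ID is a placeholder value:

{
    "chat_history": [
        {
            "conversation_id": "conv-123",
            "role": "user",
            "content": "Summarise our refund policy."
        },
        {
            "conversation_id": "conv-123",
            "role": "assistant",
            "content": "Refunds are accepted within 30 days of purchase."
        }
    ]
}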

Chat Graph

The chat_graph must be a dictionary with the following keys (a sketch follows the list):

  • conversation_id: The ID of the conversation. Must be a string.
  • nodes: A list of nodes, each containing:
      • message: A dictionary with the message details.
      • node_id: The ID of the node. Must be a string.
      • parent_id: The ID of the parent node. Must be a string.
      • timestamp: The timestamp of the node. Must be an integer.
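
A minimal chat_graph sketch assembled from the keys listed above. The IDs, timestamp, and the shape of the message dictionary are assumptions for illustration; the structured example under "Logging data all at once" below uses slightly different field names, so treat this as a sketch rather than the canonical schema:

{
    "chat_graph": {
        "conversation_id": "conv-123",
        "nodes": [
            {
                "node_id": "node-1",
                "parent_id": "",
                "timestamp": 1625216400,
                "message": {
                    "role": "user",
                    "content": "How do I implement a neural network in Python?"
                }
            }
        ]
    }
}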
  1. Logging data individually: For example, chat_history may include a list of dictionaries where each dictionary represents a message with attributes like role (str) and content (str).
{
    "chat_history": [
        {
            "role": "user",
            "content": "Who won the world series in 2020?"
        },
        {
            "role": "assistant",
            "content": "The Los Angeles Dodgers won the World Series in 2020."
        }
    ]
}
  2. Logging data all at once: This involves logging structured conversations in a unified format:
[{
    "conversation_id": "",
    "title": "",
    "root_node": "",
    "metadata": {},
    "nodes": [{
        "parent_node": "",
        "child_node": "",
        "message": {
            "id": "",
            "author": {
                "role": "assistant",
                "metadata": {}
            },
            "content": {
                "content_type": "text",
                "parts": [
                    "The user is interested to do this task..."
                ]
            },
            "context": ""
        }
    }]
}]

Error Handling

The client raises specific exceptions for different types of errors (see the handling sketch after the list):

  • AuthError: Raised if the API key or secret key is missing.
  • InvalidAdditionalHeaders: Raised if there are conflicting additional headers.
  • InvalidValueType: Raised if a parameter has an invalid type.
  • InvalidSupportedType: Raised if a model type is not supported.
  • MissingRequiredKey: Raised if a required key is missing.
  • InvalidVectorLength: Raised if the vector length is invalid.
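
A minimal sketch of handling these exceptions around a client.log call, continuing from the initialisation example above. Only the exception names come from the list above; the import path for the exception classes is an assumption:

from fi.utils.types import ModelTypes, Environments
# Assumed import path for the exceptions; check the package for the actual module.
from fi.utils.errors import AuthError, InvalidValueType, MissingRequiredKey

try:
    client.log(
        model_id="your_model_id",
        model_type=ModelTypes.GENERATIVE_LLM,
        environment=Environments.PRODUCTION,
        model_version="1.0.0",
        conversation={"chat_history": [{"role": "user", "content": "Hello"}]},
    )
except AuthError:
    # Missing or invalid API key / secret key: re-check FI_API_KEY and FI_SECRET_KEY.
    raise
except (InvalidValueType, MissingRequiredKey) as exc:
    # Malformed payload: skip the record instead of crashing the application.
    print(f"Skipping record: {exc}")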

FAQs

  1. Q: How do you give a performance score without a human in the loop?

Our secret sauce is a Critique AI agent that can deliver a powerful evaluation framework without the need for a human in the loop. What's more, it is 100% configurable for new and evolving use cases. Anything you can imagine your AI system should deliver, you can configure our platform to manage.

  2. Q: What inputs does the Future AGI platform need?

We only need the input-output database, the training dataset (if available), and user analytics. We do not need to understand the model or how it makes decisions.

  3. Q: I don't want to share data with Future AGI; can I still use it?

Yes, you can install our SDK in your private cloud and take advantage of our platform to align your AI system with your users.

  4. Q: My use case is unique. Would you provide a service to customise your platform for it?

Our platform is 100% customisable and easy for AI teams to configure for all types of models and modalities. However, our customer-success engineers would be happy to assist you in figuring out solutions for your unique use cases.

  5. Q: My app uses multiple models with multiple modalities. Can you also work with images and videos?

Yes, we can.

  6. Q: How much time does it take to integrate the Future AGI platform? How much bandwidth would be required?

It takes just 2 minutes to integrate a few lines of code, and your data starts showing up on our platform. Try it today.

Resources

Website: https://www.futureagi.com/

Documentation: https://docs.futureagi.com/

PyPI: https://pypi.org/project/futureagi/
