
A library for speaking with large language models, using object orientated generation to build data-rich agentic systems

Project description

Welcome

Welcome to funkyprompt. This is a lightweight library for building agent systems on top of large language models, following the principle of Object Orientated Generation (OOG). The idea is simple: objects are all we need to focus on to build agentic systems. The goal of funkyprompt is to enable seamless development of data-rich agentic systems.

There are really only two abstractions that matter when working with large language model APIs:

  1. The message stack, a collection of messages with specific roles. The system role may be treated as special in some models/APIs.
  2. The function stack, a list of functions, typically described in JSON Schema, that the language model can call (both stacks are sketched below).
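
For illustration, here is roughly what the two stacks look like in the OpenAI-style wire format. The field names below are the standard chat-completions shapes used by most providers, not funkyprompt internals.

# The message stack: role-tagged messages, with the system message first
messages = [
    {"role": "system", "content": "You answer questions with the help of functions."},
    {"role": "user", "content": "Tell me about henrik"},
]

# The function stack: functions described in JSON Schema so the model can request calls
functions = [
    {
        "name": "favorite_color",
        "description": "Return the favorite color of a person",
        "parameters": {
            "type": "object",
            "properties": {"person": {"type": "string", "description": "the name of the person"}},
            "required": ["person"],
        },
    },
]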

In funkyprompt both of these stacks are always treated as dynamic inside a Runner's execution loop. Reasoning is carried out by starting with a system prompt (always rendered as clean Markdown) and following the trail of function calls until completion. It should be possible to activate and recruit new functions during the execution loop.
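
As a rough, self-contained sketch of such a loop (everything here, including call_model, is illustrative only and is not the actual Runner implementation):

def call_model(messages, functions):
    """Stub standing in for a real LLM API call that may request a function call."""
    if not any(m["role"] == "function" for m in messages):
        return {"function_call": {"name": "favorite_color", "arguments": {"person": "henrik"}}}
    return {"content": "Henrik's favorite color is yellow."}

def run(question, system_prompt, functions):
    messages = [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": question},
    ]
    while True:
        response = call_model(messages, functions)
        call = response.get("function_call")
        if call is None:
            return response["content"]  # no further calls requested; reasoning is complete
        result = functions[call["name"]](**call["arguments"])
        messages.append({"role": "function", "name": call["name"], "content": str(result)})
        # new functions could be added to `functions` at this point, which is what
        # activating and recruiting functions during the execution loop means

print(run("Tell me about henrik",
          "You answer questions with the help of functions.",
          {"favorite_color": lambda person: "yellow"}))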

Objects are represented by either Pydantic or Markdown, and there is an invertible mapping between these two representations.

Object Orientated Generation relies on three things:

  1. The top-level metadata or doc string on the Pydantic object, which supplies the system-level prompt
  2. Fields with descriptions, which manage structured output as well as additional prompting in data-rich systems
  3. Class methods or auxiliary API methods defined for the type, which are used in function calling

It is important to recognize that you do not need anything else to build agentic systems over large language model APIs. Combining such objects with the two managed stacks (function and message) creates a computationally complete framework.

Here is a trivially simple example object (agent)...

from pydantic import Field
from funkyprompt.core.agents import Runner, Plan, CallingContext, AbstractModel
from IPython.display import Markdown

class TestObject(AbstractModel):
    """You are a simple agent that answers the users question with the help of functions. 
Please respond in a structured format with fenced json. Use the response format provided.
"""
             
    person: str = Field(description="A person that is of interest to the user")
    color: str = Field(description="A color that the person likes")
    object_of_color: str = Field(description="An object that is the color of the person's favorite color")
        
    @classmethod
    def favorite_color(cls, person:str):
        """
        For three people Bob, Henrik and Ursula, you can ask for the favorite color and get an answer 
        
        Args:
            person: the name of the person
        """
        
        return {
            "bob": "green",
            "henrik": "yellow",
            "ursula": "red"
        }.get(person.lower())
    
    @classmethod
    def objects_of_color(cls, color:str):
        """
        For a given color, get an object of this color
        
        Args:
            color: the color
        """
        
        return {
            "green": "turtle",
            "yellow": "cup",
            "red": "car"
        }.get(color.lower())
    
agent = Runner(TestObject)
# GPT is used by default; pass a CallingContext to use other models such as Claude or Gemini
Markdown(agent("Tell me about henrik",
      # CallingContext(model='claude-3-5-sonnet-20240620')
      # CallingContext(model='gemini-1.5-pro-exp-0827')
     ))

This example illustrates that agents are always described as Pydantic objects, including any callable functions they hold. Not shown here, the configuration can also add references to external functions, e.g. OpenAPI endpoints. Although the example shows the Pydantic object, a lot of emphasis is placed on treating the Markdown representation as the primary representation of the agent. This allows for portability: Markdown agents can be stored in a database or served from a web server, and when run on clients it is still possible to execute their functions. You can get the Markdown representation with TestObject.get_model_description().
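
For example (the method name is from funkyprompt; the layout described in the comments is only indicative):

# Render the agent as Markdown: the doc string, field descriptions and
# function docstrings are all reflected in this representation
print(TestObject.get_model_description())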

Installation

Funkyprompt is a Poetry project that you can clone and install locally. It is also installable from PyPI:

pip install funkyprompt

To initialize the data layer, run the snippet below. This does things like registering the graph database in your instance and registering the core object types to get you started. You can also use funkyprompt without data IO (CRUD) support.

import funkyprompt
funkyprompt.init()

You can register any object in the database with TestObject._register(allow_create_schema=True). This generates tables for each of your entity types, should you wish to enable CRUD. To learn more about data in funkyprompt, see the Data section below.
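
For example, registering the agent defined earlier (after funkyprompt.init() has been run):

TestObject._register(allow_create_schema=True)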

Data (AIDA)

Funkyprompt is primarily used for RAG use cases. More generally, because the notion of RAG can be restrictive, we are interested in an AI Data Architecture (AIDA). AIDA considers design patterns for adding semantics over existing databases and APIs to make them "AI-ready", and it spans multi-modal data types. While vector data plays a role, structured SQL and NoSQL data are also important; key-value data, for example, is critical for entity resolution use cases. Generally a graph database is used to manage both key-value data and relations.

Funkyprompt leans heavily on semantic modelling in Pydantic. The data layer is intended to be used without exposing its internals: for example, entities are dumped or searched using the schema defined in Pydantic models or agents, without requiring any additional code. More sophisticated data interaction can be implemented with custom functions or APIs, but some core CRUD use cases are built into funkyprompt.

There are two key AIDA principles:

  1. The first is the semantic layer. Your entire system is described semantically by a set of OOG agents, each of which defines a schema that can optionally allow CRUD. Over time it is expected that many types (structured, text, image and graph relations) will build up in your system; this is the main purpose of funkyprompt: to enable data-rich agentic systems seamlessly.
  2. Query routing is the other important part of AIDA. Mixing graph, key-value, structured and unstructured data means the right query type must be chosen for each use case. Funkyprompt "indexes" data in different ways, e.g. by adding embeddings or graph relationships, to support different query types in context.

There are two options for data in funkyprompt...

Embedded

Embedded mode uses embedded databases such as DuckDB and Kuzu (for graph). LanceDB can be used for vector embeddings, but because DuckDB now supports vectors, DuckDB may become the preferred choice in the future.

To use embedded storage as the default instead of Postgres, set FP_DEFAULT_STORAGE=Embedded.
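
For example, from Python, setting the variable before funkyprompt is imported (assuming the setting is read when funkyprompt initializes):

import os
os.environ["FP_DEFAULT_STORAGE"] = "Embedded"  # use the embedded (DuckDB/Kuzu) backend

import funkyprompt
funkyprompt.init()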

Postgres

The recommended database is Postgres because it supports all the data types and is mature. The pg_vector and AGE (graph) extensions mean a single database can be used for all modes. Installing Postgres is easy; for example, you can ask GPT for instructions to set it up in your environment along with the extensions mentioned. It should take on the order of 10 minutes, including build time. If you don't want to use Postgres, set FP_DEFAULT_STORAGE as shown above so it is not used as the default.

