
`LLMfy` is a framework for developing applications with large language models (LLMs).

Project description


LLMfy is a flexible, developer-friendly framework for building applications powered by large language models (LLMs). It provides the tools and abstractions needed to integrate, orchestrate, and manage LLMs across a range of use cases, so developers can focus on building intelligent, context-aware solutions instead of low-level model handling. With modular components, prompt-engineering support, and an extensible design, LLMfy accelerates the development of AI-driven applications from prototyping to production.

See complete documentation at https://llmfy.readthedocs.io/

How to install

  • Optional libraries:
    • Install openai to use OpenAI models — 🔸 optional.
    • Install boto3 to use AWS Bedrock models — 🔸 optional.
    • Install google-genai to use Google AI models — 🔸 optional.
    • Install numpy to use Embedding and FAISSVectorStore — 🔸 optional.
    • Install faiss-cpu to use FAISSVectorStore — 🔸 optional.
    • Install typing_extensions to use state in FlowEngine — 🔸 optional.
    • Install redis to use RedisCheckpointer — 🔸 optional.
    • Install SQLAlchemy to use SQLCheckpointer — 🔸 optional. SQLCheckpointer supports both sync and async drivers for multiple databases.
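The optional libraries above are exposed as pip extras; judging by the extras names used in the usage sections below (openai, boto3, google-genai), installation looks like:

```shell
# Install llmfy together with the OpenAI optional dependency
pip install "llmfy[openai]"

# Several extras at once (extras names taken from the usage sections below)
uv add "llmfy[openai,boto3]"
```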

Using uv

uv add llmfy

Using pip

pip install llmfy

Using GitHub

From a specific branch

# main
uv add git+https://github.com/irufano/llmfy.git@main
# or
pip install git+https://github.com/irufano/llmfy.git@main

# dev
uv add git+https://github.com/irufano/llmfy.git@dev
# or
pip install git+https://github.com/irufano/llmfy.git@dev

From a tag

# example tag version 0.4.3
uv add git+https://github.com/irufano/llmfy.git@v0.4.3
# or
pip install git+https://github.com/irufano/llmfy.git@v0.4.3

GitHub in requirements.txt

git+https://github.com/irufano/llmfy.git@dev

How to use

OpenAI models

To use OpenAIModel, install "llmfy[openai]" and add the following variable to your environment:

  • OPENAI_API_KEY

AWS Bedrock models

To use BedrockModel, install "llmfy[boto3]" and add the following variables to your environment:

  • AWS_ACCESS_KEY_ID
  • AWS_SECRET_ACCESS_KEY
  • AWS_BEDROCK_REGION

Google AI models

To use GoogleAIModel, install "llmfy[google-genai]" and add the following variable to your environment:

  • GOOGLE_API_KEY
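The provider credentials above are read from the environment; a typical setup (placeholder values, not real keys) might look like:

```shell
# Placeholder values — replace with your own credentials
export OPENAI_API_KEY="sk-..."            # for OpenAIModel
export AWS_ACCESS_KEY_ID="AKIA..."        # for BedrockModel
export AWS_SECRET_ACCESS_KEY="..."
export AWS_BEDROCK_REGION="us-east-1"
export GOOGLE_API_KEY="..."               # for GoogleAIModel
```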

Example

LLMfy Example

from llmfy import (
    OpenAIModel,
    OpenAIConfig,
    LLMfy,
    Message,
    Role,
    LLMfyException,
)

def sample_prompt():
    info = """Irufano is a software engineer.
    He is from Indonesia.
    You can visit his website at https://irufano.github.io"""

    # Configuration
    config = OpenAIConfig(temperature=0.7)
    llm = OpenAIModel(model="gpt-4o-mini", config=config)

    SYSTEM_PROMPT = """Answer any user questions based solely on the data below:
    <data>
    {info}
    </data>
    
    DO NOT respond outside the context."""

    # Initialize framework
    framework = LLMfy(llm, system_message=SYSTEM_PROMPT, input_variables=["info"])

    try:
        messages = [Message(role=Role.USER, content="What is the capital of China?")]
       
        response = framework.invoke(messages, info=info)
        print(f"\n>> {response.result.content}\n")

    except LLMfyException as e:
        print(f"{e}")


if __name__ == "__main__":
    sample_prompt()
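The system_message / input_variables pairing in the example behaves like a prompt template: {info} is filled in when invoke is called. A minimal sketch of that idea (plain str.format, not LLMfy's actual implementation) looks like:

```python
def render(template: str, **variables: str) -> str:
    """Fill declared input variables into a prompt template."""
    return template.format(**variables)

template = "Answer any user questions based solely on the data below:\n<data>\n{info}\n</data>"
prompt = render(template, info="Irufano is a software engineer from Indonesia.")
print(prompt)  # the <data> block now contains the supplied info
```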

Develop as Contributor

Build package

uv build

Trigger build and deploy to PyPI

# TAG_NAME must start with "v" (e.g., v1.0.0)
git tag -a [TAG_NAME] -m "[TAG_MESSAGE]"

# push tag to remote
git push origin [TAG_NAME]

After deploy on local

After the CI moves the tag, your local tag still points to the old commit. To sync:

git fetch --tags --force

The --force flag is needed because git fetch --tags alone won't update tags that already exist locally.

Package development on local

uv sync --group dev --group docs

or

uv sync --all-groups

MkDocs run on local

uv sync --group docs
# Serve on local
mkdocs serve

# Build docs
mkdocs build

Download files

Download the file for your platform.

Source Distribution

llmfy-0.5.1.tar.gz (71.4 kB)

Uploaded Source

Built Distribution


llmfy-0.5.1-py3-none-any.whl (111.8 kB)

Uploaded Python 3

File details

Details for the file llmfy-0.5.1.tar.gz.

File metadata

  • Download URL: llmfy-0.5.1.tar.gz
  • Upload date:
  • Size: 71.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.11.2 {"installer":{"name":"uv","version":"0.11.2","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for llmfy-0.5.1.tar.gz

  • SHA256: 41dab5026d5a160abf1f2af7fbf8c5abaa3efddfcc4dbca995ddb32f8d67d162
  • MD5: 1ba80f50626866033aab446cbef45555
  • BLAKE2b-256: 83503b85983ab8f7911491c2b2d702019b59a628973433736f83180921cd60c0

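To check a downloaded archive against the published SHA256 digest, a small standard-library helper is enough (the file path here is wherever you saved the sdist):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Stream-hash a file so large archives are not loaded into memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Example: compare against the published digest for llmfy-0.5.1.tar.gz
# expected = "41dab5026d5a160abf1f2af7fbf8c5abaa3efddfcc4dbca995ddb32f8d67d162"
# assert sha256_of("llmfy-0.5.1.tar.gz") == expected
```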

File details

Details for the file llmfy-0.5.1-py3-none-any.whl.

File metadata

  • Download URL: llmfy-0.5.1-py3-none-any.whl
  • Upload date:
  • Size: 111.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.11.2 {"installer":{"name":"uv","version":"0.11.2","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for llmfy-0.5.1-py3-none-any.whl

  • SHA256: 9f170d5131ed24b31206c5fd0effdadc6c13e048444f84c330d21ce5ef505b02
  • MD5: 157b507caf35bd95db7a86e60c48cc96
  • BLAKE2b-256: 0711f36ef07cf554e7dc5dc8de04baaa677e7b067f664a4ba51403fd9cd549ce

