Prompt-defined Python functions with lazy or eager LLM-backed generation.

PyFuncAI


PyFuncAI is a Python library for defining functions with prompts and letting an LLM generate the implementation on demand.

It is built as a normal PyPI-style package, exposes a small public API, supports local Ollama models and remote OpenAI/Gemini providers, and can cache generated source on disk for reuse across runs.

Branch Strategy

The repository is set up around three long-lived branches:

  • main: production and stable release branch, including PyPI deployment
  • preview: prerelease branch for release candidates and GitHub prereleases
  • dev: active development branch

The GitHub Actions workflows are wired to run CI on all three branches. Stable PyPI publishing is restricted to version tags that point to commits on main, while preview tags publish GitHub prereleases from preview.

What It Does

PyFuncAI turns a prompt plus a Python signature into a callable object:

  • mode="lazy" delays generation until first use.
  • mode="eager" builds immediately.
  • cache=True stores generated source on disk.
  • .build() forces materialization ahead of time.
  • .source exposes the generated Python source once available.
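
The lazy/eager distinction above follows a common deferred-build pattern. The sketch below illustrates that pattern with plain Python; it is illustrative only and does not reproduce PyFuncAI's internals (the `LazyCallable` name and its builder argument are invented for this example).

```python
# Minimal sketch of the lazy-vs-eager pattern described above.
# Not PyFuncAI's actual implementation.

class LazyCallable:
    def __init__(self, builder, mode="lazy"):
        self._builder = builder  # callable that produces the real function
        self._fn = None
        if mode == "eager":
            self.build()         # eager mode materializes immediately

    @property
    def is_built(self):
        return self._fn is not None

    def build(self, force=False):
        if force or self._fn is None:
            self._fn = self._builder()
        return self._fn

    def __call__(self, *args, **kwargs):
        # lazy mode: the first call triggers build()
        return self.build()(*args, **kwargs)


# Stand-in builder; PyFuncAI would ask an LLM for the implementation instead.
lazy = LazyCallable(lambda: (lambda name: f"Hello, {name}!"))
assert not lazy.is_built      # nothing generated yet
print(lazy("Alice"))          # first call triggers the build
assert lazy.is_built
```

In PyFuncAI itself, the builder step is where the LLM call and validation happen, which is why `mode="lazy"` can make program startup cheap at the cost of latency on first use.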

The generated code is validated before execution:

  • exactly one top-level function must be returned
  • the generated signature must exactly match the requested signature
  • only approved standard-library imports are allowed
  • obviously dangerous builtins such as open, eval, and exec are blocked

This is still experimental. Validation is intentionally conservative, but LLM-generated code should still be treated as untrusted.
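
The first and last of those checks can be done with a simple AST pass. The sketch below shows the general idea; PyFuncAI's actual validator (in validation.py) is more thorough, and the `validate` function and `BLOCKED` set here are invented for illustration.

```python
# Illustrative AST validation in the spirit of the checks described above.
import ast

BLOCKED = {"open", "eval", "exec", "__import__"}

def validate(source: str) -> None:
    tree = ast.parse(source)
    # exactly one top-level function must be present
    funcs = [n for n in tree.body if isinstance(n, ast.FunctionDef)]
    if len(funcs) != 1:
        raise ValueError("expected exactly one top-level function")
    # reject any reference to an obviously dangerous builtin
    for node in ast.walk(tree):
        if isinstance(node, ast.Name) and node.id in BLOCKED:
            raise ValueError(f"blocked builtin: {node.id}")

validate("def greet(name: str) -> str:\n    return 'Hello, ' + name\n")  # passes
```

A static check like this catches careless output but not a determined adversary, which is why the README treats generated code as untrusted regardless.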

Installation

Once published:

pip install pyfuncai

For local development from this repo:

pip install -e .[dev]

To run the minimal example directly from the repository checkout:

python example.py

Automation

GitHub Actions workflows live in .github/workflows/:

  • ci.yml: formatting, tests, coverage, and package builds on main, preview, dev, and pull requests
  • release.yml: stable releases only, for tags like v1.2.3 on main, with GitHub Release + PyPI publish
  • prerelease.yml: preview releases only, for tags like v1.2.3rc1 on preview, with GitHub prerelease artifacts only

The CI workflow still uploads coverage.xml, but the README no longer exposes a Codecov badge.

Quick Start

Ollama

This was smoke-tested locally on March 13, 2026 with:

  • Ollama at http://localhost:11434
  • model qwen3.5:latest

from pyfuncai import connect, create_function

connect(
    "ollama",
    model="qwen3.5:latest",
    base_url="http://localhost:11434",
    timeout=180,
)

greet = create_function(
    "Return a short greeting for the provided name.",
    signature="(name: str) -> str",
    function_name="greet",
    mode="eager",
    cache=True,
)

print(greet("Alice"))
print(greet.source)

OpenAI

from pyfuncai import connect, create_function

connect(
    "openai",
    model="gpt-5-mini",
    api_key="YOUR_OPENAI_API_KEY",
)

slugify = create_function(
    "Convert text into a URL-friendly slug.",
    signature="(text: str) -> str",
    function_name="slugify",
)

Setting the OPENAI_API_KEY environment variable is also supported in place of an explicit api_key.

Gemini

from pyfuncai import connect, create_function

connect(
    "gemini",
    model="gemini-2.5-flash",
    api_key="YOUR_GEMINI_API_KEY",
)

summarize = create_function(
    "Summarize a short paragraph in one sentence.",
    signature="(text: str) -> str",
    function_name="summarize",
)

Setting the GEMINI_API_KEY environment variable is also supported in place of an explicit api_key.

Public API

connect(provider, **config)

Registers the default provider used by later create_function() calls.

Supported providers:

  • ollama
  • openai
  • gemini

Common configuration keys:

  • model
  • timeout
  • cache_dir

Provider-specific keys:

  • Ollama: base_url
  • OpenAI: api_key, base_url
  • Gemini: api_key, base_url

create_function(...)

Main arguments:

  • prompt
  • signature
  • cache=True
  • mode="lazy" or mode="eager"
  • function_name
  • provider
  • cache_dir
  • system_prompt
  • temperature
  • max_output_tokens
  • allow_modules

It returns a GeneratedFunction, which is callable and also exposes:

  • .build(force=False)
  • .source
  • .cache_key
  • .is_built

createFunction(...) is available as a compatibility alias.

Cache Behavior

Cached source is keyed by:

  • prompt
  • requested signature
  • function name
  • provider identity
  • mode
  • allowed module list

By default PyFuncAI uses an OS-appropriate user cache directory. You can override that globally in connect(..., cache_dir=...) or per function with create_function(..., cache_dir=...).
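
One plausible way to derive a composite key from the components listed above is to hash a canonical serialization of them. This is a sketch under that assumption; the `cache_key` helper is invented here, and PyFuncAI's actual keying scheme may differ.

```python
# One plausible derivation of a composite cache key from the listed
# components. Not necessarily PyFuncAI's actual scheme.
import hashlib
import json

def cache_key(prompt, signature, function_name, provider, mode, allow_modules):
    # canonical JSON so the same inputs always hash identically
    payload = json.dumps(
        {
            "prompt": prompt,
            "signature": signature,
            "function_name": function_name,
            "provider": provider,
            "mode": mode,
            "allow_modules": sorted(allow_modules or []),
        },
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

key = cache_key("Return a greeting.", "(name: str) -> str",
                "greet", "ollama:qwen3.5:latest", "eager", ["textwrap"])
print(key[:16])  # stable across runs for identical inputs
```

Whatever the exact scheme, the practical consequence is the same: changing any keyed component (even the allowed module list) produces a cache miss and triggers regeneration.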

Live Testing

The unit test suite is offline and deterministic:

pytest

An optional live Ollama integration test is included but skipped by default:

PYFUNCAI_RUN_OLLAMA_TESTS=1 pytest tests/test_ollama_live.py

PowerShell:

$env:PYFUNCAI_RUN_OLLAMA_TESTS = "1"
pytest tests/test_ollama_live.py

Optional environment variables for the live test:

  • PYFUNCAI_OLLAMA_MODEL
  • PYFUNCAI_OLLAMA_BASE_URL
  • PYFUNCAI_OLLAMA_TIMEOUT

Project Layout

.github/workflows/
  ci.yml
  prerelease.yml
  release.yml
src/pyfuncai/
  __init__.py
  cache.py
  compiler.py
  core.py
  exceptions.py
  prompts.py
  providers.py
  validation.py
tests/
example.py

The package uses a src/ layout and pyproject.toml, which keeps it ready for normal wheel and sdist publishing.

Safety Notes

PyFuncAI validates generated code, but it does not provide hard sandboxing. The current implementation focuses on:

  • AST checks
  • restricted imports
  • restricted builtins during execution

That helps, but it is not the same thing as secure isolation. Do not use generated code against secrets, production systems, or privileged environments without stronger containment.
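
To make the "restricted builtins" point concrete, here is a sketch of executing generated source against a limited builtins table. The `SAFE_BUILTINS` table and the sample source are invented for this example, and, as the note above stresses, this technique limits convenience access but is not a sandbox.

```python
# Sketch of running generated source with a restricted builtins table.
# This blocks casual use of open/eval/exec, but it is NOT secure isolation.
SAFE_BUILTINS = {"len": len, "range": range, "str": str, "int": int}

source = "def double(x: int) -> int:\n    return x + x\n"
namespace = {"__builtins__": SAFE_BUILTINS}
exec(source, namespace)       # compile and bind the generated function
double = namespace["double"]
print(double(21))             # → 42
```

If the generated source tried to call open here, the lookup would fail with a NameError, since open is absent from the builtins table. Determined code can still often escape such restrictions, which is why stronger containment is recommended for anything sensitive.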

Status

Current state of the repository:

  • distributable package layout via pyproject.toml
  • Ollama/OpenAI/Gemini provider adapters
  • lazy and eager generation
  • disk cache
  • validation and restricted execution
  • unit tests and optional live Ollama test
  • CI, stable release automation, preview prereleases, and PyPI publishing workflow

Publishing To PyPI

PyFuncAI is configured for PyPI trusted publishing with GitHub Actions. The release workflow file is:

  • .github/workflows/release.yml

1. Configure GitHub

In the GitHub repository:

  1. Create or confirm the long-lived branches: main, preview, and dev.
  2. In Settings -> Environments, create an environment named pypi.
  3. Optionally add protection rules so only approved maintainers can publish.
  4. Push the workflow files to GitHub.

Recommended:

  • protect main
  • protect preview
  • create stable tags like v0.1.3 only from main
  • create preview tags like v0.2.0rc1 only from preview

2. Configure PyPI Trusted Publishing

In PyPI, add a new trusted publisher using the GitHub tab with these values:

  • PyPI Project Name: pyfuncai
  • Owner: AaronCreor
  • Repository name: PyFuncAI
  • Workflow name: release.yml
  • Environment name: pypi

This matches the current repository and workflow layout.

3. Publish A Release

From a clean main branch state:

  1. Update version in pyproject.toml.
  2. Commit and push to main.
  3. Create and push a version tag:

git checkout main
git pull
git tag v0.1.3
git push origin main
git push origin v0.1.3

That tag triggers release.yml, which will:

  • verify the tag commit is reachable from main
  • build the wheel and sdist
  • create a GitHub Release with the built artifacts
  • publish the package to PyPI through trusted publishing

4. Publish A Preview Prerelease

From the preview branch:

  1. Set version in pyproject.toml to a prerelease value such as 0.2.0rc1.
  2. Commit and push to preview.
  3. Create and push a matching prerelease tag:

git checkout preview
git pull
git tag v0.2.0rc1
git push origin preview
git push origin v0.2.0rc1

That tag triggers prerelease.yml, which will:

  • verify the tag commit is reachable from preview
  • verify the tag matches pyproject.toml
  • build the wheel and sdist
  • create a GitHub prerelease

It does not publish preview builds to PyPI.

5. Create The Extra Branches On GitHub

If the preview and dev branches exist only locally, GitHub will not see them until you push them:

git push -u origin preview
git push -u origin dev

License

Apache-2.0
