AI-Parrot

Framework for building AI agents for Navigator

AI-Parrot is an async-first Python framework for building, extending, and orchestrating AI Agents and Chatbots. Built on top of navigator-api, it provides a unified interface for interacting with various LLM providers, managing tools, conducting agent-to-agent (A2A) communication, and serving agents via the Model Context Protocol (MCP).

Whether you need a simple chatbot, a complex multi-agent orchestration workflow, or a robust production-ready AI service, AI-Parrot exposes the primitives to build it efficiently.

Monorepo Structure

AI-Parrot is organized as a monorepo with four packages:

Package PyPI Name Description
packages/ai-parrot ai-parrot Core framework: agents, clients, memory, orchestration
packages/ai-parrot-tools ai-parrot-tools Tool and toolkit implementations (Jira, AWS, Slack, etc.)
packages/ai-parrot-loaders ai-parrot-loaders Document loaders for RAG pipelines (PDF, YouTube, audio, etc.)
packages/ai-parrot-pipelines ai-parrot-pipelines Specialized pipelines such as planogram compliance workflows

The core package (ai-parrot) provides the base abstractions (AbstractTool, AbstractToolkit, @tool) and lightweight built-in tools. Heavy tool implementations, document loaders, and specialized pipelines are split into their own packages so you only install what you need.


Installation

Core framework

uv pip install ai-parrot

Quick Setup (CLI)

After installing, use the parrot CLI to configure your environment interactively:

# Interactive setup wizard — select LLM provider, enter API keys, generate .env
parrot setup

# Initialize configuration directory structure (env/ and etc/)
parrot conf init

The parrot setup wizard will guide you through:

  1. Selecting an LLM provider (OpenAI, Anthropic, Google, etc.)
  2. Entering your API credentials
  3. Writing them to the correct .env file
  4. Optionally creating a starter Agent and bootstrap files (app.py, run.py)
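
For reference, a minimal sketch of what step 3 produces, assuming you selected OpenAI (the exact variable names depend on the provider you chose; ANTHROPIC_API_KEY is the equivalent for Anthropic):

# .env (illustrative)
OPENAI_API_KEY=sk-your-key-here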

Additional CLI commands:

# Start an MCP server from a YAML config
parrot mcp --config server.yaml

# Deploy an autonomous agent as a systemd service
parrot autonomous create --agent my_agent.py
parrot autonomous install --agent my_agent.py --name my-agent

LLM Providers

Install only the providers you need:

# Google Gemini
uv pip install "ai-parrot[google]"

# OpenAI / GPT
uv pip install "ai-parrot[openai]"

# Anthropic / Claude (HTTP API client)
uv pip install "ai-parrot[anthropic]"

# Claude Code agent dispatch (bundled `claude` CLI subprocess)
uv pip install "ai-parrot[claude-agent]"

# Groq
uv pip install "ai-parrot[groq]"

# X.AI / Grok
uv pip install "ai-parrot[xai]"

# All LLM providers at once
uv pip install "ai-parrot[llms]"

Additional providers supported out of the box (no extra install needed):

  • HuggingFace (hf) — uses the HuggingFace Inference API
  • vLLM (vllm) — connects to a local vLLM server
  • OpenRouter (openrouter) — routes to any model via OpenRouter API
  • Ollama / Local — via OpenAI-compatible endpoints

Anthropic: API client vs. Claude Code agent dispatch

Anthropic ships in two independent extras — pick the one(s) you need:

  • ai-parrot[anthropic]: installs anthropic[aiohttp]>=0.97.0. API client (AnthropicClient) for completion, vision, streaming, and the Messages Batches API. Talks HTTP to api.anthropic.com.
  • ai-parrot[claude-agent]: installs claude-agent-sdk>=0.1.68, which bundles the claude CLI. Agent dispatch (ClaudeAgentClient) that delegates a task to a Claude Code sub-agent able to read files, run bash, and call tools. Talks to a subprocess CLI.

The two extras are independent. Install only what you use:

# I just want to call the Anthropic API:
uv pip install "ai-parrot[anthropic]"

# I want to dispatch tasks to a Claude Code agent:
uv pip install "ai-parrot[claude-agent]"

# I want both:
uv pip install "ai-parrot[anthropic,claude-agent]"

After installing [claude-agent], register/authenticate the bundled CLI once — either run claude auth interactively, or export ANTHROPIC_API_KEY in the environment. The CLI honours either path.

A runnable demo lives in examples/clients/claude_agent_example.py.

Embeddings & Vector Stores

# Sentence transformers, FAISS, ChromaDB, etc.
uv pip install "ai-parrot[embeddings]"

Tools

# Install the tools package
uv pip install ai-parrot-tools

# Or with specific tool extras
uv pip install "ai-parrot-tools[jira]"
uv pip install "ai-parrot-tools[aws]"
uv pip install "ai-parrot-tools[slack]"
uv pip install "ai-parrot-tools[finance]"
uv pip install "ai-parrot-tools[all]"       # All tool dependencies

Available tool extras: jira, slack, aws, docker, git, analysis, excel, sandbox, codeinterpreter, pulumi, sitesearch, office365, scraping, finance, db, flowtask, google, arxiv, wikipedia, weather, messaging.

Document Loaders

# Install the loaders package
uv pip install ai-parrot-loaders

# Or with specific loader extras
uv pip install "ai-parrot-loaders[youtube]"
uv pip install "ai-parrot-loaders[pdf]"
uv pip install "ai-parrot-loaders[audio]"
uv pip install "ai-parrot-loaders[all]"     # All loader dependencies

Available loader extras: youtube, audio, pdf, web, ebook, video.

Pipelines

# Install the pipelines package
uv pip install ai-parrot-pipelines

Backward-compatible imports from parrot.pipelines continue to work when the package is installed.
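
For example, existing code like the sketch below keeps resolving once the package is installed (parrot.pipelines is the real namespace; the class name is hypothetical):

from parrot.pipelines import PlanogramCompliancePipeline  # hypothetical class name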

Platform & Security Tools

AI-Parrot includes tools for cloud security auditing and infrastructure management. These tools rely on external Docker images that must be installed before use:

# Security tools
parrot install cloudsploit    # AWS security scanner (CloudSploit)
parrot install prowler        # Cloud security posture management

# Platform tools
parrot install pulumi         # Infrastructure as Code CLI

The parrot install command pulls and configures the required Docker containers automatically, so the tools are ready to be used by your agents.


Quick Start

Create a simple weather chatbot in just a few lines of code:

import asyncio
from parrot.bots import Chatbot
from parrot.tools import tool

# 1. Define a tool
@tool
def get_weather(location: str) -> str:
    """Get the current weather for a location."""
    return f"The weather in {location} is Sunny, 25C"

async def main():
    # 2. Create the Agent
    bot = Chatbot(
        name="WeatherBot",
        llm="openai:gpt-4o",  # Provider:Model
        tools=[get_weather],
        system_prompt="You are a helpful weather assistant."
    )

    # 3. Configure (loads tools, connects to memory)
    await bot.configure()

    # 4. Chat!
    response = await bot.ask("What's the weather like in Madrid?")
    print(response)

if __name__ == "__main__":
    asyncio.run(main())

Using LLM Clients Directly

Beyond the Chatbot abstraction, you can access any LLM provider client directly for lower-level operations like image generation, embeddings, or custom completion calls:

import asyncio
from parrot.clients.google.client import GoogleGenAIClient
from parrot.models.outputs import ImageGenerationPrompt
from parrot.models.google import GoogleModel

async def main():
    prompt = ImageGenerationPrompt(
        prompt="A realistic passport-style photo with white background",
        styles=["photorealistic", "high resolution"],
        model=GoogleModel.IMAGEN_3.value,
        aspect_ratio="16:9",
    )

    client = GoogleGenAIClient()
    async with client:
        response = await client.image_generation(prompt_data=prompt)
        for img_path in response.images:
            print(f"Image saved to: {img_path}")

if __name__ == "__main__":
    asyncio.run(main())

Each provider client (GoogleGenAIClient, OpenAIClient, AnthropicClient, etc.) implements AbstractClient and can be used as an async context manager. This gives you full access to provider-specific features — image generation, audio transcription, structured outputs — while still benefiting from AI-Parrot's unified configuration and credential management.


Running as a Server

AI-Parrot is not only a library — it is also a full aiohttp-based application server that exposes your agents as REST APIs, WebSocket endpoints, and more. This is powered by Navigator, an async web framework built on aiohttp.

How it works

When you run parrot setup, it generates two files:

  • app.py — Defines your application handler, registers agents with BotManager, and configures routes.
  • run.py — The entry point that starts the aiohttp server.

app.py (generated by parrot setup):

from parrot.manager import BotManager
from parrot.conf import STATIC_DIR
from parrot.handlers import AppHandler
from agents.my_agent import MyAgent


class Main(AppHandler):
    app_name: str = "Parrot"
    enable_static: bool = True
    staticdir: str = STATIC_DIR

    def configure(self) -> None:
        self.bot_manager = BotManager()
        self.bot_manager.register(MyAgent())
        self.bot_manager.setup(self.app)

run.py (generated by parrot setup):

from navigator import Application
from app import Main

app = Application(Main, enable_jinja2=True)

if __name__ == "__main__":
    app.run()

Built-in endpoints

Once the server starts, BotManager.setup() automatically registers these routes:

Endpoint Method Description
/api/v1/agents/chat/{agent_id} POST Chat with an agent (JSON, HTML, or Markdown response)
/api/v1/agents/chat/{agent_id} PATCH Configure tools/MCP servers for a session
/api/v1/bot_management GET List registered bots
/api/v1/bot_management/{bot} GET/POST/PATCH/DELETE CRUD operations on bots
/api/v1/agent_tools GET List available tools
/api/v1/ai/client GET LLM provider configuration
/ws/userinfo WebSocket Real-time user notifications

Starting the server

Development (single process, auto-reload):

python run.py

The server starts on http://0.0.0.0:5000 by default (configurable via APP_HOST / APP_PORT environment variables).
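
To bind elsewhere, set those variables before starting; a minimal sketch using the variable names above:

# .env or shell environment
APP_HOST=127.0.0.1
APP_PORT=8080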

Production (Gunicorn with async workers):

# Install gunicorn
uv pip install "ai-parrot[deploy]"

# Run with aiohttp-compatible workers
gunicorn run:app \
    --worker-class aiohttp.worker.GunicornUVLoopWebWorker \
    --workers 4 \
    --bind 0.0.0.0:5000 \
    --timeout 360

The long timeout (360s) accommodates agent queries that involve multi-step tool execution or LLM calls.

Talking to your agents via REST

Once the server is running, any registered agent is accessible via HTTP:

# Chat with an agent
curl -X POST http://localhost:5000/api/v1/agents/chat/my-agent \
  -H "Content-Type: application/json" \
  -d '{"message": "What is the weather in Madrid?"}'

# Request markdown output
curl -X POST "http://localhost:5000/api/v1/agents/chat/my-agent?output_format=markdown" \
  -H "Content-Type: application/json" \
  -d '{"message": "Summarize the latest news"}'

Architecture

AI-Parrot has a modular architecture in which agents can act as both consumers and providers of tools and services.

graph TD
    User["User / Client"] --> API["AgentTalk Handlers"]
    API --> Bot["Chatbot / BaseBot"]

    subgraph "Agent Core"
        Bot --> Memory["Memory / Vector Store"]
        Bot --> LLM["LLM Client (OpenAI/Anthropic/Etc)"]
        Bot --> TM["Tool Manager"]
    end

    subgraph "Tools & Capabilities"
        TM --> LocalTools["Local Tools (@tool)"]
        TM --> Toolkits["Toolkits (OpenAPI/Custom)"]
        TM --> MCPServer["External MCP Servers"]
    end

    subgraph "Connectivity"
        Bot -.-> A2A["A2A Protocol (Client/Server)"]
        Bot -.-> MCP["MCP Protocol (Server)"]
        Bot -.-> Integrations["Telegram / MS Teams"]
    end

    subgraph "Orchestration"
        Crew["AgentCrew"] --> Bot
        Crew --> OtherBots["Other Agents"]
    end

Core Concepts

Agents (Chatbot)

The Chatbot class is your main entry point. It handles conversation history, RAG (Retrieval-Augmented Generation), and the tool execution loop.

bot = Chatbot(
    name="MyAgent",
    model="anthropic:claude-3-5-sonnet-20240620",
    enable_memory=True
)

Tools

Functional Tools (@tool)

The simplest way to create a tool. The docstring and type hints are automatically used to generate the schema for the LLM.

from parrot.tools import tool

@tool
def calculate_vat(amount: float, rate: float = 0.20) -> float:
    """Calculate VAT for a given amount."""
    return amount * rate

Class-Based Toolkits (AbstractToolkit)

Group related tools into a reusable class. All public async methods become tools.

from parrot.tools import AbstractToolkit

class MathToolkit(AbstractToolkit):
    async def add(self, a: int, b: int) -> int:
        """Add two numbers."""
        return a + b

    async def multiply(self, a: int, b: int) -> int:
        """Multiply two numbers."""
        return a * b
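
Wiring the toolkit into an agent could then look like this sketch, assuming AbstractToolkit exposes get_tools() the way OpenAPIToolkit does below:

bot = Chatbot(
    name="MathBot",
    llm="openai:gpt-4o",
    tools=MathToolkit().get_tools(),  # assumption: toolkits expose get_tools()
)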

OpenAPI Toolkit (OpenAPIToolkit)

Dynamically generate tools from any OpenAPI/Swagger specification.

from parrot.tools import OpenAPIToolkit

petstore = OpenAPIToolkit(
    spec="https://petstore.swagger.io/v2/swagger.json",
    service="petstore"
)

# Now your agent can call petstore_get_pet_by_id, etc.
bot = Chatbot(name="PetBot", tools=petstore.get_tools())

Orchestration (AgentCrew)

Orchestrate multiple agents to solve complex tasks using AgentCrew.

Supported Modes:

  • Sequential: Agents run one after another, passing context.
  • Parallel: Independent tasks run concurrently.
  • Flow: DAG-based execution defined by dependencies.
  • Loop: Iterative execution until a condition is met.

from parrot.bots.orchestration import AgentCrew

crew = AgentCrew(
    name="ResearchTeam",
    agents=[researcher_agent, writer_agent]
)

# Define a Flow — Writer waits for Researcher to finish
crew.task_flow(researcher_agent, writer_agent)

await crew.run_flow("Research the latest advancements in Quantum Computing")

Scheduling (@schedule)

Give your agents agency to run tasks in the background.

from parrot.scheduler import schedule, ScheduleType

class DailyBot(Chatbot):
    @schedule(schedule_type=ScheduleType.DAILY, hour=9, minute=0)
    async def morning_briefing(self):
        news = await self.ask("Summarize today's top tech news")
        await self.send_notification(news)

Connectivity & Exposure

Agent-to-Agent (A2A) Protocol

Agents can discover and talk to each other using the A2A protocol.

Expose an Agent:

from parrot.a2a import A2AServer

a2a = A2AServer(my_agent)
a2a.setup(app, url="https://my-agent.com")

Consume an Agent:

from parrot.a2a import A2AClient

async with A2AClient("https://remote-agent.com") as client:
    response = await client.send_message("Hello from another agent!")

Model Context Protocol (MCP)

AI-Parrot has first-class support for MCP.

Consume MCP Servers:

from parrot.mcp import MCPServerConfig  # import path is an assumption; adjust to your install

mcp_servers = [
    MCPServerConfig(
        name="filesystem",
        command="npx",
        args=["-y", "@modelcontextprotocol/server-filesystem", "/home/user"]
    )
]
await bot.setup_mcp_servers(mcp_servers)

Expose Agent as MCP Server: Allow Claude Desktop or other MCP clients to use your agent as a tool.
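
One concrete way to do this today is the parrot mcp command shown in Quick Setup; a sketch, assuming your server.yaml points at the agent you want to publish:

# serve your agent as an MCP server from a YAML config
parrot mcp --config server.yaml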

Platform Integrations

Expose your bots natively to chat platforms:

  • Telegram
  • Microsoft Teams
  • Slack
  • WhatsApp

Optional capabilities

Dev-Loop Orchestration

Optional. Requires the [claude-agent] extra: uv pip install "ai-parrot[claude-agent]"

A 5-node AgentsFlow that fixes "small operational bugs" automatically:

BugIntake → Research → Development → QA → DeploymentHandoff
                                       │
                                       └─(qa failed / hard error)→ FailureHandler

The flow takes a Pydantic BugBrief (Jira ticket + log sources + acceptance criteria) and produces a PR plus a Jira ticket transitioned to "Ready to Deploy". Failures escalate back to the original reporter.
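
A sketch of constructing that input (BugBrief is the real model name per the text above, but every field name below is hypothetical; check parrot.flows.dev_loop for the actual schema):

from parrot.flows.dev_loop import BugBrief  # import path assumed

brief = BugBrief(
    jira_ticket="OPS-1234",                       # hypothetical field: the Jira ticket
    log_sources=["cloudwatch", "elasticsearch"],  # hypothetical field: where to pull logs
    acceptance_criteria=["pytest tests/ -q"],     # hypothetical field: command heads must be allowlisted
)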

Prerequisites

  • Python 3.11+ with ai-parrot[claude-agent] installed.
  • claude-agent-sdk >= 0.1.68 and either ANTHROPIC_API_KEY or a configured claude CLI on PATH.
  • Redis 6+ for two-stream observability (one stream per flow run plus one per dispatch).
  • Jira service-account credentials wrapped in a parrot.auth.credentials.StaticCredentialResolver.
  • (Optional) gh CLI for PR creation. Falls back to a direct GitHub REST call (using GITHUB_TOKEN + GITHUB_REPOSITORY) when the CLI is missing.

Configuration (navconfig)

Setting Default Purpose
CLAUDE_CODE_MAX_CONCURRENT_DISPATCHES 3 Cap on concurrent Claude Code dispatches (dispatcher-side semaphore).
FLOW_MAX_CONCURRENT_RUNS 5 Cap on concurrent flow runs (orchestrator-side).
FLOW_BOT_JIRA_ACCOUNT_ID "" Jira accountId of the service-account bot. Must be set per environment.
WORKTREE_BASE_PATH .claude/worktrees Base directory for per-feature worktrees. The dispatcher refuses any cwd outside this path.
FLOW_STREAM_TTL_SECONDS 604800 Retention for both flow and dispatch Redis streams (7 days).
ACCEPTANCE_CRITERION_ALLOWLIST ["flowtask","pytest","ruff","mypy","pylint"] Allowed ShellCriterion command heads. Validated at intake.

Quickstart

from parrot.flows.dev_loop import (
    ClaudeCodeDispatcher,
    build_dev_loop_flow,
    register_pull_request_webhook,
)

dispatcher = ClaudeCodeDispatcher(
    max_concurrent=3,
    redis_url="redis://localhost:6379/0",
    stream_ttl_seconds=604800,
)
flow = build_dev_loop_flow(
    dispatcher=dispatcher,
    jira_toolkit=jira,                 # already wrapping flow-bot creds
    log_toolkits={"cloudwatch": cw, "elasticsearch": es},
    redis_url="redis://localhost:6379/0",
)
register_pull_request_webhook(orchestrator, secret=GITHUB_WEBHOOK_SECRET)
# Then run via your AutonomousOrchestrator with a BugBrief in ctx.

Live observability

The dispatcher publishes per-event DispatchEvent envelopes (queued, started, message, tool_use, tool_result, output_invalid, failed, completed) to Redis Streams. The parrot.flows.dev_loop.flow_stream_ws aiohttp handler exposes a WebSocket endpoint that fans in the flow stream and every dispatch stream, emitting a single envelope per event for the UI to consume; the UI never speaks Redis directly.
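
Consuming that endpoint is plain WebSocket traffic; a minimal sketch with aiohttp (the URL is illustrative, since the mount point depends on how you register flow_stream_ws):

import asyncio
import aiohttp

async def watch(run_id: str) -> None:
    async with aiohttp.ClientSession() as session:
        # illustrative path; use whatever route flow_stream_ws is mounted on
        async with session.ws_connect(f"ws://localhost:5000/ws/flows/{run_id}") as ws:
            async for msg in ws:
                print(msg.json())  # one DispatchEvent envelope per event

asyncio.run(watch("my-run-id"))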


Supported LLM Providers

Provider Extra Identifier Example
OpenAI openai openai openai:gpt-4o
Anthropic anthropic anthropic, claude anthropic:claude-sonnet-4-20250514
Google Gemini google google google:gemini-3.1-flash-lite-preview
Groq groq groq groq:llama-3.3-70b-versatile
X.AI / Grok xai grok grok:grok-3
HuggingFace (included) hf hf:meta-llama/Llama-3-8B
vLLM (included) vllm vllm:model-name
OpenRouter (included) openrouter openrouter:anthropic/claude-sonnet-4
Ollama (included) via OpenAI endpoint

Contributing

Development setup (from source)

AI-Parrot uses uv as its package manager and provides a Makefile to simplify common tasks.

git clone https://github.com/phenobarbital/ai-parrot.git
cd ai-parrot

# Create the virtual environment (Python 3.11)
make venv
source .venv/bin/activate

# Full dev install — all packages, all extras, dev tools
make develop

# Run tests
make test

Makefile targets

The Makefile covers the entire development lifecycle. Run make help for the full list.

Development install variants:

Target What it installs
make develop All packages + all extras + dev tools (full environment)
make develop-fast All packages, base deps only (no torch/tensorflow/whisperx)
make develop-ml Embeddings + audio loaders (heavy ML stack)

Production install variants:

Target What it installs
make install All packages, base deps only (no extras)
make install-core Core with LLM clients + vector stores
make install-tools Core + tools with common extras (jira, slack, aws, etc.)
make install-tools-all Core + tools with ALL extras
make install-loaders Core + loaders with common extras (youtube, web, pdf)
make install-loaders-all Core + loaders with ALL extras (includes whisperx, pyannote)
make install-all Everything with ALL extras

Other useful targets:

make format          # Format code with black
make lint            # Lint with pylint + black --check
make test            # Run pytest + mypy
make build           # Build all packages (sdist + wheel)
make release         # Build + publish to PyPI
make lock            # Regenerate uv.lock
make clean           # Remove build artifacts
make generate-registry  # Regenerate TOOL_REGISTRY from source
make bump-patch      # Bump patch version (syncs across all packages)

Manual install (without Make)

If you prefer not to use Make:

uv venv --python 3.11 .venv
source .venv/bin/activate

# Full install
uv sync --all-packages --all-extras

# Or selective extras
uv sync --extra google --extra openai

Project layout

ai-parrot/
├── packages/
│   ├── ai-parrot/           # Core framework
│   │   └── src/parrot/
│   ├── ai-parrot-tools/     # Tool implementations
│   │   └── src/parrot_tools/
│   └── ai-parrot-loaders/   # Document loaders
│       └── src/parrot_loaders/
├── tests/
├── examples/
├── Makefile                  # Build, install, test, release shortcuts
└── pyproject.toml            # Workspace root

Releasing to PyPI

AI-Parrot publishes three packages on every GitHub release:

Package PyPI Project Build Method
ai-parrot ai-parrot cibuildwheel (Cython + Rust/Maturin)
ai-parrot-tools ai-parrot-tools uv build (pure Python)
ai-parrot-loaders ai-parrot-loaders uv build (pure Python)

The release workflow (.github/workflows/release.yml) runs 3 parallel build jobs and a single deploy job:

release event
    ├── build-core   — cibuildwheel for ai-parrot (Cython + Rust)
    ├── build-tools  — uv build for ai-parrot-tools
    ├── build-loaders — uv build for ai-parrot-loaders
    └── deploy       — twine upload all artifacts to PyPI

To create a release:

  1. Bump the version in each package's pyproject.toml (or use make bump-patch to sync all three).
  2. Create a GitHub release — the workflow triggers automatically on the release: created event.

First-time PyPI setup (required once):

  • Create ai-parrot-tools and ai-parrot-loaders projects on PyPI under the same account as ai-parrot.
  • Ensure the NAV_AIPARROT_API_SECRET GitHub secret holds a PyPI API token with upload scope for all 3 projects. A scoped token per project or a single account-level token both work.

Independent versioning:

Each package has its own version number in its pyproject.toml. All three are built and published on the same release event — there is no requirement to keep versions in sync.


Guidelines

  • All code must be async-first — no blocking I/O in async contexts
  • Use type hints and Google-style docstrings on all public APIs
  • Use Pydantic models for structured data
  • Run pytest after any logic change
  • Tools with heavy dependencies must use lazy imports to avoid bloating the core

Issues & Support


DB-Persisted Bot Configuration (FEAT-133)

Added in FEAT-133. Spec: sdd/specs/bot-reranker-and-parent-searcher-config.spec.md.

BotManager._load_database_bots reads two new JSONB columns on navigator.ai_bots and wires the resulting objects into every DB-loaded bot at startup. Both columns default to '{}'::JSONB, so existing rows are unaffected.

Reranker config (reranker_config)

Controls cross-encoder or LLM-based reranking of vector-store candidates (FEAT-126). Empty {} means no reranking.

Local cross-encoder (no live LLM call; requires sentence-transformers):

{
  "type": "local_cross_encoder",
  "model_name": "cross-encoder/ms-marco-MiniLM-L-12-v2",
  "device": "cpu",
  "rerank_oversample_factor": 4
}

LLM reranker (uses the bot's own LLM client):

{
  "type": "llm",
  "client_ref": "bot",
  "rerank_oversample_factor": 4
}

Factory: parrot.rerankers.factory.create_reranker(config, *, bot_llm_client=None)
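
A sketch of calling the factory with the LLM-reranker config above (the bot attribute passed in is an assumption):

from parrot.rerankers.factory import create_reranker

reranker = create_reranker(
    {"type": "llm", "client_ref": "bot", "rerank_oversample_factor": 4},
    bot_llm_client=bot.llm,  # assumption: accessor for the bot's LLM client
)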

Parent-searcher config (parent_searcher_config)

Controls parent-document expansion after vector search (FEAT-128). Empty {} means no expansion.

In-table parent search (chunk row has a parent_id FK in the same table):

{
  "type": "in_table",
  "expand_to_parent": true
}

expand_to_parent is also forwarded as a constructor kwarg so the bot's retrieval logic can branch on it before calling the searcher.

Factory: parrot.stores.parents.factory.create_parent_searcher(config, *, store)

store (bot.store) becomes available only after await bot.configure(app). The factory is therefore called after configure(), not before.

Error handling

An unknown type value raises parrot.exceptions.ConfigError at bot startup. The bot is not silently registered without its configured features.

# A row with {"type": "magic"} in reranker_config will raise:
# ConfigError: unknown reranker type 'magic'; supported: local_cross_encoder, llm

Order of operations

reranker = create_reranker(reranker_config)        # before bot construction
bot = BotClass(..., reranker=reranker, expand_to_parent=...)
await bot.configure(app)                           # store becomes available
parent_searcher = create_parent_searcher(config, store=bot.store)
bot.parent_searcher = parent_searcher

License

MIT


Built with care by the AI-Parrot Team
