AI-Parrot

Framework for building AI agents for Navigator

AI-Parrot is an async-first Python framework for building, extending, and orchestrating AI Agents and Chatbots. Built on top of navigator-api, it provides a unified interface for interacting with various LLM providers, managing tools, conducting agent-to-agent (A2A) communication, and serving agents via the Model Context Protocol (MCP).

Whether you need a simple chatbot, a complex multi-agent orchestration workflow, or a robust production-ready AI service, AI-Parrot exposes the primitives to build it efficiently.

Monorepo Structure

AI-Parrot is organized as a monorepo with four packages:

Package PyPI Name Description
packages/ai-parrot ai-parrot Core framework: agents, clients, memory, orchestration
packages/ai-parrot-tools ai-parrot-tools Tool and toolkit implementations (Jira, AWS, Slack, etc.)
packages/ai-parrot-loaders ai-parrot-loaders Document loaders for RAG pipelines (PDF, YouTube, audio, etc.)
packages/ai-parrot-pipelines ai-parrot-pipelines Specialized pipelines such as planogram compliance workflows

The core package (ai-parrot) provides the base abstractions (AbstractTool, AbstractToolkit, @tool) and lightweight built-in tools. Heavy tool implementations, document loaders, and specialized pipelines are split into their own packages so you only install what you need.


Installation

Core framework

uv pip install ai-parrot

Quick Setup (CLI)

After installing, use the parrot CLI to configure your environment interactively:

# Interactive setup wizard — select LLM provider, enter API keys, generate .env
parrot setup

# Initialize configuration directory structure (env/ and etc/)
parrot conf init

The parrot setup wizard will guide you through:

  1. Selecting an LLM provider (OpenAI, Anthropic, Google, etc.)
  2. Entering your API credentials
  3. Writing them to the correct .env file
  4. Optionally creating a starter Agent and bootstrap files (app.py, run.py)

Additional CLI commands:

# Start an MCP server from a YAML config
parrot mcp --config server.yaml

# Deploy an autonomous agent as a systemd service
parrot autonomous create --agent my_agent.py
parrot autonomous install --agent my_agent.py --name my-agent

LLM Providers

Install only the providers you need:

# Google Gemini
uv pip install "ai-parrot[google]"

# OpenAI / GPT
uv pip install "ai-parrot[openai]"

# Anthropic / Claude (HTTP API client)
uv pip install "ai-parrot[anthropic]"

# Claude Code agent dispatch (bundled `claude` CLI subprocess)
uv pip install "ai-parrot[claude-agent]"

# Groq
uv pip install "ai-parrot[groq]"

# X.AI / Grok
uv pip install "ai-parrot[xai]"

# All LLM providers at once
uv pip install "ai-parrot[llms]"

Additional providers supported out of the box (no extra install needed):

  • HuggingFace (hf) — uses the HuggingFace Inference API
  • vLLM (vllm) — connects to a local vLLM server
  • OpenRouter (openrouter) — routes to any model via OpenRouter API
  • Ollama / Local — via OpenAI-compatible endpoints
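
These providers are selected with the same provider:model identifier string used throughout this README. As a minimal sketch (the model name is a placeholder, and any server-endpoint settings that vLLM or Ollama need are assumed to come from your environment/configuration):

from parrot.bots import Chatbot

# "provider:model" identifier, as in the Supported LLM Providers table below.
# The model name is a placeholder for whatever your local vLLM server serves.
bot = Chatbot(
    name="LocalBot",
    llm="vllm:my-local-model",
    system_prompt="You are a helpful assistant."
)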

Anthropic: API client vs. Claude Code agent dispatch

Anthropic ships in two independent extras — pick the one(s) you need:

Extra Installs Use case
ai-parrot[anthropic] anthropic[aiohttp]>=0.97.0 API client (AnthropicClient) — completion, vision, streaming, the Messages Batches API. Talks HTTP to api.anthropic.com.
ai-parrot[claude-agent] claude-agent-sdk>=0.1.68 (which bundles the claude CLI) Agent dispatch (ClaudeAgentClient) — delegates a task to a Claude Code sub-agent that can read files, run bash, call tools. Talks to a subprocess CLI.

The two extras are independent. Install only what you use:

# I just want to call the Anthropic API:
uv pip install "ai-parrot[anthropic]"

# I want to dispatch tasks to a Claude Code agent:
uv pip install "ai-parrot[claude-agent]"

# I want both:
uv pip install "ai-parrot[anthropic,claude-agent]"

After installing [claude-agent], register/authenticate the bundled CLI once — either run claude auth interactively, or export ANTHROPIC_API_KEY in the environment. The CLI honours either path.

A runnable demo lives in examples/clients/claude_agent_example.py.

Embeddings & Vector Stores

# Sentence transformers, FAISS, ChromaDB, etc.
uv pip install "ai-parrot[embeddings]"

Tools

# Install the tools package
uv pip install ai-parrot-tools

# Or with specific tool extras
uv pip install "ai-parrot-tools[jira]"
uv pip install "ai-parrot-tools[aws]"
uv pip install "ai-parrot-tools[slack]"
uv pip install "ai-parrot-tools[finance]"
uv pip install "ai-parrot-tools[all]"       # All tool dependencies

Available tool extras: jira, slack, aws, docker, git, analysis, excel, sandbox, codeinterpreter, pulumi, sitesearch, office365, scraping, finance, db, flowtask, google, arxiv, wikipedia, weather, messaging.

Document Loaders

# Install the loaders package
uv pip install ai-parrot-loaders

# Or with specific loader extras
uv pip install "ai-parrot-loaders[youtube]"
uv pip install "ai-parrot-loaders[pdf]"
uv pip install "ai-parrot-loaders[audio]"
uv pip install "ai-parrot-loaders[all]"     # All loader dependencies

Available loader extras: youtube, audio, pdf, web, ebook, video.

Pipelines

# Install the pipelines package
uv pip install ai-parrot-pipelines

Backward-compatible imports from parrot.pipelines continue to work when the package is installed.

Platform & Security Tools

AI-Parrot includes tools for cloud security auditing and infrastructure management. These tools rely on external Docker images that must be installed before use:

# Security tools
parrot install cloudsploit    # AWS security scanner (CloudSploit)
parrot install prowler        # Cloud security posture management

# Platform tools
parrot install pulumi         # Infrastructure as Code CLI

The parrot install command pulls and configures the required Docker containers automatically, so the tools are ready to be used by your agents.


Quick Start

Create a simple weather chatbot in just a few lines of code:

import asyncio
from parrot.bots import Chatbot
from parrot.tools import tool

# 1. Define a tool
@tool
def get_weather(location: str) -> str:
    """Get the current weather for a location."""
    return f"The weather in {location} is Sunny, 25C"

async def main():
    # 2. Create the Agent
    bot = Chatbot(
        name="WeatherBot",
        llm="openai:gpt-4o",  # Provider:Model
        tools=[get_weather],
        system_prompt="You are a helpful weather assistant."
    )

    # 3. Configure (loads tools, connects to memory)
    await bot.configure()

    # 4. Chat!
    response = await bot.ask("What's the weather like in Madrid?")
    print(response)

if __name__ == "__main__":
    asyncio.run(main())

Using LLM Clients Directly

Beyond the Chatbot abstraction, you can access any LLM provider client directly for lower-level operations like image generation, embeddings, or custom completion calls:

import asyncio
from parrot.clients.google.client import GoogleGenAIClient
from parrot.models.outputs import ImageGenerationPrompt
from parrot.models.google import GoogleModel

async def main():
    prompt = ImageGenerationPrompt(
        prompt="A realistic passport-style photo with white background",
        styles=["photorealistic", "high resolution"],
        model=GoogleModel.IMAGEN_3.value,
        aspect_ratio="16:9",
    )

    client = GoogleGenAIClient()
    async with client:
        response = await client.image_generation(prompt_data=prompt)
        for img_path in response.images:
            print(f"Image saved to: {img_path}")

if __name__ == "__main__":
    asyncio.run(main())

Each provider client (GoogleGenAIClient, OpenAIClient, AnthropicClient, etc.) implements AbstractClient and can be used as an async context manager. This gives you full access to provider-specific features — image generation, audio transcription, structured outputs — while still benefiting from AI-Parrot's unified configuration and credential management.


Running as a Server

AI-Parrot is not only a library — it is also a full aiohttp-based application server that exposes your agents as REST APIs, WebSocket endpoints, and more. This is powered by Navigator, an async web framework built on aiohttp.

How it works

When you run parrot setup, it generates two files:

  • app.py — Defines your application handler, registers agents with BotManager, and configures routes.
  • run.py — The entry point that starts the aiohttp server.

app.py (generated by parrot setup):

from parrot.manager import BotManager
from parrot.conf import STATIC_DIR
from parrot.handlers import AppHandler
from agents.my_agent import MyAgent


class Main(AppHandler):
    app_name: str = "Parrot"
    enable_static: bool = True
    staticdir: str = STATIC_DIR

    def configure(self) -> None:
        self.bot_manager = BotManager()
        self.bot_manager.register(MyAgent())
        self.bot_manager.setup(self.app)

run.py (generated by parrot setup):

from navigator import Application
from app import Main

app = Application(Main, enable_jinja2=True)

if __name__ == "__main__":
    app.run()

Built-in endpoints

Once the server starts, BotManager.setup() automatically registers these routes:

Endpoint Method Description
/api/v1/agents/chat/{agent_id} POST Chat with an agent (JSON, HTML, or Markdown response)
/api/v1/agents/chat/{agent_id} PATCH Configure tools/MCP servers for a session
/api/v1/bot_management GET List registered bots
/api/v1/bot_management/{bot} GET/POST/PATCH/DELETE CRUD operations on bots
/api/v1/agent_tools GET List available tools
/api/v1/ai/client GET LLM provider configuration
/ws/userinfo WebSocket Real-time user notifications

Starting the server

Development (single process, auto-reload):

python run.py

The server starts on http://0.0.0.0:5000 by default (configurable via APP_HOST / APP_PORT environment variables).

Production (Gunicorn with async workers):

# Install gunicorn
uv pip install "ai-parrot[deploy]"

# Run with aiohttp-compatible workers
gunicorn run:app \
    --worker-class aiohttp.worker.GunicornUVLoopWebWorker \
    --workers 4 \
    --bind 0.0.0.0:5000 \
    --timeout 360

The long timeout (360s) accommodates agent queries that involve multi-step tool execution or LLM calls.

Talking to your agents via REST

Once the server is running, any registered agent is accessible via HTTP:

# Chat with an agent
curl -X POST http://localhost:5000/api/v1/agents/chat/my-agent \
  -H "Content-Type: application/json" \
  -d '{"message": "What is the weather in Madrid?"}'

# Request markdown output
curl -X POST "http://localhost:5000/api/v1/agents/chat/my-agent?output_format=markdown" \
  -H "Content-Type: application/json" \
  -d '{"message": "Summarize the latest news"}'

Architecture

AI-Parrot has a modular architecture in which agents can act as both consumers and providers of tools and services.

graph TD
    User["User / Client"] --> API["AgentTalk Handlers"]
    API --> Bot["Chatbot / BaseBot"]

    subgraph "Agent Core"
        Bot --> Memory["Memory / Vector Store"]
        Bot --> LLM["LLM Client (OpenAI/Anthropic/Etc)"]
        Bot --> TM["Tool Manager"]
    end

    subgraph "Tools & Capabilities"
        TM --> LocalTools["Local Tools (@tool)"]
        TM --> Toolkits["Toolkits (OpenAPI/Custom)"]
        TM --> MCPServer["External MCP Servers"]
    end

    subgraph "Connectivity"
        Bot -.-> A2A["A2A Protocol (Client/Server)"]
        Bot -.-> MCP["MCP Protocol (Server)"]
        Bot -.-> Integrations["Telegram / MS Teams"]
    end

    subgraph "Orchestration"
        Crew["AgentCrew"] --> Bot
        Crew --> OtherBots["Other Agents"]
    end

Core Concepts

Agents (Chatbot)

The Chatbot class is your main entry point. It handles conversation history, RAG (Retrieval-Augmented Generation), and the tool execution loop.

bot = Chatbot(
    name="MyAgent",
    model="anthropic:claude-3-5-sonnet-20240620",
    enable_memory=True
)

Tools

Functional Tools (@tool)

The simplest way to create a tool. The docstring and type hints are automatically used to generate the schema for the LLM.

from parrot.tools import tool

@tool
def calculate_vat(amount: float, rate: float = 0.20) -> float:
    """Calculate VAT for a given amount."""
    return amount * rate

Class-Based Toolkits (AbstractToolkit)

Group related tools into a reusable class. All public async methods become tools.

from parrot.tools import AbstractToolkit

class MathToolkit(AbstractToolkit):
    async def add(self, a: int, b: int) -> int:
        """Add two numbers."""
        return a + b

    async def multiply(self, a: int, b: int) -> int:
        """Multiply two numbers."""
        return a * b
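
To hand the toolkit to an agent, one option (assuming AbstractToolkit subclasses expose the same get_tools() helper shown in the OpenAPIToolkit example below) is:

# Assumes MathToolkit inherits get_tools() from AbstractToolkit,
# mirroring the OpenAPIToolkit pattern below.
math_bot = Chatbot(name="MathBot", tools=MathToolkit().get_tools())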

OpenAPI Toolkit (OpenAPIToolkit)

Dynamically generate tools from any OpenAPI/Swagger specification.

from parrot.tools import OpenAPIToolkit

petstore = OpenAPIToolkit(
    spec="https://petstore.swagger.io/v2/swagger.json",
    service="petstore"
)

# Now your agent can call petstore_get_pet_by_id, etc.
bot = Chatbot(name="PetBot", tools=petstore.get_tools())

Orchestration (AgentCrew)

Orchestrate multiple agents to solve complex tasks using AgentCrew.

Supported Modes:

  • Sequential: Agents run one after another, passing context.
  • Parallel: Independent tasks run concurrently.
  • Flow: DAG-based execution defined by dependencies.
  • Loop: Iterative execution until a condition is met.

from parrot.bots.orchestration import AgentCrew

crew = AgentCrew(
    name="ResearchTeam",
    agents=[researcher_agent, writer_agent]
)

# Define a Flow — Writer waits for Researcher to finish
crew.task_flow(researcher_agent, writer_agent)

await crew.run_flow("Research the latest advancements in Quantum Computing")

Scheduling (@schedule)

Give your agents agency to run tasks in the background.

from parrot.scheduler import schedule, ScheduleType

class DailyBot(Chatbot):
    @schedule(schedule_type=ScheduleType.DAILY, hour=9, minute=0)
    async def morning_briefing(self):
        news = await self.ask("Summarize today's top tech news")
        await self.send_notification(news)

Connectivity & Exposure

Agent-to-Agent (A2A) Protocol

Agents can discover and talk to each other using the A2A protocol.

Expose an Agent:

from parrot.a2a import A2AServer

a2a = A2AServer(my_agent)
a2a.setup(app, url="https://my-agent.com")

Consume an Agent:

from parrot.a2a import A2AClient

async with A2AClient("https://remote-agent.com") as client:
    response = await client.send_message("Hello from another agent!")

Model Context Protocol (MCP)

AI-Parrot has first-class support for MCP.

Consume MCP Servers:

from parrot.mcp import MCPServerConfig  # assumed import path; adjust to where MCPServerConfig lives in your install

mcp_servers = [
    MCPServerConfig(
        name="filesystem",
        command="npx",
        args=["-y", "@modelcontextprotocol/server-filesystem", "/home/user"]
    )
]
await bot.setup_mcp_servers(mcp_servers)

Expose Agent as MCP Server: Allow Claude Desktop or other MCP clients to use your agent as a tool.

Platform Integrations

Expose your bots natively to chat platforms:

  • Telegram
  • Microsoft Teams
  • Slack
  • WhatsApp

Optional capabilities

Dev-Loop Orchestration

Optional. Requires the [claude-agent] extra: uv pip install "ai-parrot[claude-agent]"

A 5-node AgentsFlow that fixes "small operational bugs" automatically:

BugIntake → Research → Development → QA → DeploymentHandoff
                                       │
                                       └─(qa failed / hard error)→ FailureHandler

The flow takes a Pydantic BugBrief (Jira ticket + log sources + acceptance criteria) and produces a PR plus a Jira ticket transitioned to "Ready to Deploy". Failures escalate back to the original reporter.

Prerequisites

  • Python 3.11+ with ai-parrot[claude-agent] installed.
  • claude-agent-sdk >= 0.1.68 and either ANTHROPIC_API_KEY or a configured claude CLI on PATH.
  • Redis 6+ for two-stream observability (one stream per flow run plus one per dispatch).
  • Jira service-account credentials wrapped in a parrot.auth.credentials.StaticCredentialResolver.
  • (Optional) gh CLI for PR creation. Falls back to a direct GitHub REST call (using GITHUB_TOKEN + GITHUB_REPOSITORY) when the CLI is missing.

Configuration (navconfig)

Setting Default Purpose
CLAUDE_CODE_MAX_CONCURRENT_DISPATCHES 3 Cap on concurrent Claude Code dispatches (dispatcher-side semaphore).
FLOW_MAX_CONCURRENT_RUNS 5 Cap on concurrent flow runs (orchestrator-side).
FLOW_BOT_JIRA_ACCOUNT_ID "" Jira accountId of the service-account bot. Must be set per environment.
WORKTREE_BASE_PATH .claude/worktrees Base directory for per-feature worktrees. The dispatcher refuses any cwd outside this path.
FLOW_STREAM_TTL_SECONDS 604800 Retention for both flow and dispatch Redis streams (7 days).
ACCEPTANCE_CRITERION_ALLOWLIST ["flowtask","pytest","ruff","mypy","pylint"] Allowed ShellCriterion command heads. Validated at intake.

Quickstart

from parrot.flows.dev_loop import (
    ClaudeCodeDispatcher,
    build_dev_loop_flow,
    register_pull_request_webhook,
)

dispatcher = ClaudeCodeDispatcher(
    max_concurrent=3,
    redis_url="redis://localhost:6379/0",
    stream_ttl_seconds=604800,
)
flow = build_dev_loop_flow(
    dispatcher=dispatcher,
    jira_toolkit=jira,                 # already wrapping flow-bot creds
    log_toolkits={"cloudwatch": cw, "elasticsearch": es},
    redis_url="redis://localhost:6379/0",
)
register_pull_request_webhook(orchestrator, secret=GITHUB_WEBHOOK_SECRET)
# Then run via your AutonomousOrchestrator with a BugBrief in ctx.

Live observability

The dispatcher publishes per-event DispatchEvent envelopes (queued, started, message, tool_use, tool_result, output_invalid, failed, completed) to Redis Streams. The parrot.flows.dev_loop.flow_stream_ws aiohttp handler exposes a WebSocket endpoint that fans in the flow stream and every dispatch stream, emitting a single envelope per event for the UI to consume — the UI never speaks Redis directly.
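
As a sketch of a consumer, assuming the handler is mounted at a hypothetical /ws/dev_loop/{run_id} route in your aiohttp app (the actual path is whatever you register flow_stream_ws under):

import asyncio
import json
import aiohttp

# Hypothetical route; use the path you mounted flow_stream_ws on.
WS_URL = "ws://localhost:5000/ws/dev_loop/my-run-id"

async def watch_run():
    async with aiohttp.ClientSession() as session:
        async with session.ws_connect(WS_URL) as ws:
            async for msg in ws:
                if msg.type == aiohttp.WSMsgType.TEXT:
                    # One envelope per event: queued, started, message,
                    # tool_use, tool_result, output_invalid, failed, completed.
                    print(json.loads(msg.data))

asyncio.run(watch_run())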


Supported LLM Providers

Provider Extra Identifier Example
OpenAI openai openai openai:gpt-4o
Anthropic anthropic anthropic, claude anthropic:claude-sonnet-4-20250514
Google Gemini google google google:gemini-3.1-flash-lite-preview
Groq groq groq groq:llama-3.3-70b-versatile
X.AI / Grok xai grok grok:grok-3
HuggingFace (included) hf hf:meta-llama/Llama-3-8B
vLLM (included) vllm vllm:model-name
OpenRouter (included) openrouter openrouter:anthropic/claude-sonnet-4
Ollama (included) via OpenAI endpoint

Contributing

Development setup (from source)

AI-Parrot uses uv as its package manager and provides a Makefile to simplify common tasks.

git clone https://github.com/phenobarbital/ai-parrot.git
cd ai-parrot

# Create the virtual environment (Python 3.11)
make venv
source .venv/bin/activate

# Full dev install — all packages, all extras, dev tools
make develop

# Run tests
make test

Makefile targets

The Makefile covers the entire development lifecycle. Run make help for the full list.

Development install variants:

Target What it installs
make develop All packages + all extras + dev tools (full environment)
make develop-fast All packages, base deps only (no torch/tensorflow/whisperx)
make develop-ml Embeddings + audio loaders (heavy ML stack)

Production install variants:

Target What it installs
make install All packages, base deps only (no extras)
make install-core Core with LLM clients + vector stores
make install-tools Core + tools with common extras (jira, slack, aws, etc.)
make install-tools-all Core + tools with ALL extras
make install-loaders Core + loaders with common extras (youtube, web, pdf)
make install-loaders-all Core + loaders with ALL extras (includes whisperx, pyannote)
make install-all Everything with ALL extras

Other useful targets:

make format          # Format code with black
make lint            # Lint with pylint + black --check
make test            # Run pytest + mypy
make build           # Build all packages (sdist + wheel)
make release         # Build + publish to PyPI
make lock            # Regenerate uv.lock
make clean           # Remove build artifacts
make generate-registry  # Regenerate TOOL_REGISTRY from source
make bump-patch      # Bump patch version (syncs across all packages)

Manual install (without Make)

If you prefer not to use Make:

uv venv --python 3.11 .venv
source .venv/bin/activate

# Full install
uv sync --all-packages --all-extras

# Or selective extras
uv sync --extra google --extra openai

Project layout

ai-parrot/
├── packages/
│   ├── ai-parrot/           # Core framework
│   │   └── src/parrot/
│   ├── ai-parrot-tools/     # Tool implementations
│   │   └── src/parrot_tools/
│   └── ai-parrot-loaders/   # Document loaders
│       └── src/parrot_loaders/
├── tests/
├── examples/
├── Makefile                  # Build, install, test, release shortcuts
└── pyproject.toml            # Workspace root

Releasing to PyPI

AI-Parrot publishes three packages on every GitHub release:

Package PyPI Project Build Method
ai-parrot ai-parrot cibuildwheel (Cython + Rust/Maturin)
ai-parrot-tools ai-parrot-tools uv build (pure Python)
ai-parrot-loaders ai-parrot-loaders uv build (pure Python)

The release workflow (.github/workflows/release.yml) runs 3 parallel build jobs and a single deploy job:

release event
    ├── build-core   — cibuildwheel for ai-parrot (Cython + Rust)
    ├── build-tools  — uv build for ai-parrot-tools
    ├── build-loaders — uv build for ai-parrot-loaders
    └── deploy       — twine upload all artifacts to PyPI

To create a release:

  1. Bump the version in each package's pyproject.toml (or use make bump-patch to sync all three).
  2. Create a GitHub release — the workflow triggers automatically on the release: created event.

First-time PyPI setup (required once):

  • Create ai-parrot-tools and ai-parrot-loaders projects on PyPI under the same account as ai-parrot.
  • Ensure the NAV_AIPARROT_API_SECRET GitHub secret holds a PyPI API token with upload scope for all 3 projects. A scoped token per project or a single account-level token both work.

Independent versioning:

Each package has its own version number in its pyproject.toml. All three are built and published on the same release event — there is no requirement to keep versions in sync.


Guidelines

  • All code must be async-first — no blocking I/O in async contexts
  • Use type hints and Google-style docstrings on all public APIs
  • Use Pydantic models for structured data
  • Run pytest after any logic change
  • Tools with heavy dependencies must use lazy imports to avoid bloating the core

Issues & Support


DB-Persisted Bot Configuration (FEAT-133)

Added in FEAT-133. Spec: sdd/specs/bot-reranker-and-parent-searcher-config.spec.md.

BotManager._load_database_bots reads two new JSONB columns on navigator.ai_bots and wires the resulting objects into every DB-loaded bot at startup. Both columns default to '{}'::JSONB, so existing rows are unaffected.

Reranker config (reranker_config)

Controls cross-encoder or LLM-based reranking of vector-store candidates (FEAT-126). Empty {} means no reranking.

Local cross-encoder (no live LLM call; requires sentence-transformers):

{
  "type": "local_cross_encoder",
  "model_name": "cross-encoder/ms-marco-MiniLM-L-12-v2",
  "device": "cpu",
  "rerank_oversample_factor": 4
}

LLM reranker (uses the bot's own LLM client):

{
  "type": "llm",
  "client_ref": "bot",
  "rerank_oversample_factor": 4
}

Factory: parrot.rerankers.factory.create_reranker(config, *, bot_llm_client=None)
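
A minimal sketch of calling the factory with the local cross-encoder config above (the bot_llm_client keyword comes from the signature shown; the bot.llm attribute in the comment is an assumption):

from parrot.rerankers.factory import create_reranker

reranker_config = {
    "type": "local_cross_encoder",
    "model_name": "cross-encoder/ms-marco-MiniLM-L-12-v2",
    "device": "cpu",
    "rerank_oversample_factor": 4,
}
reranker = create_reranker(reranker_config)

# For "type": "llm", pass the bot's own client instead, e.g.:
# reranker = create_reranker(llm_config, bot_llm_client=bot.llm)  # bot.llm is an assumed attribute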

Parent-searcher config (parent_searcher_config)

Controls parent-document expansion after vector search (FEAT-128). Empty {} means no expansion.

In-table parent search (chunk row has a parent_id FK in the same table):

{
  "type": "in_table",
  "expand_to_parent": true
}

expand_to_parent is also forwarded as a constructor kwarg so the bot's retrieval logic can branch on it before calling the searcher.

Factory: parrot.stores.parents.factory.create_parent_searcher(config, *, store)

store (bot.store) becomes available only after await bot.configure(app). The factory is therefore called after configure(), not before.

Error handling

An unknown type value raises parrot.exceptions.ConfigError at bot startup. The bot is not silently registered without its configured features.

# A row with {"type": "magic"} in reranker_config will raise:
# ConfigError: unknown reranker type 'magic'; supported: local_cross_encoder, llm

Order of operations

from parrot.rerankers.factory import create_reranker
from parrot.stores.parents.factory import create_parent_searcher

reranker = create_reranker(reranker_config)                  # before bot construction
bot = BotClass(..., reranker=reranker, expand_to_parent=...)
await bot.configure(app)                                     # store becomes available
parent_searcher = create_parent_searcher(parent_searcher_config, store=bot.store)
bot.parent_searcher = parent_searcher

License

MIT


Built with care by the AI-Parrot Team
