AI-Parrot

Framework for building AI agents for Navigator

AI-Parrot is an async-first Python framework for building, extending, and orchestrating AI Agents and Chatbots. Built on top of navigator-api, it provides a unified interface for interacting with various LLM providers, managing tools, conducting agent-to-agent (A2A) communication, and serving agents via the Model Context Protocol (MCP).

Whether you need a simple chatbot, a complex multi-agent orchestration workflow, or a robust production-ready AI service, AI-Parrot exposes the primitives to build it efficiently.

Monorepo Structure

AI-Parrot is organized as a monorepo with four packages:

| Package | PyPI Name | Description |
|---------|-----------|-------------|
| packages/ai-parrot | ai-parrot | Core framework: agents, clients, memory, orchestration |
| packages/ai-parrot-tools | ai-parrot-tools | Tool and toolkit implementations (Jira, AWS, Slack, etc.) |
| packages/ai-parrot-loaders | ai-parrot-loaders | Document loaders for RAG pipelines (PDF, YouTube, audio, etc.) |
| packages/ai-parrot-pipelines | ai-parrot-pipelines | Specialized pipelines such as planogram compliance workflows |

The core package (ai-parrot) provides the base abstractions (AbstractTool, AbstractToolkit, @tool) and lightweight built-in tools. Heavy tool implementations, document loaders, and specialized pipelines are split into their own packages so you only install what you need.


Installation

Core framework

uv pip install ai-parrot

Quick Setup (CLI)

After installing, use the parrot CLI to configure your environment interactively:

# Interactive setup wizard — select LLM provider, enter API keys, generate .env
parrot setup

# Initialize configuration directory structure (env/ and etc/)
parrot conf init

The parrot setup wizard will guide you through:

  1. Selecting an LLM provider (OpenAI, Anthropic, Google, etc.)
  2. Entering your API credentials
  3. Writing them to the correct .env file
  4. Optionally creating a starter Agent and bootstrap files (app.py, run.py)

Additional CLI commands:

# Start an MCP server from a YAML config
parrot mcp --config server.yaml

# Deploy an autonomous agent as a systemd service
parrot autonomous create --agent my_agent.py
parrot autonomous install --agent my_agent.py --name my-agent

LLM Providers

Install only the providers you need:

# Google Gemini
uv pip install "ai-parrot[google]"

# OpenAI / GPT
uv pip install "ai-parrot[openai]"

# Anthropic / Claude (HTTP API client)
uv pip install "ai-parrot[anthropic]"

# Claude Code agent dispatch (bundled `claude` CLI subprocess)
uv pip install "ai-parrot[claude-agent]"

# Groq
uv pip install "ai-parrot[groq]"

# X.AI / Grok
uv pip install "ai-parrot[xai]"

# All LLM providers at once
uv pip install "ai-parrot[llms]"

Additional providers supported out of the box (no extra install needed):

  • HuggingFace (hf) — uses the HuggingFace Inference API
  • vLLM (vllm) — connects to a local vLLM server
  • OpenRouter (openrouter) — routes to any model via OpenRouter API
  • Ollama / Local — via OpenAI-compatible endpoints
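These built-in providers use the same provider:model identifier convention as the Quick Start below. A minimal sketch (the model name is illustrative):

from parrot.bots import Chatbot

# Point the bot at a local vLLM server using the provider:model
# identifier convention; the model name here is illustrative.
bot = Chatbot(
    name="LocalBot",
    llm="vllm:meta-llama/Llama-3-8B",
    system_prompt="You are a helpful assistant.",
)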

Anthropic: API client vs. Claude Code agent dispatch

Anthropic ships in two independent extras — pick the one(s) you need:

| Extra | Installs | Use case |
|-------|----------|----------|
| ai-parrot[anthropic] | anthropic[aiohttp]>=0.97.0 | API client (AnthropicClient): completion, vision, streaming, and the Messages Batches API. Talks HTTP to api.anthropic.com. |
| ai-parrot[claude-agent] | claude-agent-sdk>=0.1.68 (bundles the claude CLI) | Agent dispatch (ClaudeAgentClient): delegates a task to a Claude Code sub-agent that can read files, run bash, and call tools. Talks to a subprocess CLI. |

Install only what you use:

# I just want to call the Anthropic API:
uv pip install "ai-parrot[anthropic]"

# I want to dispatch tasks to a Claude Code agent:
uv pip install "ai-parrot[claude-agent]"

# I want both:
uv pip install "ai-parrot[anthropic,claude-agent]"

After installing [claude-agent], register/authenticate the bundled CLI once — either run claude auth interactively, or export ANTHROPIC_API_KEY in the environment. The CLI honours either path.

A runnable demo lives in examples/clients/claude_agent_example.py.

Embeddings & Vector Stores

# Sentence transformers, FAISS, ChromaDB, etc.
uv pip install "ai-parrot[embeddings]"

Tools

# Install the tools package
uv pip install ai-parrot-tools

# Or with specific tool extras
uv pip install "ai-parrot-tools[jira]"
uv pip install "ai-parrot-tools[aws]"
uv pip install "ai-parrot-tools[slack]"
uv pip install "ai-parrot-tools[finance]"
uv pip install "ai-parrot-tools[all]"       # All tool dependencies

Available tool extras: jira, slack, aws, docker, git, analysis, excel, sandbox, codeinterpreter, pulumi, sitesearch, office365, scraping, finance, db, flowtask, google, arxiv, wikipedia, weather, messaging.

Document Loaders

# Install the loaders package
uv pip install ai-parrot-loaders

# Or with specific loader extras
uv pip install "ai-parrot-loaders[youtube]"
uv pip install "ai-parrot-loaders[pdf]"
uv pip install "ai-parrot-loaders[audio]"
uv pip install "ai-parrot-loaders[all]"     # All loader dependencies

Available loader extras: youtube, audio, pdf, web, ebook, video.

Pipelines

# Install the pipelines package
uv pip install ai-parrot-pipelines

Backward-compatible imports from parrot.pipelines continue to work when the package is installed.
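For example, the following import works whether the pipelines ship in core or via ai-parrot-pipelines (the module name below is hypothetical):

# The `planogram` module name is hypothetical; any pipeline module keeps
# its parrot.pipelines import path when ai-parrot-pipelines is installed.
from parrot.pipelines import planogram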

Platform & Security Tools

AI-Parrot includes tools for cloud security auditing and infrastructure management. These tools rely on external Docker images that must be installed before use:

# Security tools
parrot install cloudsploit    # AWS security scanner (CloudSploit)
parrot install prowler        # Cloud security posture management

# Platform tools
parrot install pulumi         # Infrastructure as Code CLI

The parrot install command pulls the required Docker images and configures them automatically, so the tools are ready for your agents to use.


Quick Start

Create a simple weather chatbot in just a few lines of code:

import asyncio
from parrot.bots import Chatbot
from parrot.tools import tool

# 1. Define a tool
@tool
def get_weather(location: str) -> str:
    """Get the current weather for a location."""
    return f"The weather in {location} is Sunny, 25C"

async def main():
    # 2. Create the Agent
    bot = Chatbot(
        name="WeatherBot",
        llm="openai:gpt-4o",  # Provider:Model
        tools=[get_weather],
        system_prompt="You are a helpful weather assistant."
    )

    # 3. Configure (loads tools, connects to memory)
    await bot.configure()

    # 4. Chat!
    response = await bot.ask("What's the weather like in Madrid?")
    print(response)

if __name__ == "__main__":
    asyncio.run(main())

Using LLM Clients Directly

Beyond the Chatbot abstraction, you can access any LLM provider client directly for lower-level operations like image generation, embeddings, or custom completion calls:

import asyncio
from parrot.clients.google.client import GoogleGenAIClient
from parrot.models.outputs import ImageGenerationPrompt
from parrot.models.google import GoogleModel

async def main():
    prompt = ImageGenerationPrompt(
        prompt="A realistic passport-style photo with white background",
        styles=["photorealistic", "high resolution"],
        model=GoogleModel.IMAGEN_3.value,
        aspect_ratio="16:9",
    )

    client = GoogleGenAIClient()
    async with client:
        response = await client.image_generation(prompt_data=prompt)
        for img_path in response.images:
            print(f"Image saved to: {img_path}")

if __name__ == "__main__":
    asyncio.run(main())

Each provider client (GoogleGenAIClient, OpenAIClient, AnthropicClient, etc.) implements AbstractClient and can be used as an async context manager. This gives you full access to provider-specific features — image generation, audio transcription, structured outputs — while still benefiting from AI-Parrot's unified configuration and credential management.


Running as a Server

AI-Parrot is not only a library — it is also a full aiohttp-based application server that exposes your agents as REST APIs, WebSocket endpoints, and more. This is powered by Navigator, an async web framework built on aiohttp.

How it works

When you run parrot setup, it generates two files:

  • app.py — Defines your application handler, registers agents with BotManager, and configures routes.
  • run.py — The entry point that starts the aiohttp server.

app.py (generated by parrot setup):

from parrot.manager import BotManager
from parrot.conf import STATIC_DIR
from parrot.handlers import AppHandler
from agents.my_agent import MyAgent


class Main(AppHandler):
    app_name: str = "Parrot"
    enable_static: bool = True
    staticdir: str = STATIC_DIR

    def configure(self) -> None:
        self.bot_manager = BotManager()
        self.bot_manager.register(MyAgent())
        self.bot_manager.setup(self.app)

run.py (generated by parrot setup):

from navigator import Application
from app import Main

app = Application(Main, enable_jinja2=True)

if __name__ == "__main__":
    app.run()

Built-in endpoints

Once the server starts, BotManager.setup() automatically registers these routes:

| Endpoint | Method | Description |
|----------|--------|-------------|
| /api/v1/agents/chat/{agent_id} | POST | Chat with an agent (JSON, HTML, or Markdown response) |
| /api/v1/agents/chat/{agent_id} | PATCH | Configure tools/MCP servers for a session |
| /api/v1/bot_management | GET | List registered bots |
| /api/v1/bot_management/{bot} | GET/POST/PATCH/DELETE | CRUD operations on bots |
| /api/v1/agent_tools | GET | List available tools |
| /api/v1/ai/client | GET | LLM provider configuration |
| /ws/userinfo | WebSocket | Real-time user notifications |

Starting the server

Development (single process, auto-reload):

python run.py

The server starts on http://0.0.0.0:5000 by default (configurable via APP_HOST / APP_PORT environment variables).

Production (Gunicorn with async workers):

# Install gunicorn
uv pip install "ai-parrot[deploy]"

# Run with aiohttp-compatible workers
gunicorn run:app \
    --worker-class aiohttp.worker.GunicornUVLoopWebWorker \
    --workers 4 \
    --bind 0.0.0.0:5000 \
    --timeout 360

The long timeout (360s) accommodates agent queries that involve multi-step tool execution or LLM calls.

Talking to your agents via REST

Once the server is running, any registered agent is accessible via HTTP:

# Chat with an agent
curl -X POST http://localhost:5000/api/v1/agents/chat/my-agent \
  -H "Content-Type: application/json" \
  -d '{"message": "What is the weather in Madrid?"}'

# Request markdown output
curl -X POST "http://localhost:5000/api/v1/agents/chat/my-agent?output_format=markdown" \
  -H "Content-Type: application/json" \
  -d '{"message": "Summarize the latest news"}'

Architecture

AI-Parrot is designed with a modular architecture enabling agents to be both consumers and providers of tools and services.

graph TD
    User["User / Client"] --> API["AgentTalk Handlers"]
    API --> Bot["Chatbot / BaseBot"]

    subgraph "Agent Core"
        Bot --> Memory["Memory / Vector Store"]
        Bot --> LLM["LLM Client (OpenAI/Anthropic/Etc)"]
        Bot --> TM["Tool Manager"]
    end

    subgraph "Tools & Capabilities"
        TM --> LocalTools["Local Tools (@tool)"]
        TM --> Toolkits["Toolkits (OpenAPI/Custom)"]
        TM --> MCPServer["External MCP Servers"]
    end

    subgraph "Connectivity"
        Bot -.-> A2A["A2A Protocol (Client/Server)"]
        Bot -.-> MCP["MCP Protocol (Server)"]
        Bot -.-> Integrations["Telegram / MS Teams"]
    end

    subgraph "Orchestration"
        Crew["AgentCrew"] --> Bot
        Crew --> OtherBots["Other Agents"]
    end

Core Concepts

Agents (Chatbot)

The Chatbot class is your main entry point. It handles conversation history, RAG (Retrieval-Augmented Generation), and the tool execution loop.

bot = Chatbot(
    name="MyAgent",
    model="anthropic:claude-3-5-sonnet-20240620",
    enable_memory=True
)

Tools

Functional Tools (@tool)

The simplest way to create a tool. The docstring and type hints are automatically used to generate the schema for the LLM.

from parrot.tools import tool

@tool
def calculate_vat(amount: float, rate: float = 0.20) -> float:
    """Calculate VAT for a given amount."""
    return amount * rate

Class-Based Toolkits (AbstractToolkit)

Group related tools into a reusable class. All public async methods become tools.

from parrot.tools import AbstractToolkit

class MathToolkit(AbstractToolkit):
    async def add(self, a: int, b: int) -> int:
        """Add two numbers."""
        return a + b

    async def multiply(self, a: int, b: int) -> int:
        """Multiply two numbers."""
        return a * b
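A hedged usage sketch, assuming class-based toolkits expose get_tools() the same way OpenAPIToolkit does below:

# Assumption: AbstractToolkit subclasses expose get_tools(), mirroring
# the OpenAPIToolkit example that follows.
bot = Chatbot(name="MathBot", tools=MathToolkit().get_tools())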

OpenAPI Toolkit (OpenAPIToolkit)

Dynamically generate tools from any OpenAPI/Swagger specification.

from parrot.tools import OpenAPIToolkit

petstore = OpenAPIToolkit(
    spec="https://petstore.swagger.io/v2/swagger.json",
    service="petstore"
)

# Now your agent can call petstore_get_pet_by_id, etc.
bot = Chatbot(name="PetBot", tools=petstore.get_tools())

Orchestration (AgentCrew)

Orchestrate multiple agents to solve complex tasks using AgentCrew.

Supported Modes:

  • Sequential: Agents run one after another, passing context.
  • Parallel: Independent tasks run concurrently.
  • Flow: DAG-based execution defined by dependencies.
  • Loop: Iterative execution until a condition is met.

from parrot.bots.orchestration import AgentCrew

crew = AgentCrew(
    name="ResearchTeam",
    agents=[researcher_agent, writer_agent]
)

# Define a Flow — Writer waits for Researcher to finish
crew.task_flow(researcher_agent, writer_agent)

await crew.run_flow("Research the latest advancements in Quantum Computing")

Scheduling (@schedule)

Give your agents agency to run tasks in the background.

from parrot.scheduler import schedule, ScheduleType

class DailyBot(Chatbot):
    @schedule(schedule_type=ScheduleType.DAILY, hour=9, minute=0)
    async def morning_briefing(self):
        news = await self.ask("Summarize today's top tech news")
        await self.send_notification(news)

Connectivity & Exposure

Agent-to-Agent (A2A) Protocol

Agents can discover and talk to each other using the A2A protocol.

Expose an Agent:

from parrot.a2a import A2AServer

a2a = A2AServer(my_agent)
a2a.setup(app, url="https://my-agent.com")

Consume an Agent:

from parrot.a2a import A2AClient

async with A2AClient("https://remote-agent.com") as client:
    response = await client.send_message("Hello from another agent!")

Model Context Protocol (MCP)

AI-Parrot has first-class support for MCP.

Consume MCP Servers:

mcp_servers = [
    MCPServerConfig(
        name="filesystem",
        command="npx",
        args=["-y", "@modelcontextprotocol/server-filesystem", "/home/user"]
    )
]
await bot.setup_mcp_servers(mcp_servers)

Expose an Agent as an MCP Server: allow Claude Desktop or other MCP clients to use your agent as a tool, e.g. by starting an MCP server from a YAML config with parrot mcp --config server.yaml.

Platform Integrations

Expose your bots natively to chat platforms:

  • Telegram
  • Microsoft Teams
  • Slack
  • WhatsApp

Optional capabilities

Dev-Loop Orchestration

Optional. Requires the [claude-agent] extra: uv pip install "ai-parrot[claude-agent]"

A 5-node AgentsFlow that fixes "small operational bugs" automatically:

BugIntake → Research → Development → QA → DeploymentHandoff
                                       │
                                       └─(qa failed / hard error)→ FailureHandler

The flow takes a Pydantic BugBrief (Jira ticket + log sources + acceptance criteria) and produces a PR plus a Jira ticket transitioned to "Ready to Deploy". Failures escalate back to the original reporter.

Prerequisites

  • Python 3.11+ with ai-parrot[claude-agent] installed.
  • claude-agent-sdk >= 0.1.68 and either ANTHROPIC_API_KEY or a configured claude CLI on PATH.
  • Redis 6+ for two-stream observability (one stream per flow run plus one per dispatch).
  • Jira service-account credentials wrapped in a parrot.auth.credentials.StaticCredentialResolver.
  • (Optional) gh CLI for PR creation. Falls back to a direct GitHub REST call (using GITHUB_TOKEN + GITHUB_REPOSITORY) when the CLI is missing.

Configuration (navconfig)

| Setting | Default | Purpose |
|---------|---------|---------|
| CLAUDE_CODE_MAX_CONCURRENT_DISPATCHES | 3 | Cap on concurrent Claude Code dispatches (dispatcher-side semaphore). |
| FLOW_MAX_CONCURRENT_RUNS | 5 | Cap on concurrent flow runs (orchestrator-side). |
| FLOW_BOT_JIRA_ACCOUNT_ID | "" | Jira accountId of the service-account bot. Must be set per environment. |
| WORKTREE_BASE_PATH | .claude/worktrees | Base directory for per-feature worktrees. The dispatcher refuses any cwd outside this path. |
| FLOW_STREAM_TTL_SECONDS | 604800 | Retention for both flow and dispatch Redis streams (7 days). |
| ACCEPTANCE_CRITERION_ALLOWLIST | ["flowtask","pytest","ruff","mypy","pylint"] | Allowed ShellCriterion command heads. Validated at intake. |
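Since these are navconfig settings, they can presumably be overridden in the .env file that parrot setup generates. A sketch (values illustrative):

# .env: illustrative overrides of the defaults above
CLAUDE_CODE_MAX_CONCURRENT_DISPATCHES=5
FLOW_BOT_JIRA_ACCOUNT_ID=<your-service-account-id>
WORKTREE_BASE_PATH=.claude/worktrees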

Quickstart

from parrot.flows.dev_loop import (
    ClaudeCodeDispatcher,
    build_dev_loop_flow,
    register_pull_request_webhook,
)

# `jira`, the log toolkits, `orchestrator`, and GITHUB_WEBHOOK_SECRET are
# assumed to be pre-configured in your application (see Prerequisites).
dispatcher = ClaudeCodeDispatcher(
    max_concurrent=3,
    redis_url="redis://localhost:6379/0",
    stream_ttl_seconds=604800,
)
flow = build_dev_loop_flow(
    dispatcher=dispatcher,
    jira_toolkit=jira,                 # already wrapping flow-bot creds
    log_toolkits={"cloudwatch": cw, "elasticsearch": es},
    redis_url="redis://localhost:6379/0",
)
register_pull_request_webhook(orchestrator, secret=GITHUB_WEBHOOK_SECRET)
# Then run via your AutonomousOrchestrator with a BugBrief in ctx.

Live observability

The dispatcher publishes per-event DispatchEvent envelopes (queued, started, message, tool_use, tool_result, output_invalid, failed, completed) to Redis Streams. The parrot.flows.dev_loop.flow_stream_ws aiohttp handler exposes a WebSocket endpoint that fans in the flow stream and every dispatch stream, emitting a single envelope per event for the UI to consume; the UI never speaks Redis directly.


Supported LLM Providers

| Provider | Extra | Identifier | Example |
|----------|-------|------------|---------|
| OpenAI | openai | openai | openai:gpt-4o |
| Anthropic | anthropic | anthropic, claude | anthropic:claude-sonnet-4-20250514 |
| Google Gemini | google | google | google:gemini-2.0-flash |
| Groq | groq | groq | groq:llama-3.3-70b-versatile |
| X.AI / Grok | xai | grok | grok:grok-3 |
| HuggingFace | (included) | hf | hf:meta-llama/Llama-3-8B |
| vLLM | (included) | vllm | vllm:model-name |
| OpenRouter | (included) | openrouter | openrouter:anthropic/claude-sonnet-4 |
| Ollama | (included) | | via OpenAI-compatible endpoint |

Contributing

Development setup (from source)

AI-Parrot uses uv as its package manager and provides a Makefile to simplify common tasks.

git clone https://github.com/phenobarbital/ai-parrot.git
cd ai-parrot

# Create the virtual environment (Python 3.11)
make venv
source .venv/bin/activate

# Full dev install — all packages, all extras, dev tools
make develop

# Run tests
make test

Makefile targets

The Makefile covers the entire development lifecycle. Run make help for the full list.

Development install variants:

| Target | What it installs |
|--------|------------------|
| make develop | All packages + all extras + dev tools (full environment) |
| make develop-fast | All packages, base deps only (no torch/tensorflow/whisperx) |
| make develop-ml | Embeddings + audio loaders (heavy ML stack) |

Production install variants:

| Target | What it installs |
|--------|------------------|
| make install | All packages, base deps only (no extras) |
| make install-core | Core with LLM clients + vector stores |
| make install-tools | Core + tools with common extras (jira, slack, aws, etc.) |
| make install-tools-all | Core + tools with ALL extras |
| make install-loaders | Core + loaders with common extras (youtube, web, pdf) |
| make install-loaders-all | Core + loaders with ALL extras (includes whisperx, pyannote) |
| make install-all | Everything with ALL extras |

Other useful targets:

make format          # Format code with black
make lint            # Lint with pylint + black --check
make test            # Run pytest + mypy
make build           # Build all packages (sdist + wheel)
make release         # Build + publish to PyPI
make lock            # Regenerate uv.lock
make clean           # Remove build artifacts
make generate-registry  # Regenerate TOOL_REGISTRY from source
make bump-patch      # Bump patch version (syncs across all packages)

Manual install (without Make)

If you prefer not to use Make:

uv venv --python 3.11 .venv
source .venv/bin/activate

# Full install
uv sync --all-packages --all-extras

# Or selective extras
uv sync --extra google --extra openai

Project layout

ai-parrot/
├── packages/
│   ├── ai-parrot/           # Core framework
│   │   └── src/parrot/
│   ├── ai-parrot-tools/     # Tool implementations
│   │   └── src/parrot_tools/
│   ├── ai-parrot-loaders/   # Document loaders
│   │   └── src/parrot_loaders/
│   └── ai-parrot-pipelines/ # Specialized pipelines
├── tests/
├── examples/
├── Makefile                  # Build, install, test, release shortcuts
└── pyproject.toml            # Workspace root

Releasing to PyPI

AI-Parrot publishes three packages on every GitHub release:

| Package | PyPI Project | Build Method |
|---------|--------------|--------------|
| ai-parrot | ai-parrot | cibuildwheel (Cython + Rust/Maturin) |
| ai-parrot-tools | ai-parrot-tools | uv build (pure Python) |
| ai-parrot-loaders | ai-parrot-loaders | uv build (pure Python) |

The release workflow (.github/workflows/release.yml) runs 3 parallel build jobs and a single deploy job:

release event
    ├── build-core   — cibuildwheel for ai-parrot (Cython + Rust)
    ├── build-tools  — uv build for ai-parrot-tools
    ├── build-loaders — uv build for ai-parrot-loaders
    └── deploy       — twine upload all artifacts to PyPI

To create a release:

  1. Bump the version in each package's pyproject.toml (or use make bump-patch to sync all three).
  2. Create a GitHub release — the workflow triggers automatically on the release: created event.

First-time PyPI setup (required once):

  • Create ai-parrot-tools and ai-parrot-loaders projects on PyPI under the same account as ai-parrot.
  • Ensure the NAV_AIPARROT_API_SECRET GitHub secret holds a PyPI API token with upload scope for all 3 projects. A scoped token per project or a single account-level token both work.

Independent versioning:

Each package has its own version number in its pyproject.toml. All three are built and published on the same release event — there is no requirement to keep versions in sync.


Guidelines

  • All code must be async-first — no blocking I/O in async contexts
  • Use type hints and Google-style docstrings on all public APIs
  • Use Pydantic models for structured data
  • Run pytest after any logic change
  • Tools with heavy dependencies must use lazy imports to avoid bloating the core (see the sketch below)
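A sketch of the lazy-import guideline. The toolkit and its tool are hypothetical, and pypdf stands in for any heavy dependency:

import asyncio

from parrot.tools import AbstractToolkit


class PdfToolkit(AbstractToolkit):  # hypothetical example toolkit
    async def page_count(self, path: str) -> int:
        """Count the pages in a PDF file."""
        import pypdf  # heavy dependency resolved only when the tool runs

        # Offload the blocking file read to a thread to stay async-first.
        reader = await asyncio.to_thread(pypdf.PdfReader, path)
        return len(reader.pages)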

Issues & Support


DB-Persisted Bot Configuration (FEAT-133)

Added in FEAT-133. Spec: sdd/specs/bot-reranker-and-parent-searcher-config.spec.md.

BotManager._load_database_bots reads two new JSONB columns on navigator.ai_bots and wires the resulting objects into every DB-loaded bot at startup. Both columns default to '{}'::JSONB, so existing rows are unaffected.

Reranker config (reranker_config)

Controls cross-encoder or LLM-based reranking of vector-store candidates (FEAT-126). Empty {} means no reranking.

Local cross-encoder (no live LLM call; requires sentence-transformers):

{
  "type": "local_cross_encoder",
  "model_name": "cross-encoder/ms-marco-MiniLM-L-12-v2",
  "device": "cpu",
  "rerank_oversample_factor": 4
}

LLM reranker (uses the bot's own LLM client):

{
  "type": "llm",
  "client_ref": "bot",
  "rerank_oversample_factor": 4
}

Factory: parrot.rerankers.factory.create_reranker(config, *, bot_llm_client=None)
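A sketch wiring the factory to the local cross-encoder config above; no LLM client is needed, so this can run before the bot is constructed (see "Order of operations" below):

from parrot.rerankers.factory import create_reranker

# Local cross-encoder config from above; bot_llm_client stays None.
config = {
    "type": "local_cross_encoder",
    "model_name": "cross-encoder/ms-marco-MiniLM-L-12-v2",
    "device": "cpu",
    "rerank_oversample_factor": 4,
}
reranker = create_reranker(config)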

Parent-searcher config (parent_searcher_config)

Controls parent-document expansion after vector search (FEAT-128). Empty {} means no expansion.

In-table parent search (chunk row has a parent_id FK in the same table):

{
  "type": "in_table",
  "expand_to_parent": true
}

expand_to_parent is also forwarded as a constructor kwarg so the bot's retrieval logic can branch on it before calling the searcher.

Factory: parrot.stores.parents.factory.create_parent_searcher(config, *, store)

store (bot.store) becomes available only after await bot.configure(app). The factory is therefore called after configure(), not before.

Error handling

An unknown type value raises parrot.exceptions.ConfigError at bot startup. The bot is not silently registered without its configured features.

# A row with {"type": "magic"} in reranker_config will raise:
# ConfigError: unknown reranker type 'magic'; supported: local_cross_encoder, llm

Order of operations

create_reranker(reranker_config)           # before bot construction
bot = BotClass(..., reranker=reranker, expand_to_parent=...)
await bot.configure(app)                   # store becomes available
create_parent_searcher(config, store=bot.store)
bot.parent_searcher = parent_searcher

License

MIT


Built with care by the AI-Parrot Team
