
Grab and deploy Haystack pipelines

Project description

Hayhooks

Hayhooks makes it easy to deploy and serve Haystack pipelines and agents.

With Hayhooks, you can:

  • 📦 Deploy your Haystack pipelines and agents as REST APIs with maximum flexibility and minimal boilerplate code.
  • 🛠️ Expose your Haystack pipelines and agents over the MCP protocol, making them available as tools in AI dev environments like Cursor or Claude Desktop. Under the hood, Hayhooks runs as an MCP Server, exposing each pipeline and agent as an MCP Tool.
  • 💬 Integrate your Haystack pipelines and agents with Open WebUI as OpenAI-compatible chat completion backends with streaming support.
  • 🖥️ Embed a Chainlit chat UI directly in Hayhooks: pip install "hayhooks[chainlit]" and hayhooks run --with-chainlit give you a zero-configuration frontend with streaming, pipeline selection, and custom UI widgets.
  • 🕹️ Control Hayhooks core API endpoints through chat: deploy, undeploy, list, or run Haystack pipelines and agents by chatting with Claude Desktop, Cursor, or any other MCP client.
  • 📈 Trace Hayhooks lifecycle actions with OpenTelemetry (pip install "hayhooks[tracing]") for deploy/run/undeploy visibility across REST and MCP, with a /dashboard UI via hayhooks run --with-tracing-dashboard (backed by a local live trace buffer).


Documentation

📚 For detailed guides, examples, and API reference, check out our comprehensive documentation.

Quick Start

1. Install Hayhooks

# Install Hayhooks
pip install hayhooks

2. Start Hayhooks

hayhooks run

3. Create a simple agent

Create a minimal agent wrapper with streaming chat support and a simple HTTP POST API:

from typing import AsyncGenerator
from haystack.components.agents import Agent
from haystack.dataclasses import ChatMessage
from haystack.tools import Tool
from haystack.components.generators.chat import OpenAIChatGenerator
from hayhooks import BasePipelineWrapper, async_streaming_generator


# Define a Haystack Tool that provides weather information for a given location.
def weather_function(location: str) -> str:
    return f"The weather in {location} is sunny."

weather_tool = Tool(
    name="weather_tool",
    description="Provides weather information for a given location.",
    parameters={
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
    function=weather_function,
)

class PipelineWrapper(BasePipelineWrapper):
    def setup(self) -> None:
        self.agent = Agent(
            chat_generator=OpenAIChatGenerator(model="gpt-4o-mini"),
            system_prompt="You're a helpful agent",
            tools=[weather_tool],
        )

    # This will create a POST /my_agent/run endpoint
    # `question` will be the input argument and will be auto-validated by a Pydantic model
    async def run_api_async(self, question: str) -> str:
        result = await self.agent.run_async(messages=[ChatMessage.from_user(question)])
        return result["last_message"].text

    # This will create an OpenAI-compatible /chat/completions endpoint
    async def run_chat_completion_async(
        self, model: str, messages: list[dict], body: dict
    ) -> AsyncGenerator[str, None]:
        chat_messages = [
            ChatMessage.from_openai_dict_format(message) for message in messages
        ]

        return async_streaming_generator(
            pipeline=self.agent,
            pipeline_run_args={
                "messages": chat_messages,
            },
        )

Save as my_agent_dir/pipeline_wrapper.py.

4. Deploy it

hayhooks pipeline deploy-files -n my_agent ./my_agent_dir
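The deploy-files command points at the directory containing the wrapper, so with the file saved above the layout is simply:

```
my_agent_dir/
└── pipeline_wrapper.py
```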

5. Run it

Call the HTTP POST API (/my_agent/run):

curl -X POST http://localhost:1416/my_agent/run \
  -H 'Content-Type: application/json' \
  -d '{"question": "What can you do?"}'
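The same call can be made from Python with only the standard library. This is a sketch assuming the server from the steps above is running on localhost:1416; the helper name build_run_request is ours:

```python
import json
import urllib.request

BASE_URL = "http://localhost:1416"  # default address used in the curl examples


def build_run_request(question: str) -> urllib.request.Request:
    """Build the POST request for the /my_agent/run endpoint."""
    payload = json.dumps({"question": question}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/my_agent/run",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# To actually send it (server must be running with my_agent deployed):
#   with urllib.request.urlopen(build_run_request("What can you do?")) as resp:
#       print(json.loads(resp.read()))
```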

Call the OpenAI-compatible chat completion API (streaming enabled):

curl -X POST http://localhost:1416/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "my_agent",
    "messages": [{"role": "user", "content": "What can you do?"}]
  }'
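Streamed replies from /chat/completions arrive as OpenAI-style server-sent events, one data: line per chunk. A minimal parser for those lines (a sketch assuming the standard OpenAI chunk shape; parse_sse_line is our name):

```python
import json
from typing import Optional


def parse_sse_line(line: str) -> Optional[str]:
    """Return the text delta carried by one OpenAI-style SSE line, if any."""
    if not line.startswith("data: ") or line.strip() == "data: [DONE]":
        return None
    chunk = json.loads(line[len("data: "):])
    # Each chunk carries a list of choices; the delta holds the new text, if present.
    return chunk["choices"][0]["delta"].get("content")
```

Reading the HTTP response line by line and feeding each line through this function yields the streamed tokens.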

Or chat with it in the embedded Chainlit UI (hayhooks run --with-chainlit) or integrate it with Open WebUI!

Key Features

🚀 Easy Deployment

  • Deploy Haystack pipelines and agents as REST APIs with minimal setup
  • Support for both YAML-based and wrapper-based pipeline deployment
  • Automatic OpenAI-compatible endpoint generation

🌐 Multiple Integration Options

  • MCP Protocol: Expose pipelines as MCP tools for use in AI development environments
  • Chainlit UI: Embedded chat frontend with streaming, pipeline selection, and custom UI widgets
  • Open WebUI Integration: Use Hayhooks as a backend for Open WebUI with streaming support
  • OpenAI Compatibility: Seamless integration with OpenAI-compatible tools and frameworks

🔧 Developer Friendly

  • CLI for easy pipeline management
  • Flexible configuration options
  • Comprehensive logging and debugging support
  • OpenTelemetry-ready tracing hooks built on Haystack tracing APIs
  • Custom route and middleware support

📁 File Upload Support

  • Built-in support for handling file uploads in pipelines
  • Perfect for RAG systems and document processing
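Hayhooks is built on FastAPI, so file uploads to a pipeline endpoint are ordinary multipart/form-data POSTs. The sketch below builds such a body with only the standard library; the field name "files" and the helper name build_multipart_body are assumptions to adapt to your wrapper's signature:

```python
import uuid


def build_multipart_body(files: dict) -> tuple:
    """Build a multipart/form-data body carrying each file under the `files` field."""
    boundary = uuid.uuid4().hex
    parts = []
    for filename, data in files.items():
        parts.append(
            (
                f"--{boundary}\r\n"
                f'Content-Disposition: form-data; name="files"; filename="{filename}"\r\n'
                f"Content-Type: application/octet-stream\r\n\r\n"
            ).encode("utf-8")
            + data
            + b"\r\n"
        )
    body = b"".join(parts) + f"--{boundary}--\r\n".encode("utf-8")
    return body, f"multipart/form-data; boundary={boundary}"


# Usage (server must be running):
#   body, content_type = build_multipart_body({"report.pdf": pdf_bytes})
#   then POST the body to your pipeline's /run endpoint with that Content-Type header.
```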


Community & Support

Hayhooks is actively maintained by the deepset team.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

hayhooks-1.19.0.tar.gz (16.7 MB)

Built Distribution

hayhooks-1.19.0-py3-none-any.whl (273.2 kB)

If you're not sure about the file name format, learn more about wheel file names.

File details

Details for the file hayhooks-1.19.0.tar.gz.

File metadata

  • Download URL: hayhooks-1.19.0.tar.gz
  • Upload date:
  • Size: 16.7 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for hayhooks-1.19.0.tar.gz:

  • SHA256: f3b0877c3fbe493a473ea065c4e61eaa1e1c5925a483d11a5ca427e6e4168f86
  • MD5: 500c9368b03ab32dbef65abf6af7efd3
  • BLAKE2b-256: 12fd70678433a511fc14bed61e6ebfa5169bf02ea1b7f55dbc79d0194e1c15ed

See more details on using hashes here.

Provenance

The following attestation bundles were made for hayhooks-1.19.0.tar.gz:

Publisher: pypi.yml on deepset-ai/hayhooks

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file hayhooks-1.19.0-py3-none-any.whl.

File metadata

  • Download URL: hayhooks-1.19.0-py3-none-any.whl
  • Upload date:
  • Size: 273.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for hayhooks-1.19.0-py3-none-any.whl:

  • SHA256: 20574aca82ea14c905f0559af36947135404db8aa47446d95f5b039d8596a635
  • MD5: 74c4570e1af874263961fad456fef1ad
  • BLAKE2b-256: f961f71b50cbaa64fdc8c5a4f3289d898a41eea23d7c195cc3f17061d52b2d22

See more details on using hashes here.

Provenance

The following attestation bundles were made for hayhooks-1.19.0-py3-none-any.whl:

Publisher: pypi.yml on deepset-ai/hayhooks

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
