Full toolkit for running an AI agent service built with LangGraph, FastAPI and Streamlit

🧰 LangGraph Agent Toolkit


📋 Introduction

A comprehensive toolkit for building, deploying, and managing AI agents using LangGraph, FastAPI, and Streamlit. It provides a production-ready framework for creating conversational AI agents with features like multi-provider LLM support, streaming responses, observability, memory and prompt management.

What is langgraph-agent-toolkit?

The langgraph-agent-toolkit is a full-featured framework for developing and deploying AI agent services. Built on the foundation of:

  • LangGraph for agent creation with advanced flows and human-in-the-loop capabilities
  • FastAPI for robust, high-performance API services with streaming support
  • Streamlit for intuitive user interfaces

Key components include:

  • Data structures and settings built with Pydantic
  • LiteLLM proxy for universal multi-provider LLM support
  • Comprehensive memory management and persistence using PostgreSQL/SQLite
  • Advanced observability tooling via Langfuse and Langsmith
  • Modular architecture allowing customization while maintaining a consistent application structure
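The memory-persistence component above can be illustrated with a minimal sketch. This is not the toolkit's actual schema (its memory backends manage their own tables); it only shows the general idea of checkpointing per-thread agent state in SQLite, using the standard library:

```python
import sqlite3

# Illustrative only: a tiny checkpoint store keyed by conversation thread.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE IF NOT EXISTS checkpoints (
           thread_id TEXT,
           step INTEGER,
           state TEXT,
           PRIMARY KEY (thread_id, step)
       )"""
)

def save_state(thread_id: str, step: int, state: str) -> None:
    # Upsert the serialized agent state for this step of the thread.
    conn.execute(
        "INSERT OR REPLACE INTO checkpoints VALUES (?, ?, ?)",
        (thread_id, step, state),
    )

def load_latest(thread_id: str):
    # Resume a conversation from its most recent checkpoint.
    row = conn.execute(
        "SELECT state FROM checkpoints WHERE thread_id = ? "
        "ORDER BY step DESC LIMIT 1",
        (thread_id,),
    ).fetchone()
    return row[0] if row else None

save_state("t1", 0, '{"messages": []}')
save_state("t1", 1, '{"messages": ["hi"]}')
```

In the toolkit itself, swapping SQLite for PostgreSQL is a configuration choice rather than a code change.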

Whether you're building a simple chatbot or complex multi-agent system, this toolkit provides the infrastructure to develop, test, and deploy your LangGraph-based agents with confidence.

You can use DeepWiki to learn more about this repository.

🚀 Quickstart

  1. Create a .env file based on .env.example

  2. Option 1: Run with Python from source

    # Install dependencies
    pip install uv
    uv sync --frozen
    source .venv/bin/activate
    
    # Start the service
    python langgraph_agent_toolkit/run_api.py
    
    # In another terminal
    source .venv/bin/activate
    streamlit run langgraph_agent_toolkit/run_app.py
    
  3. Option 2: Run with Python from the PyPI repository

    pip install langgraph-agent-toolkit
    

    ℹ️ For more details on installation options, see the Installation Documentation.

  4. Option 3: Run with Docker

    docker compose watch
    
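Step 1 above asks for a `.env` file. The variable names below are hypothetical placeholders for illustration only; copy `.env.example` for the actual keys the toolkit expects:

```shell
# Illustrative .env fragment -- see .env.example for the real variable names.
OPENAI_API_KEY=sk-...
POSTGRES_URL=postgresql://user:pass@localhost:5432/agents
LANGFUSE_PUBLIC_KEY=pk-...
LANGFUSE_SECRET_KEY=sk-...
```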

📦 Installation Options

The toolkit supports multiple installation options using "extras" to include just the dependencies you need.

For detailed installation instructions and available extras, see the Installation Documentation.

🏗️ Architecture

✨ Key Features

  1. LangGraph Integration

    • Latest LangGraph v0.3 features
    • Human-in-the-loop with interrupt()
    • Flow control with Command and langgraph-supervisor
  2. API Service

    • FastAPI with streaming and non-streaming endpoints
    • Support for both token-based and message-based streaming
    • Multiple agent support with URL path routing
    • Available agents and models listed at /info endpoint
    • Supports multiple runners (uvicorn, gunicorn, Mangum, Azure Functions)
  3. Developer Experience

    • Asynchronous design with async/await
    • Docker configuration with live reloading
    • Comprehensive testing suite
  4. Enterprise Components

    • Configurable PostgreSQL/SQLite connection pools
    • Observability via Langfuse and Langsmith
    • User feedback system
    • Prompt management system
    • LiteLLM proxy integration
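The token-based streaming mentioned above is typically consumed as a stream of server-sent-event-style lines. The exact wire format of the toolkit's streaming endpoint may differ; this sketch only illustrates the consumption pattern:

```python
from typing import Iterable, Iterator

def iter_tokens(lines: Iterable[str]) -> Iterator[str]:
    """Yield token payloads from SSE-style 'data: ...' lines.

    Assumes an OpenAI-style convention where '[DONE]' terminates the
    stream; the toolkit's actual framing may vary.
    """
    for line in lines:
        line = line.strip()
        if line.startswith("data:"):
            payload = line[len("data:"):].strip()
            if payload and payload != "[DONE]":
                yield payload

# Joining the yielded chunks reassembles the full response text.
chunks = ["data: Hel", "data: lo", "", "data: [DONE]"]
text = "".join(iter_tokens(chunks))
```

Message-based streaming works the same way, except each `data:` payload carries a whole JSON message rather than a token fragment.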

For more details on features, see the Usage Documentation.

⚙️ Environment Setup

For detailed environment setup instructions, including creating your .env file and configuring LiteLLM, see the Environment Setup Documentation.

📂 Project Structure

The repository contains:

  • langgraph_agent_toolkit/agents/blueprints/: Agent definitions
  • langgraph_agent_toolkit/agents/agent_executor.py: Agent execution control
  • langgraph_agent_toolkit/schema/: Protocol schema definitions
  • langgraph_agent_toolkit/core/: Core modules (LLM, memory, settings)
  • langgraph_agent_toolkit/service/service.py: FastAPI service
  • langgraph_agent_toolkit/client/client.py: Service client
  • langgraph_agent_toolkit/run_app.py: Chat interface
  • docker/: Docker configurations
  • tests/: Test suite

🛠️ Setup and Usage

For detailed setup and usage instructions, including building your own agent, Docker setup, using the AgentClient, and local development, see the Usage Documentation.
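To give a feel for how a client addresses agents by URL path, here is a deliberately simplified stand-in. The real `AgentClient` lives in `langgraph_agent_toolkit/client/client.py`; the endpoint path and payload shape below are assumptions, not the toolkit's actual API:

```python
import json
import urllib.request

class MiniAgentClient:
    """Illustrative stand-in for the toolkit's AgentClient (hypothetical API)."""

    def __init__(self, base_url: str = "http://localhost:8080") -> None:
        self.base_url = base_url.rstrip("/")

    def build_request(self, agent: str, message: str) -> urllib.request.Request:
        # Agents are selected via URL path routing, e.g. /{agent}/invoke.
        body = json.dumps({"message": message}).encode()
        return urllib.request.Request(
            f"{self.base_url}/{agent}/invoke",
            data=body,
            headers={"Content-Type": "application/json"},
            method="POST",
        )

client = MiniAgentClient()
req = client.build_request("chatbot", "Hello!")
```

Sending the request (and the streaming variant) is left out here; consult the Usage Documentation for the real client's methods.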

📚 Documentation

Full documentation is available in the GitHub repository.

👥 Development and Contributing

Thank you for considering contributing to LangGraph Agent Toolkit! We encourage the community to post Issues and Pull Requests.

Before you get started, please see our Contribution Guide.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.
