
Building stateful, multi-actor applications with LLMs

Project description

This project is a fork of LangGraph adapted for QPython.

Trusted by companies shaping the future of agents – including Klarna, Replit, Elastic, and more – LangGraph is a low-level orchestration framework for building, managing, and deploying long-running, stateful agents.

Get started

Install LangGraph:

pip install -U langgraph

Then, create an agent using prebuilt components:

# pip install -qU "langchain[anthropic]" to call the model

from langgraph.prebuilt import create_react_agent

def get_weather(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"

agent = create_react_agent(
    model="anthropic:claude-3-7-sonnet-latest",
    tools=[get_weather],
    prompt="You are a helpful assistant"
)

# Run the agent
agent.invoke(
    {"messages": [{"role": "user", "content": "what is the weather in sf"}]}
)
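
The invoke call returns the graph's final state, including the full message list. As a minimal sketch of capturing that result and printing the assistant's reply (continuing the quickstart agent above):

# Capture the final state returned by the agent.
result = agent.invoke(
    {"messages": [{"role": "user", "content": "what is the weather in sf"}]}
)

# The returned state carries the full message history; the last entry is the model's reply.
print(result["messages"][-1].content)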

For more information, see the Quickstart. Or, to learn how to build an agent workflow with a customizable architecture, long-term memory, and other complex task handling, see the LangGraph basics tutorials.

Core benefits

LangGraph provides low-level supporting infrastructure for any long-running, stateful workflow or agent. LangGraph does not abstract prompts or architecture, and provides the following central benefits:

  • Durable execution: Build agents that persist through failures and can run for extended periods, automatically resuming from exactly where they left off (see the sketch after this list).
  • Human-in-the-loop: Seamlessly incorporate human oversight by inspecting and modifying agent state at any point during execution.
  • Comprehensive memory: Create truly stateful agents with both short-term working memory for ongoing reasoning and long-term persistent memory across sessions.
  • Debugging with LangSmith: Gain deep visibility into complex agent behavior with visualization tools that trace execution paths, capture state transitions, and provide detailed runtime metrics.
  • Production-ready deployment: Deploy sophisticated agent systems confidently with scalable infrastructure designed to handle the unique challenges of stateful, long-running workflows.
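
As a minimal sketch of durable execution and short-term memory, assuming the quickstart agent above, LangGraph's in-memory MemorySaver checkpointer, and an illustrative thread ID: the agent's state is checkpointed per thread, so follow-up calls resume the same conversation.

from langgraph.checkpoint.memory import MemorySaver
from langgraph.prebuilt import create_react_agent

# In-memory checkpointer: state survives between invocations within the process.
checkpointer = MemorySaver()

agent = create_react_agent(
    model="anthropic:claude-3-7-sonnet-latest",
    tools=[get_weather],  # the tool defined in the quickstart above
    prompt="You are a helpful assistant",
    checkpointer=checkpointer,
)

# The thread_id keys the persisted state, so a follow-up call resumes the same conversation.
config = {"configurable": {"thread_id": "weather-thread-1"}}
agent.invoke(
    {"messages": [{"role": "user", "content": "what is the weather in sf"}]},
    config,
)
agent.invoke(
    {"messages": [{"role": "user", "content": "how about nyc?"}]},
    config,
)

Swapping MemorySaver for a database-backed checkpointer gives the durable, cross-process persistence described above.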

LangGraph’s ecosystem

While LangGraph can be used standalone, it also integrates seamlessly with any LangChain product, giving developers a full suite of tools for building agents. To improve your LLM application development, pair LangGraph with:

  • LangSmith — Helpful for agent evals and observability. Debug poor-performing LLM app runs, evaluate agent trajectories, gain visibility in production, and improve performance over time.
  • LangGraph Platform — Deploy and scale agents effortlessly with a purpose-built deployment platform for long-running, stateful workflows. Discover, reuse, configure, and share agents across teams — and iterate quickly with visual prototyping in LangGraph Studio.
  • LangChain – Provides integrations and composable components to streamline LLM application development.

Note: Looking for the JS version of LangGraph? See the JS repo and the JS docs.

Additional resources

  • Guides: Quick, actionable code snippets for topics such as streaming, adding memory & persistence, and design patterns (e.g. branching, subgraphs, etc.).
  • Reference: Detailed reference on core classes, methods, how to use the graph and checkpointing APIs, and higher-level prebuilt components.
  • Examples: Guided examples on getting started with LangGraph.
  • LangChain Academy: Learn the basics of LangGraph in our free, structured course.
  • Templates: Pre-built reference apps for common agentic workflows (e.g. ReAct agent, memory, retrieval etc.) that can be cloned and adapted.
  • Case studies: Hear how industry leaders use LangGraph to ship AI applications at scale.

Acknowledgements

LangGraph is inspired by Pregel and Apache Beam. The public interface draws inspiration from NetworkX. LangGraph is built by LangChain Inc, the creators of LangChain, but can be used without LangChain.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

langgraph_aipy-0.3.18.tar.gz (165.4 kB)

Uploaded Source

Built Distribution

langgraph_aipy-0.3.18-py3-none-any.whl (194.6 kB)

Uploaded Python 3

File details

Details for the file langgraph_aipy-0.3.18.tar.gz.

File metadata

  • Download URL: langgraph_aipy-0.3.18.tar.gz
  • Upload date:
  • Size: 165.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.7

File hashes

Hashes for langgraph_aipy-0.3.18.tar.gz

  • SHA256: 927b79aec74074760257fade2e98d4c1db03376e23f3ff92a81db9f5984d1f64
  • MD5: 67a224de1d40777361fcc1c7b7e0ae8e
  • BLAKE2b-256: 1b94af13f3aa7eb5c61d2a323d5e13df76aa0c4b414f8a6fce9ce7a046a55d29

See more details on using hashes here.
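
As a minimal sketch of checking a download against the listed digest (the file name and SHA256 value are taken from the table above; the helper function is illustrative, not part of any packaging API), the hash can be recomputed locally with Python's hashlib:

import hashlib

# Expected SHA256 digest from the table above.
EXPECTED_SHA256 = "927b79aec74074760257fade2e98d4c1db03376e23f3ff92a81db9f5984d1f64"

def sha256_of(path: str) -> str:
    """Compute the SHA256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

assert sha256_of("langgraph_aipy-0.3.18.tar.gz") == EXPECTED_SHA256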

File details

Details for the file langgraph_aipy-0.3.18-py3-none-any.whl.

File metadata

File hashes

Hashes for langgraph_aipy-0.3.18-py3-none-any.whl

  • SHA256: fe6783a6ed221d38362979d1c902ab305032168396b2a7505ff362e3026f1a5b
  • MD5: f657353e9776cf2dad4a29ea18d9b4a9
  • BLAKE2b-256: 137c1ab098b680379525a1d67955b08285b9433a182297bf9d5bbb0739813bf4

See more details on using hashes here.
