Project description
LangGraph API (In-Memory)
This package implements a local version of the LangGraph API for rapid development and testing. Build and iterate on LangGraph agents with a tight feedback loop. The server is backed by a predominantly in-memory data store that is persisted to local disk when the server is restarted.
For production use, see the various deployment options for the LangGraph API, which are backed by a production-grade database.
Installation
Install the langgraph-cli package with the inmem extra. Your CLI version must be no lower than 0.1.55.
pip install -U "langgraph-cli[inmem]"
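To confirm the installed CLI meets the minimum version, one option is a quick check with the Python standard library (nothing here is specific to LangGraph beyond the package name):

# Print the installed langgraph-cli version; it must be 0.1.55 or newer.
from importlib.metadata import version

print(version("langgraph-cli"))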
Quickstart
- (Optional) Clone a starter template:
  langgraph new --template new-langgraph-project-python ./my-project
  cd my-project
- (Recommended) Use a virtual environment and install dependencies:
  python -m venv venv
  source venv/bin/activate
  python -m pip install -e .
  python -m pip install -U "langgraph-cli[inmem]"
- Start the development server:
  langgraph dev --config ./langgraph.json
- The server will launch, opening a browser window with the graph UI. Interact with your graph or make code edits; the server automatically reloads on changes.
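As a reference point, here is a minimal sketch of a graph module the dev server could load. The module path my_agent/agent.py, the graph name "agent", and the state fields are illustrative assumptions; your langgraph.json must point at whatever module and variable you actually use (for example, "graphs": {"agent": "./my_agent/agent.py:graph"}).

# my_agent/agent.py -- illustrative module referenced from langgraph.json.
from typing import TypedDict

from langgraph.graph import StateGraph, START, END


class State(TypedDict, total=False):
    topic: str
    summary: str


def summarize(state: State) -> dict:
    # Placeholder logic; a real node would typically call a model or tool here.
    return {"summary": f"A few words about {state['topic']}."}


builder = StateGraph(State)
builder.add_node("summarize", summarize)
builder.add_edge(START, "summarize")
builder.add_edge("summarize", END)

# `langgraph dev` serves this compiled graph via the entry in langgraph.json.
graph = builder.compile()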
Usage
Start the development server:
langgraph dev
Your agent's state (threads, runs, assistants) persists in memory while the server is running - perfect for development and testing. Each run's state is tracked and can be inspected, making it easy to debug and improve your agent's behavior.
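That state can also be exercised from Python using the LangGraph SDK client. The following is a sketch under a few assumptions: the server is listening on http://localhost:8000 (the documented default), the separate langgraph-sdk package is installed, and the assistant and input fields match the illustrative graph sketched in the quickstart above.

# Create a thread, start a run, and stream its state from the local dev server.
import asyncio

from langgraph_sdk import get_client


async def main() -> None:
    client = get_client(url="http://localhost:8000")

    # Assistants are derived from the graphs registered in langgraph.json.
    assistants = await client.assistants.search()
    assistant_id = assistants[0]["assistant_id"]

    # Thread and run state live in the server's in-memory store.
    thread = await client.threads.create()
    async for chunk in client.runs.stream(
        thread["thread_id"],
        assistant_id,
        input={"topic": "in-memory servers"},  # input shape depends on your graph's state
        stream_mode="values",
    ):
        print(chunk.event, chunk.data)


asyncio.run(main())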
How-To
Attaching a debugger
Debug mode lets you attach your IDE's debugger to the LangGraph API server to set breakpoints and step through your code line-by-line.
- Install debugpy:
  pip install debugpy
- Start the server in debug mode:
  langgraph dev --debug-port 5678
- Configure your IDE:
  - VS Code: Add this launch configuration:
    {
      "name": "Attach to LangGraph",
      "type": "debugpy",
      "request": "attach",
      "connect": {
        "host": "0.0.0.0",
        "port": 5678
      }
    }
  - PyCharm: Use "Attach to Process" and select the langgraph process
- Set breakpoints in your graph code and start debugging.
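As a concrete example, reusing the illustrative node from the quickstart sketch (where State and the rest of the module are defined), a breakpoint placed inside a node function pauses every run that reaches that node once the debugger is attached:

# my_agent/agent.py (excerpt) -- set an IDE breakpoint on the marked line, run
# `langgraph dev --debug-port 5678`, attach, then trigger a run from the browser UI.
def summarize(state: State) -> dict:
    draft = f"A few words about {state['topic']}."  # <- breakpoint here; `state` is in scope
    return {"summary": draft}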
CLI options
langgraph dev [OPTIONS]
Options:
--debug-port INTEGER Enable remote debugging on specified port
--no-browser Skip opening browser on startup
--n-jobs-per-worker INTEGER Maximum concurrent jobs per worker process
--config PATH Custom configuration file path
--no-reload Disable code hot reloading
--port INTEGER HTTP server port (default: 8000)
--host TEXT HTTP server host (default: localhost)
License
This project is licensed under the Elastic License 2.0 - see the LICENSE file for details.
Project details
Release history
Download files
Download the file for your platform.
Source Distribution
langgraph_api_inmem-0.0.3.tar.gz (20.8 MB)
Built Distribution
langgraph_api_inmem-0.0.3-py3-none-any.whl (22.4 MB)
File details
Details for the file langgraph_api_inmem-0.0.3.tar.gz.
File metadata
- Download URL: langgraph_api_inmem-0.0.3.tar.gz
- Upload date:
- Size: 20.8 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.4.29
File hashes
Algorithm | Hash digest
---|---
SHA256 | 9ca6a4065877967028909fb1b5b9085503d7b0021067a43e663e55c865ff7e2c
MD5 | 838e623d27f50981d7eac611ce3e9dfa
BLAKE2b-256 | 317ab3666be06d5d51da368e64d651b408ee4f7115b2db4805f43e61ba50edec
File details
Details for the file langgraph_api_inmem-0.0.3-py3-none-any.whl.
File metadata
- Download URL: langgraph_api_inmem-0.0.3-py3-none-any.whl
- Upload date:
- Size: 22.4 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.4.29
File hashes
Algorithm | Hash digest
---|---
SHA256 | 161f76dd916048c59ae2007bea3ab3e066479aeeaef3d348e815908c18af82d0
MD5 | 48bd6b2b08788e01b8d93b677e071c58
BLAKE2b-256 | be5787c57bfc3799eb20b12bd4381a346f317e164a209918a7e83b569d806a93