Project description
LangGraph API (In-Memory)
This package implements a local version of the LangGraph API for rapid development and testing. Build and iterate on LangGraph agents with a tight feedback loop. The server is backed by a predominantly in-memory data store that is persisted to local disk when the server is restarted.
For production use, see the various deployment options for the LangGraph API, which are backed by a production-grade database.
Installation
Install the langgraph-cli package with the inmem extra. Your CLI version must be no lower than 0.1.55.
pip install -U langgraph-cli[inmem]
Quickstart
- (Optional) Clone a starter template:
  langgraph new --template new-langgraph-project-python ./my-project
  cd my-project
- (Recommended) Use a virtual environment and install dependencies:
  python -m venv venv
  source venv/bin/activate
  python -m pip install -e .
  python -m pip install -U langgraph-cli[inmem]
- Start the development server:
  langgraph dev --config ./langgraph.json
- The server will launch, opening a browser window with the graph UI. Interact with your graph or make code edits; the server automatically reloads on changes. (A quick connectivity check from Python is sketched just below.)
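Once the server is running you can also reach it programmatically. The sketch below is only a connectivity check and makes some assumptions: it uses the langgraph-sdk Python package (a separate install, e.g. pip install langgraph-sdk) and the default host/port listed under CLI options; adjust the URL if you start the server with --host or --port.

# Minimal sketch: confirm the local dev server is reachable and list its assistants.
# Assumes `pip install langgraph-sdk` and the default http://localhost:8000 from the
# CLI options section below; change the URL if you passed --host/--port.
from langgraph_sdk import get_sync_client

client = get_sync_client(url="http://localhost:8000")

# Each assistant corresponds to a graph registered in your langgraph.json.
for assistant in client.assistants.search():
    print(assistant["assistant_id"], assistant["graph_id"])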
Usage
Start the development server:
langgraph dev
Your agent's state (threads, runs, assistants) persists in memory while the server is running - perfect for development and testing. Each run's state is tracked and can be inspected, making it easy to debug and improve your agent's behavior.
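To make that concrete, here is a hedged sketch using the langgraph-sdk Python client (a separate install). It assumes the server is reachable at the default http://localhost:8000 and that your langgraph.json registers a graph named "agent" that accepts a messages input; substitute your own graph name and input schema.

# Sketch: create a thread, run a graph on it, then inspect the stored state.
# Assumptions: langgraph-sdk is installed, the dev server is on localhost:8000, and
# langgraph.json registers a graph called "agent" that takes a `messages` input.
from langgraph_sdk import get_sync_client

client = get_sync_client(url="http://localhost:8000")

# Threads, runs, and assistants created here live in the server's in-memory store.
thread = client.threads.create()

# Block until the run finishes and return its final output.
result = client.runs.wait(
    thread["thread_id"],
    "agent",  # assistant ID or graph name (assumed; use the name from your langgraph.json)
    input={"messages": [{"role": "user", "content": "hello"}]},
)
print(result)

# Inspect the thread's checkpointed state and its run history while debugging.
print(client.threads.get_state(thread["thread_id"]))
for run in client.runs.list(thread["thread_id"]):
    print(run["run_id"], run["status"])

The same methods are available on the async client returned by get_client, if your tooling is async.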
How-To
Attaching a debugger
Debug mode lets you attach your IDE's debugger to the LangGraph API server to set breakpoints and step through your code line-by-line.
- Install debugpy:
  pip install debugpy
- Start the server in debug mode:
  langgraph dev --debug-port 5678
- Configure your IDE:
  - VS Code: Add this launch configuration:
    {
      "name": "Attach to LangGraph",
      "type": "debugpy",
      "request": "attach",
      "connect": {
        "host": "0.0.0.0",
        "port": 5678
      }
    }
  - PyCharm: Use "Attach to Process" and select the langgraph process
- Set breakpoints in your graph code and start debugging. (An optional sketch for triggering a breakpoint from code follows below.)
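If you prefer marking a breakpoint in code rather than in the IDE, debugpy also offers a programmatic breakpoint. The sketch below is an assumption-laden illustration: it presumes your node code runs in the same process that --debug-port opened, and my_node is a hypothetical node function, not part of this package.

# Hedged sketch: pause inside a hypothetical graph node once an IDE client has
# attached to the port opened by `langgraph dev --debug-port 5678`.
import debugpy

def my_node(state: dict) -> dict:
    # Has an effect only while a debugger client is attached.
    debugpy.breakpoint()
    return {"messages": state["messages"]}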
CLI options
langgraph dev [OPTIONS]
Options:
--debug-port INTEGER Enable remote debugging on specified port
--no-browser Skip opening browser on startup
--n-jobs-per-worker INTEGER Maximum concurrent jobs per worker process
--config PATH Custom configuration file path
--no-reload Disable code hot reloading
--port INTEGER HTTP server port (default: 8000)
--host TEXT HTTP server host (default: localhost)
License
This project is licensed under the Elastic License 2.0 - see the LICENSE file for details.
Download files
File details
Details for the file langgraph_api_inmem-0.0.4.tar.gz.
File metadata
- Download URL: langgraph_api_inmem-0.0.4.tar.gz
- Upload date:
- Size: 20.8 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.4.29
File hashes
Algorithm | Hash digest
---|---
SHA256 | 390f16e6cb08b81c4675f751fad8bb9c8d351ed2c6650c54ed1b34515323c1d9
MD5 | fd3988bc58bbc81f3287e7d36d84f994
BLAKE2b-256 | 273bbc321c8bc4875e032e8d2b6b77040570a1a4ef9f4b14499bdaf3e72b25f0
File details
Details for the file langgraph_api_inmem-0.0.4-py3-none-any.whl.
File metadata
- Download URL: langgraph_api_inmem-0.0.4-py3-none-any.whl
- Upload date:
- Size: 22.4 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.4.29
File hashes
Algorithm | Hash digest
---|---
SHA256 | 33c7c5e312f1191f3cb5ed1e300d8c7301c33fb0cec413e7553af698300a97c6
MD5 | 4276d034e3f72a011aa017aa16159856
BLAKE2b-256 | b8fd1dfc579b87eed9636fb0b554f0b33c30dfaff56585e840e5e02fcb9c3118