# Cognee-Integration-LangGraph
A powerful integration between Cognee and LangGraph that provides intelligent knowledge management and retrieval capabilities for AI agents.
> **Note:** This package requires Python 3.10+ and uses async tools. All agents must use `await agent.ainvoke()` instead of `agent.invoke()`.
## Overview

`cognee-integration-langgraph` combines Cognee's advanced knowledge storage and retrieval system with LangGraph's workflow orchestration capabilities. This integration allows you to build AI agents that can efficiently store, search, and retrieve information from a persistent knowledge base.
## Features

- **Smart Knowledge Storage**: Add and persist information using Cognee's advanced indexing
- **Semantic Search**: Retrieve relevant information using natural language queries
- **Session Management**: Support for user-specific data isolation
- **LangGraph Integration**: Seamless integration with LangGraph's agent framework
- **Async Support**: Built with async/await for high-performance applications
## Installation

```shell
# Basic installation
pip install cognee-integration-langgraph

# With guide dependencies (needed for examples/guide.ipynb);
# quoted so shells like zsh don't expand the brackets
pip install "cognee-integration-langgraph[guide]"
```

The `[guide]` extra includes additional dependencies (`mediawikiapi`, `wikibase-rest-api-client`) needed for the WikiData functionality demonstrated in the guide notebook.
## Quick Start

```python
import asyncio

from langchain.agents import create_agent
from langchain_core.messages import HumanMessage

from cognee_integration_langgraph import get_sessionized_cognee_tools


async def main():
    # Get sessionized tools with a custom session ID
    add_tool, search_tool = get_sessionized_cognee_tools("user-123")
    # Or get regular tools without sessionization (auto-generates a session ID)
    # add_tool, search_tool = get_sessionized_cognee_tools()

    # Create an agent with memory capabilities
    agent = create_agent(
        "openai:gpt-4o-mini",
        tools=[add_tool, search_tool],
    )

    # Use the agent (note: must use await with .ainvoke())
    response = await agent.ainvoke({
        "messages": [
            HumanMessage(content="Remember: I like pizza and coding in Python")
        ]
    })
    print(response["messages"][-1].content)


if __name__ == "__main__":
    asyncio.run(main())
```
## Available Tools

### `get_sessionized_cognee_tools(session_id: Optional[str] = None)`

Returns Cognee tools with optional user-specific sessionization.

**Parameters:**

- `session_id` (optional): User identifier for data isolation. If not provided, a random session ID is auto-generated.

**Returns:** `(add_tool, search_tool)`, a tuple of tools for storing and searching data.
**Usage:**

```python
# With sessionization (recommended for multi-user apps)
add_tool, search_tool = get_sessionized_cognee_tools("user-123")

# Without explicit session (auto-generates a session ID)
add_tool, search_tool = get_sessionized_cognee_tools()
```
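Conceptually, auto-generation just means falling back to a random identifier when no `session_id` is supplied. A minimal sketch of that behavior (the helper name and the `session-` prefix are illustrative assumptions, not the package's actual implementation):

```python
import uuid


def resolve_session_id(session_id=None):
    # Illustrative only: return the caller's ID when one is given,
    # otherwise fall back to a random UUID-based identifier.
    return session_id if session_id is not None else f"session-{uuid.uuid4().hex}"
```

Whatever the exact scheme, the important property is that two calls without an explicit ID receive distinct sessions, so their stored data stays isolated.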
### Individual Tools

- `add_tool`: Store information in the knowledge base
- `search_tool`: Search and retrieve previously stored information
## Session Management

`cognee-integration-langgraph` supports user-specific sessions to isolate data between different users or contexts:
```python
import asyncio

from langchain.agents import create_agent

from cognee_integration_langgraph import get_sessionized_cognee_tools


async def main():
    # Each user gets their own isolated session
    user1_add, user1_search = get_sessionized_cognee_tools("user-123")
    user2_add, user2_search = get_sessionized_cognee_tools("user-456")

    # Create separate agents for each user
    agent1 = create_agent("openai:gpt-4o-mini", tools=[user1_add, user1_search])
    agent2 = create_agent("openai:gpt-4o-mini", tools=[user2_add, user2_search])

    # Each agent works with isolated data
    await agent1.ainvoke({"messages": [...]})
    await agent2.ainvoke({"messages": [...]})


if __name__ == "__main__":
    asyncio.run(main())
```
## Configuration

Copy the `.env.template` file to `.env` and fill in the required API keys:

```shell
cp .env.template .env
```

Then edit the `.env` file and set both keys to your OpenAI API key:

```
OPENAI_API_KEY=your-openai-api-key-here
LLM_API_KEY=your-openai-api-key-here
```
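If you prefer setting the keys programmatically (for example in a notebook), the same configuration can be applied via environment variables before the tools are created. A minimal sketch using only the standard library; the key value is a placeholder, and the note about which layer reads which variable is an assumption based on common Cognee/LangChain conventions:

```python
import os

# Both variables carry the same OpenAI key: OPENAI_API_KEY is the
# conventional variable read by LangChain's OpenAI integration, and
# LLM_API_KEY is the one Cognee's LLM layer typically reads.
os.environ.setdefault("OPENAI_API_KEY", "your-openai-api-key-here")
os.environ.setdefault("LLM_API_KEY", os.environ["OPENAI_API_KEY"])
```

Using `setdefault` keeps any values already exported in the shell (or loaded from `.env`) from being overwritten.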
## Examples

Check out the `examples/` directory for more comprehensive usage examples:

- `examples/example.py`: Complete workflow with contract management
- `examples/guide.ipynb`: Jupyter notebook tutorial with step-by-step guidance
## Requirements

- Python 3.10+
- An OpenAI API key
- Dependencies managed automatically via `pyproject.toml`
## Download files
### Source distribution

Details for the file `cognee_integration_langgraph-0.1.7.tar.gz`.

**File metadata**

- Download URL: cognee_integration_langgraph-0.1.7.tar.gz
- Upload date:
- Size: 6.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.2 CPython/3.13.5 Darwin/23.6.0
**File hashes**

| Algorithm | Hash digest |
|---|---|
| SHA256 | `ac625c487d62954f3b8cde5308aa80fc05d5f06498674bb053068d4012a5e0c3` |
| MD5 | `29f2ff3e2f42cdeb477bdacab4e6fde6` |
| BLAKE2b-256 | `61ae602f1e542b4439c19d95d6357ba8f25f6b548c2669b618d92b6c096d22a1` |
### Built distribution

Details for the file `cognee_integration_langgraph-0.1.7-py3-none-any.whl`.

**File metadata**

- Download URL: cognee_integration_langgraph-0.1.7-py3-none-any.whl
- Upload date:
- Size: 6.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.2 CPython/3.13.5 Darwin/23.6.0
**File hashes**

| Algorithm | Hash digest |
|---|---|
| SHA256 | `009164e7f934f9fb6dacff612eb44f334ea9a6c3e4015e76ebced5643620cb98` |
| MD5 | `1f29ef8a57381d7016dc8f73c51f964b` |
| BLAKE2b-256 | `9e02d257dc8433ab112805a43ce5835a771d37a017c60d50d3820b8717cd7a8c` |