Cognee is a library for enriching LLM context with a semantic layer for better understanding and reasoning.
cognee
Deterministic LLM Outputs for AI Engineers using graphs, LLMs, and vector retrieval
Open-source framework for creating self-improving deterministic outputs for LLMs.
Try it in a Google Colab notebook or have a look at our documentation.
If you have questions, join our Discord community
📦 Installation
With pip
pip install cognee
With poetry
poetry add cognee
💻 Usage
Setup
import os
os.environ["LLM_API_KEY"] = "YOUR_OPENAI_API_KEY"
or
import cognee
cognee.config.llm_api_key = "YOUR_OPENAI_API_KEY"
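If you would rather keep the key out of your source, a minimal sketch using a .env file is shown below; python-dotenv is an assumption here, not a cognee requirement, and any other way of exporting LLM_API_KEY works just as well:
# Assumes `pip install python-dotenv` and a .env file next to the script containing LLM_API_KEY=<your key>
import os
from dotenv import load_dotenv

load_dotenv()  # copy values from .env into the process environment
assert os.getenv("LLM_API_KEY"), "LLM_API_KEY is not set"  # cognee reads this variable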
If you are using NetworkX, create an account on Graphistry to visualize results:
cognee.config.set_graphistry_username("YOUR_USERNAME")
cognee.config.set_graphistry_password("YOUR_PASSWORD")
To run the UI, run:
docker-compose up cognee
Then navigate to localhost:3000/wizard
You can also use Ollama or Anyscale as your LLM provider. For more info on local models, check our docs.
Run
import cognee
text = """Natural language processing (NLP) is an interdisciplinary
subfield of computer science and information retrieval"""
await cognee.add([text], "example_dataset") # Add a new piece of information
await cognee.cognify() # Use LLMs and cognee to create knowledge
search_results = await cognee.search("SIMILARITY", {'query': 'Tell me about NLP'}) # Query cognee for the knowledge
print(search_results)
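Because add, cognify, and search are coroutines (they are awaited above), they need a running event loop; inside a notebook the top-level awaits work as shown, while in a plain script you can drive them with asyncio, as in this minimal sketch:
import asyncio
import cognee

async def main():
    text = "Natural language processing (NLP) is an interdisciplinary subfield of computer science and information retrieval"
    await cognee.add([text], "example_dataset")  # add a new piece of information
    await cognee.cognify()                       # use LLMs and cognee to create knowledge
    search_results = await cognee.search("SIMILARITY", {'query': 'Tell me about NLP'})
    print(search_results)

asyncio.run(main())  # run the pipeline end to end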
Add alternative data types:
cognee.add("file://{absolute_path_to_file}", dataset_name)
Or
cognee.add("data://{absolute_path_to_directory}", dataset_name)
# This is useful if you have a directory with files organized in subdirectories.
# You can target which directory to add by providing dataset_name.
# Example:
# root
# / \
# reports bills
# / \
# 2024 2023
#
# cognee.add("data://{absolute_path_to_root}", "reports.2024")
# This will add just directory 2024 under reports.
Read more here.
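The {absolute_path_to_file} and {absolute_path_to_directory} placeholders are plain absolute paths; a small sketch of building the URI with the standard library (the file name and dataset name below are only examples):
import os
import cognee

file_path = os.path.abspath("reports/2024/summary.pdf")     # hypothetical local file
await cognee.add(f"file://{file_path}", "example_dataset")  # becomes file:///.../reports/2024/summary.pdf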
Vector retrieval, Graphs and LLMs
Cognee supports a variety of tools and services for different operations:
- Local Setup: By default, LanceDB runs locally with NetworkX and OpenAI.
- Vector Stores: Cognee supports Qdrant and Weaviate for vector storage.
- Language Models (LLMs): You can use either Anyscale or Ollama as your LLM provider.
- Graph Stores: In addition to NetworkX, Neo4j is also supported for graph storage.
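Switching providers is a matter of configuration before calling add and cognify. The exact setting names depend on the cognee version, so the snippet below is only a hedged sketch: the environment variable names are assumptions rather than the documented interface, and the docs linked above list the real keys:
import os

# Sketch only -- these variable names are assumptions, not cognee's documented settings.
os.environ["VECTOR_DB_PROVIDER"] = "qdrant"       # e.g. Qdrant or Weaviate instead of the default LanceDB
os.environ["VECTOR_DB_URL"] = "http://localhost:6333"
os.environ["GRAPH_DB_PROVIDER"] = "neo4j"         # e.g. Neo4j instead of the default NetworkX
os.environ["LLM_PROVIDER"] = "ollama"             # e.g. Ollama or Anyscale instead of OpenAI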
Demo
Check out our demo notebook here
How it works
Star History
File details
Details for the file cognee-0.1.14.tar.gz.
File metadata
- Download URL: cognee-0.1.14.tar.gz
- Upload date:
- Size: 318.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.7.1 CPython/3.12.2 Darwin/23.5.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 9a668b55cf4ccae8c27066b696caeb05a73c9bfbe476f4e4903523c2e182c242
MD5 | b4857758ad5d73db2f64635fa1876f28
BLAKE2b-256 | 767b5ad74bd8ca9f1bd20ff6e8155d34b37bad224f1eb80c7f89c8e00f2e38a9
File details
Details for the file cognee-0.1.14-py3-none-any.whl.
File metadata
- Download URL: cognee-0.1.14-py3-none-any.whl
- Upload date:
- Size: 394.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.7.1 CPython/3.12.2 Darwin/23.5.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 74c89f970509c2899c1555d4f4a5cf779fbc6533a76a7c084ddd182a17fde06b
MD5 | c1c00f8528400945c665aa070ae88ffb
BLAKE2b-256 | c6353e887ee0f6388b43f1f4ff253782fa82595c8567e526e2c788f2dfdabe9b
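To verify an already-downloaded artifact against the digests above, a short standard-library sketch (shown for the wheel; swap in the tar.gz name and its SHA256 for the source distribution):
import hashlib

expected = "74c89f970509c2899c1555d4f4a5cf779fbc6533a76a7c084ddd182a17fde06b"  # SHA256 of the wheel, from the table above
with open("cognee-0.1.14-py3-none-any.whl", "rb") as f:  # assumes the wheel is in the current directory
    actual = hashlib.sha256(f.read()).hexdigest()
print("OK" if actual == expected else "MISMATCH: " + actual)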