Cognee is a library that enriches LLM context with a semantic layer for better understanding and reasoning.
cognee
Deterministic LLM outputs for AI engineers using graphs, LLMs and vector retrieval
An open-source framework for creating self-improving, deterministic outputs for LLMs.
Try it in a Google Colab notebook or have a look at our documentation.
If you have questions, join our Discord community.
📦 Installation
With pip
pip install cognee
With poetry
poetry add cognee
💻 Usage
Setup
import os
os.environ["LLM_API_KEY"] = "YOUR_OPENAI_API_KEY"
or
import cognee
cognee.config.llm_api_key = "YOUR_OPENAI_API_KEY"
To start the UI, run:
docker-compose up cognee
Then navigate to localhost:3000/wizard.
You can also use Ollama or Anyscale as your LLM provider. For more info on local models, check our docs.
Run
import cognee
text = """Natural language processing (NLP) is an interdisciplinary
subfield of computer science and information retrieval"""
cognee.add([text], "example_dataset") # Add a new piece of information
cognee.cognify() # Use LLMs and cognee to create knowledge
search_results = cognee.search("SIMILARITY", "computer science") # Query cognee for the knowledge
for result_text in search_results[0]:
    print(result_text)
Add alternative data types:
cognee.add("file://{absolute_path_to_file}", dataset_name)
Or
cognee.add("data://{absolute_path_to_directory}", dataset_name)
# This is useful if you have a directory with files organized in subdirectories.
# You can target which directory to add by providing dataset_name.
# Example:
#   root
#   ├── reports
#   │   ├── 2024
#   │   └── 2023
#   └── bills
#
# cognee.add("data://{absolute_path_to_root}", "reports.2024")
# This will add just the directory 2024 under reports.
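Assuming cognee resolves a dotted dataset_name against the directory tree as described above, the mapping can be sketched with the standard library. The helper below is purely illustrative and not part of cognee's API:

```python
from pathlib import Path

def dataset_to_subdir(root: str, dataset_name: str) -> Path:
    """Sketch of the mapping described above: 'reports.2024' -> root/reports/2024.

    Hypothetical helper for illustration; cognee resolves this internally.
    """
    return Path(root).joinpath(*dataset_name.split("."))

# "reports.2024" targets just the 2024 directory under reports.
print(dataset_to_subdir("/data/root", "reports.2024"))
```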
Read more here.
Vector retrieval, Graphs and LLMs
Cognee supports a variety of tools and services for different operations:
- Local Setup: By default, LanceDB runs locally with NetworkX and OpenAI.
- Vector Stores: Cognee supports Qdrant and Weaviate for vector storage.
- Language Models (LLMs): You can use either Anyscale or Ollama as your LLM provider.
- Graph Stores: In addition to LanceDB, Neo4j is also supported for graph storage.
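One common way to select among alternative backends is via environment variables set before importing cognee. The variable names below are assumptions for illustration only; check the cognee docs for the exact configuration keys your version supports:

```python
import os

# Hypothetical configuration keys, shown for illustration only --
# consult the cognee docs for the exact names your version expects.
os.environ["VECTOR_DB_PROVIDER"] = "qdrant"   # or "weaviate" / "lancedb"
os.environ["GRAPH_DB_PROVIDER"] = "neo4j"     # or "networkx"
os.environ["LLM_PROVIDER"] = "openai"         # or "anyscale" / "ollama"
```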
Demo
Check out our demo notebook here
How it works
Star History