Lexi
Lexi is a local LLM-based solution - includes Chat UI, RAG, LLM Proxy, and Document importing
Currently supports Atlassian Cloud Confluence.
System Overview
Getting started
Prerequisites
- Python 3.8 or later
- Docker (with Docker Compose)
Installation
pip install lexi
Configuration
lexi system create_envrc \
--confluence-url "https://some-company.atlassian.net/wiki" \
--confluence-space-key "some-company" \
--confluence-email "user@some-company.com" \
--confluence-space-name "" \
--compose-project-name "lexi-system" \
--litellm-log "INFO"
You will be prompted for your Confluence and OpenAI API keys.
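The command writes the configuration to an envrc file. A rough sketch of what the generated file might contain, assuming conventional variable names (the exact names Lexi uses may differ; COMPOSE_PROJECT_NAME and LITELLM_LOG are standard Docker Compose and LiteLLM variables):

```shell
# Hypothetical .envrc contents -- variable names are illustrative assumptions
export CONFLUENCE_URL="https://some-company.atlassian.net/wiki"
export CONFLUENCE_SPACE_KEY="some-company"
export CONFLUENCE_EMAIL="user@some-company.com"
export CONFLUENCE_API_KEY="..."    # entered at the interactive prompt
export OPENAI_API_KEY="..."        # entered at the interactive prompt
export COMPOSE_PROJECT_NAME="lexi-system"
export LITELLM_LOG="INFO"
```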
Start the system
lexi system up
Stop the system
lexi system down
Loading documents
After installing and setting up the environment with lexi system create_envrc,
run the following to import the Confluence documents:
lexi load
You'll see the text from Confluence being processed. The output should end with something like this:
Successfully imported 172 documents and 759 chunks
Imported 172 documents into Weaviate.
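As the counts above show, each document is split into several chunks before being stored in Weaviate for retrieval. A minimal, self-contained sketch of this kind of overlapping text chunking (chunk size and overlap here are illustrative assumptions, not Lexi's actual settings):

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into fixed-size chunks that overlap by `overlap` characters.

    Sizes are illustrative assumptions; RAG pipelines typically overlap
    chunks so that sentences cut at a boundary still appear whole in one chunk.
    """
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break
    return chunks

# A 1200-character document yields 3 overlapping chunks with these settings.
print(len(chunk_text("x" * 1200)))  # -> 3
```

Each chunk is then embedded and indexed individually, which is why the chunk count (759) is several times the document count (172).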