
Lexi

Lexi is a local LLM-based solution - includes Chat UI, RAG, LLM Proxy, and Document importing

Currently supports Atlassian Cloud Confluence.

System Overview

Getting started

Prerequisites

  • Python 3.8 or later
  • Docker (with Docker Compose)

Installation

pip install lexi
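
Once installed, the lexi command should be on your PATH. A quick sanity check, assuming the CLI exposes the conventional --help flag:

lexi --help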

Configuration

lexi system create_envrc \
    --confluence-url "https://some-company.atlassian.net/wiki" \
    --confluence-space-key "some-company" \
    --confluence-email "user@some-company.com" \
    --confluence-space-name "" \
    --compose-project-name "lexi-system" \
    --litellm-log "INFO"

You will be prompted for your Confluence and OpenAI API keys.
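
This writes an .envrc file, presumably in the current directory. A minimal sketch of inspecting and loading it into your shell, assuming it is a plain file of shell exports; if you use direnv, running direnv allow loads it automatically instead:

cat .envrc
source .envrc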

Start the system

lexi system up
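
This brings up the Docker Compose stack. A hedged way to confirm the containers are running, using the project name chosen during configuration (lexi-system in the example above):

docker compose -p lexi-system ps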

Stop the system

lexi system down

Loading documents

After installing and setting up the environment with lexi system create_envrc, run the following to import the Confluence documents:

lexi load

You'll see the text from Confluence being processed. The output should end with something like this:

Succesfully imported 172 documents and 759 chunks
Imported 172 documents into Weaviate.
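
The chunks are stored in the stack's Weaviate instance. A hypothetical spot check against Weaviate's REST API, assuming the instance is exposed on localhost at Weaviate's default port 8080 (adjust to your Compose setup):

curl "http://localhost:8080/v1/objects?limit=2"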
