
llamka - Attempt to make a Second Brain - a simple standalone UI for a RAG loader pipeline/chatbot/API

This project has been archived by its maintainers. No new releases are expected.

Project description

llamka


MLOps for building second brain

There are a few principles we want to build llamka around:

| Principle | Because |
| --- | --- |
| Standardized interface across model vendors | Ability to substitute models and store conversations uniformly |
| Automation of pipelines | Consistency |
| Maintain targeted conversation datasets | Capture requirements and provide data for finetuning and evaluation |
| Model evaluation and management | Informed decision-making about the trade-off between open and closed models |
| Preference for open-weight models and self-hosting | Ability to finetune, privacy, and lower cost |

Architecture

  • llore - (llamka-core) Tornado app that keeps everything running. llore provides the API, and all other services are built on top of it

    • MVP
      • provide an abstraction interface over LLM providers (local: ollama; remote: OpenAI, Claude)
      • maintain external document locations/versions and rebuild vector db collections on changes
      • augmentation pipelines and chatbot config
      • completion api to interact with configured models and chatbots
    • later
      • store chatbot interactions in oumi conversation format per user if instructed
      • maintain chatbot library (includes: system prompt, finetuning and augmentation pipelines, data dependencies, and build schedule)
      • maintain conversation datasets and use them for tuning, RAG and testing
      • schedule and run pipelines: finetuning, RAG, chatbot evaluation
      • monitor and run services: databases, queues, proxies, ui (llit)
  • llit - streamlit UI

    • MVP
      • Chatbot interaction
    • later
      • Inspect previous conversations, or conversations from the library
      • Inspect data: sql, document db, vector db
      • Inspect logs from pipelines
      • Configuration: models, chatbot, pipelines
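As a rough sketch of the MVP's provider abstraction (the names, signatures, and the stand-in provider below are illustrative assumptions, not llamka's actual API), the idea is that callers depend only on a uniform interface, so local and remote vendors are interchangeable:

```python
from typing import Protocol


class ChatProvider(Protocol):
    """Uniform completion interface over model vendors (local or remote)."""

    def complete(self, messages: list[dict[str, str]]) -> str: ...


class EchoProvider:
    """Stand-in provider used here instead of a real ollama/OpenAI adapter."""

    def complete(self, messages: list[dict[str, str]]) -> str:
        # Echo the last user message; a real adapter would call the vendor API.
        return messages[-1]["content"].upper()


def chat(provider: ChatProvider, prompt: str) -> str:
    """Depends only on the Protocol, so vendors can be substituted freely."""
    return provider.complete([{"role": "user", "content": prompt}])


print(chat(EchoProvider(), "hello"))
```

Because conversations pass through one interface, they can also be stored in one format regardless of which backend produced them, which is what makes the "substitute and store uniformly" principle workable.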

Got the API and Streamlit endpoints working, almost at MVP!

(Screenshot, 2025-06-08)


Project Docs

For how to install uv and Python, see installation.md.

For development workflows, see development.md.

For instructions on publishing to PyPI, see publishing.md.


This project was built from simple-modern-uv.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llamka-0.0.5.tar.gz (27.3 MB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

llamka-0.0.5-py3-none-any.whl (29.5 kB)

Uploaded Python 3

File details

Details for the file llamka-0.0.5.tar.gz.

File metadata

  • Download URL: llamka-0.0.5.tar.gz
  • Upload date:
  • Size: 27.3 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for llamka-0.0.5.tar.gz
Algorithm Hash digest
SHA256 a5fb907ff023265287679dd33c0e4c392eafd8a9a1fb1bb29be710b3dcf27443
MD5 3b604df2273ed3fcabe7c4f932c4447c
BLAKE2b-256 9d181368ccbfdaf89674b9c5a56255815d3643f5380da7e39e67902942c0c4f0

See more details on using hashes here.
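To check a downloaded archive against the digests above, you can compute its SHA256 locally, for example with Python's standard `hashlib` (a minimal sketch; the filename and expected digest are the ones from this release):

```python
import hashlib


def sha256_of(path: str) -> str:
    """Compute the SHA256 hex digest of a file, streaming it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


# Compare against the published digest before trusting the archive, e.g.:
# expected = "a5fb907ff023265287679dd33c0e4c392eafd8a9a1fb1bb29be710b3dcf27443"
# assert sha256_of("llamka-0.0.5.tar.gz") == expected
```

The same check works for the wheel by substituting its filename and SHA256 value.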

Provenance

The following attestation bundles were made for llamka-0.0.5.tar.gz:

Publisher: publish.yml on walnutgeek/llamka

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file llamka-0.0.5-py3-none-any.whl.

File metadata

  • Download URL: llamka-0.0.5-py3-none-any.whl
  • Upload date:
  • Size: 29.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for llamka-0.0.5-py3-none-any.whl
Algorithm Hash digest
SHA256 908886d6aaea8c7437eb3ac10204ede2c692d3d0eeb8360f8a6d6ccdc25e211f
MD5 8051088a10436fd6e9496e6c032fa5d1
BLAKE2b-256 22a52ad58467aa73cdeefaa1221fd58cee48cdd9f8e1fb4363437b4bb179f44b

See more details on using hashes here.

Provenance

The following attestation bundles were made for llamka-0.0.5-py3-none-any.whl:

Publisher: publish.yml on walnutgeek/llamka

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
