A minimal LLM network chat server/client app.

chatthy

An asynchronous terminal server/multiple-client setup for conducting and managing chats with LLMs.

This is the successor project to llama-farm.

The RAG/agent functionality should be split out into an API layer.

network architecture

  • client/server RPC-type architecture
  • message signing
  • ensure stream chunk ordering
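The signing and ordering points above can be sketched as follows. This is a minimal illustration, not the project's actual wire protocol: it assumes a shared HMAC key for message signing and a per-stream sequence number for reassembling chunks that arrive out of order.

```python
import hmac
import hashlib
import json

SECRET = b"shared-secret"  # hypothetical pre-shared key


def sign(payload: dict) -> dict:
    """Attach an HMAC-SHA256 signature over the canonical JSON payload."""
    body = json.dumps(payload, sort_keys=True).encode()
    return {"payload": payload,
            "sig": hmac.new(SECRET, body, hashlib.sha256).hexdigest()}


def verify(msg: dict) -> bool:
    """Recompute the signature and compare in constant time."""
    body = json.dumps(msg["payload"], sort_keys=True).encode()
    expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, msg["sig"])


class ChunkReorderer:
    """Buffer out-of-order stream chunks, releasing them in sequence order."""

    def __init__(self):
        self.next_seq = 0
        self.pending: dict[int, str] = {}

    def feed(self, seq: int, chunk: str) -> list[str]:
        self.pending[seq] = chunk
        released = []
        while self.next_seq in self.pending:
            released.append(self.pending.pop(self.next_seq))
            self.next_seq += 1
        return released
```

A client would verify each incoming message, then feed its chunks to the reorderer and print whatever is released.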

chat management

  • basic chat persistence and management
  • set, switch to saved system prompts (personalities)
  • manage prompts like chats (as files)
  • chat truncation to token length
  • rename chat
  • profiles (profile x personalities -> sets of chats)
  • import/export chat to client-side file
  • remove text between tags when saving
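Two of the items above (truncation to token length, removing tagged text on save) can be sketched like this. The tag name and the whitespace-split token approximation are assumptions for illustration, not the project's actual tokenizer or tag convention.

```python
import re


def strip_tagged(text: str, tag: str = "think") -> str:
    """Remove text between <tag>...</tag> pairs before saving a chat."""
    return re.sub(rf"<{tag}>.*?</{tag}>", "", text, flags=re.DOTALL)


def truncate_chat(messages: list[dict], max_tokens: int) -> list[dict]:
    """Keep the most recent messages that fit within the token budget.

    Token count is approximated by whitespace splitting; a real
    implementation would use the model's tokenizer.
    """
    kept, total = [], 0
    for msg in reversed(messages):
        total += len(msg["content"].split())
        if total > max_tokens:
            break
        kept.append(msg)
    return list(reversed(kept))
```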

context workspace

  • context workspace (load/drop files)
  • client inject from file
  • client inject from other sources, e.g. youtube (trag)
  • templates for standard instruction requests (trag)
  • context workspace - bench/suspend files (hidden by filename)
  • local files / folders in transient workspace
  • checkboxes for delete / show / hide
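The bench/suspend-by-filename idea could work as below. The leading-underscore convention is an assumption for illustration; the point is that hidden status is encoded in the name itself, so no separate state file is needed.

```python
from pathlib import Path

HIDE_PREFIX = "_"  # assumed convention: benched files get a leading underscore


def is_benched(name: str) -> bool:
    return Path(name).name.startswith(HIDE_PREFIX)


def bench(name: str) -> str:
    """Rename a workspace file so it is hidden from the context."""
    p = Path(name)
    return name if is_benched(name) else str(p.with_name(HIDE_PREFIX + p.name))


def unbench(name: str) -> str:
    """Restore a benched file's visible name."""
    p = Path(name)
    return str(p.with_name(p.name[len(HIDE_PREFIX):])) if is_benched(name) else name


def visible(files: list[str]) -> list[str]:
    """Only non-benched files are sent to the model."""
    return [f for f in files if not is_benched(f)]
```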

client interface

  • can switch between Anthropic, OpenAI, tabbyAPI providers and models
  • streaming
  • syntax highlighting
  • decent REPL
  • REPL command mode
  • cut/copy from output
  • client-side prompt editing
  • vimish keys in output
  • client-side chat/message editing (how? temporarily set the input field history, or launch $EDITOR from the client?) - for now, edit via local chat import/export
  • LaTeX rendering (tricky in the context of prompt-toolkit, but see flatlatex)
  • generation cancellation
  • tkinter UI

multimodal

  • design with multimodal models in mind
  • image sending and use
  • image display

miscellaneous / extensions

  • use proper config dir (group?)
  • dump default conf if missing

tool / agentic use

Use agents at the API level, which is to say, use an intelligent router. This separates the chatthy system from the RAG/LLM logic.
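The routing idea can be sketched as below. This is a toy dispatcher, not the project's design: the `!tool` prefix convention and the callables are assumptions, but it shows the separation — chatthy only forwards requests, while the tool/RAG logic lives behind the router.

```python
from typing import Callable


class Router:
    """Route plain chat to the LLM; dispatch recognised tool requests."""

    def __init__(self, llm: Callable[[str], str]):
        self.llm = llm
        self.tools: dict[str, Callable[[str], str]] = {}

    def register(self, name: str, fn: Callable[[str], str]) -> None:
        self.tools[name] = fn

    def route(self, request: str) -> str:
        # naive dispatch: "!toolname args" invokes a tool, anything else
        # goes straight to the model
        if request.startswith("!"):
            name, _, args = request[1:].partition(" ")
            if name in self.tools:
                return self.tools[name](args)
        return self.llm(request)
```

An intelligent router would replace the prefix check with a classifier or a model-driven tool-choice step, but the interface to the chat layer stays the same.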

  • (auto) tools (evolve from llama-farm -> trag)
  • user defined tool plugins
  • server use vdb context at LLM will (tool)
  • iterative workflows (refer to llama-farm, consider smolagents)
  • tool chains
  • tool: workspace file write, delete
  • tool: workspace file patch/diff
  • tool: rag query tool
  • MCP agents?
  • smolagents / archgw?

RAG

  • summaries and standard client instructions (trag)
  • server use vdb context on request
  • set RAG provider client-side (e.g. Mistral Small, Phi-4)
  • consider best method of pdf conversion / ingestion (fvdb), OOB (image models?)
  • full arxiv paper ingestion (fvdb) - consolidate into one latex file OOB
  • vdb result reranking with context, and winnowing (agent?)
  • vdb results -> workspace (agent?)
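The reranking-and-winnowing step might look like the sketch below: drop low-scoring hits, deduplicate, and keep only the top few before they reach the context. The score threshold and `(score, text)` pair shape are assumptions for illustration.

```python
def winnow(results: list[tuple[float, str]],
           min_score: float = 0.5,
           max_keep: int = 3) -> list[tuple[float, str]]:
    """Filter vector-DB hits: best-first, above threshold, deduplicated."""
    seen: set[str] = set()
    kept: list[tuple[float, str]] = []
    for score, text in sorted(results, key=lambda r: r[0], reverse=True):
        if score < min_score or text in seen:
            continue
        seen.add(text)
        kept.append((score, text))
        if len(kept) == max_keep:
            break
    return kept
```

An agent-driven version would replace the fixed threshold with a model judgment of relevance against the query context.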

unallocated / out of scope

  • audio streaming? - see matatonic's servers
  • workflows (tree of instruction templates)
  • tasks
