Paita - Python AI Textual Assistant
Paita is a textual assistant for your terminal that supports multiple AI services and models.
Key Features
- Supports Multiple AI Services: Paita integrates with a variety of AI services through the LangChain library. If an AI service is compatible with LangChain, it can also be used with Paita.
- Textual User Interface in your terminal: Paita is built on Textual and provides a sophisticated user interface right within your terminal, combining the richness of a GUI with the simplicity of the console.
- Cross-Platform Compatibility: Paita is compatible with Windows, macOS, and Linux systems across most terminals; if Python runs in your environment and Textual supports it, then Paita will work.
- Supports Retrieval-Augmented Generation (RAG): Paita supports a local vectorstore (Chroma) and crawling web page content.
Supported AI Services
- OpenAI
- AWS Bedrock
- Ollama (local models)
Getting Started
Prerequisites
- Python 3.8.1+
- Access to an AI service, configured in your terminal
Installation and running
Install using pip (or pipx)
pip install paita
Run and enjoy!
paita
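If you prefer an isolated, application-style install, the pipx route mentioned above works the same way (assuming pipx itself is already installed):
pipx install paita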
Some keyboard shortcuts
Paita is a textual UI application, so using keyboard shortcuts is recommended:
- Use `tab` and `shift`+`tab` to navigate between the input field, the send button, and the question/answer boxes
- While a question/answer box is focused, use `enter` to "focus in" and `esc` to "focus out"
- Use `c` to copy content from a question/answer box
- Contextual keyboard shortcuts are shown at the bottom of the UI
Configuring AI Service(s) and model access
OpenAI
OpenAI usage requires a valid API key in an environment variable:
export OPENAI_API_KEY=<OpenAI API Key>
AWS Bedrock
Enable AI model access in AWS Bedrock and configure AWS credential access accordingly.
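As a minimal sketch, credentials can be supplied through the standard AWS environment variables (the values below are placeholders); a configured AWS CLI profile or any other mechanism supported by the AWS SDK works as well:
export AWS_ACCESS_KEY_ID=<access-key-id>
export AWS_SECRET_ACCESS_KEY=<secret-access-key>
export AWS_DEFAULT_REGION=<aws-region>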
Ollama
Ollama enables running chat models locally.
Install Ollama for your operating system or use the official [Docker image](https://hub.docker.com/r/ollama/ollama).
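If you use the Docker image instead, something like the following starts the server and pulls a model inside the container (a sketch based on the image's documented usage; adjust the volume, port, and model name as needed):
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
docker exec -it ollama ollama pull llama3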
Once Ollama is installed, pull the desired model from the registry, e.g.
ollama pull llama3
By default, Paita connects to the local Ollama endpoint. Optionally, you can configure the endpoint URL with an environment variable:
export OLLAMA_ENDPOINT=<protocol>://<ollama-host-address>:<ollama-host-port>
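For example, assuming Ollama listens on its default port 11434 on another machine (the hostname below is a placeholder):
export OLLAMA_ENDPOINT=http://my-ollama-host:11434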
Feedback
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file paita-0.1.13.tar.gz.
File metadata
- Download URL: paita-0.1.13.tar.gz
- Upload date:
- Size: 314.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: python-httpx/0.27.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 38da89637e85d4e2824208f46189e2f78d9061c684ee867fc662e65803186220
MD5 | c54cf3279b8dc3eefe091d2a25ddd8f4
BLAKE2b-256 | cf4a79731eae50b3c96b64473f33fc404826a82a3d91a3cd7cd781b6cd345f29
File details
Details for the file paita-0.1.13-py3-none-any.whl.
File metadata
- Download URL: paita-0.1.13-py3-none-any.whl
- Upload date:
- Size: 33.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: python-httpx/0.27.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 1b246fbc1a901f088578ff13d84c983629a55626f2cdd94aeaaad6bc020e7411
MD5 | e3eaeecafbbb9bc2f31149f3bd70adcd
BLAKE2b-256 | 1df19d277eee573a9e045f32dbac6eec2ad3ab9c3313fbe83bb767fbf7bf6036