This project provides a FastAPI application to create and update GitHub gists using the GitHub API. It includes SQLite for persistence and is designed to run in a GitHub Codespace.
Project description
Ai Gist
Created by rUv
Ai Gist is a Python application designed to help you create and manage GitHub gists effortlessly using the GitHub API. With built-in SQLite persistence and seamless integration with AI language models (LLMs), Ai Gist offers powerful capabilities for automating your workflow.
Key Features
- Easy Gist Management: Create, update, and list GitHub gists with simple API calls.
- AI-Powered Interactions: Utilize advanced AI language models to interpret user messages and perform actions based on responses.
- SQLite Integration: Store and manage data locally with SQLite, ensuring persistence and reliability.
- FastAPI Framework: Benefit from the high performance and ease of use provided by the FastAPI framework.
Get started quickly by installing the application and running the server to leverage these features for an enhanced coding experience.
Setup
- Create a GitHub Codespace:
  - Create a new GitHub Codespace for your repository.
  - Ensure your GitHub token is stored as a secret in the Codespace.
- Install Dependencies:
  - The setup script automatically installs the necessary dependencies and sets up the environment.
- Install the Package:
  - Install the package from PyPI:
    pip install aigist
- Initialize the Database:
  - Run the database initialization script to create the necessary database and tables:
    python init_db.py
- Run the Application:
  - Start the FastAPI server using Uvicorn (a programmatic alternative is sketched after this list):
    aigist
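If the aigist console script is not on your PATH, the server can also be launched with Uvicorn directly. This is a minimal sketch; the module path aigist.main:app is an assumption about where the FastAPI app object lives, so adjust it to match the installed package.

```python
# Minimal sketch: run the server with Uvicorn directly instead of the aigist command.
# NOTE: "aigist.main:app" is an assumed import path, not confirmed by the project docs.
import uvicorn

if __name__ == "__main__":
    uvicorn.run("aigist.main:app", host="0.0.0.0", port=8000, reload=True)
```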
Configuration
- Environment Variables: Ensure the following environment variables are set (a quick sanity check is sketched below):
  - GITHUB_TOKEN: Your GitHub token.
  - LITELLM_API_BASE: The base URL for the LiteLLM API.
  - LITELLM_API_KEY: Your LiteLLM API key.
  - LITELLM_MODEL: The model to use for LiteLLM (e.g., gpt-4o-2024-05-13).
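A quick way to confirm the configuration is in place before starting the server is to check the environment from Python; the sketch below only verifies that the variables listed above are set.

```python
# Sanity-check that the required environment variables are set before starting the app.
import os

REQUIRED = ["GITHUB_TOKEN", "LITELLM_API_BASE", "LITELLM_API_KEY", "LITELLM_MODEL"]

missing = [name for name in REQUIRED if not os.getenv(name)]
if missing:
    raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
print("All required environment variables are set.")
```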
Endpoints
- Create Gist: POST /gists (see the example call after this list)
  - Request Body:
    { "description": "Sample Gist", "public": true, "files": [ { "filename": "sample.txt", "content": "This is a sample gist content." } ] }
- Update Gist: PATCH /gists/{gist_id}
  - Request Body:
    { "description": "Updated Sample Gist", "files": [ { "filename": "sample.txt", "content": "This is the updated sample gist content." } ] }
- List Gists: GET /gists
  - Query Parameters:
    - page: The page number (default: 1).
    - per_page: The number of gists per page (default: 30, max: 100).
    - since: Filter gists created after this date (ISO 8601 format).
    - until: Filter gists created before this date (ISO 8601 format).
- Chat Completion: POST /chat
  - Request Body:
    { "messages": [ { "role": "user", "content": "Your message here" } ], "stream": false }
- Chat Gist: POST /chat/gist
  - Request Body:
    { "messages": [ { "role": "user", "content": "Your message here" } ], "stream": false }
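For the Create Gist endpoint, a call from Python might look like the following sketch. It uses the requests library and assumes the server is running locally on Uvicorn's default port (http://localhost:8000).

```python
# Minimal sketch: create a gist through the running API using the requests library.
# The base URL assumes a local server on Uvicorn's default port (an assumption).
import requests

payload = {
    "description": "Sample Gist",
    "public": True,
    "files": [
        {"filename": "sample.txt", "content": "This is a sample gist content."}
    ],
}

resp = requests.post("http://localhost:8000/gists", json=payload, timeout=30)
resp.raise_for_status()
print(resp.json())  # the created gist as returned by the API
```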
Testing
Run the tests using pytest:
pytest
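If you add your own tests, a minimal FastAPI test that pytest would collect could look like the sketch below. The import path aigist.main is an assumption about where the package exposes its FastAPI app object; adjust it to the actual module.

```python
# Minimal sketch of a pytest test against the FastAPI app.
# NOTE: the import path below is assumed, not confirmed by the project docs.
from fastapi.testclient import TestClient

from aigist.main import app  # hypothetical import path

client = TestClient(app)


def test_openapi_schema_lists_gist_routes():
    # FastAPI serves its generated OpenAPI schema at /openapi.json by default.
    resp = client.get("/openapi.json")
    assert resp.status_code == 200
    assert "/gists" in resp.json()["paths"]
```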
Database Setup
To ensure the database and table setup is correct, run the init_db.py script:
python init_db.py
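For reference, the kind of work init_db.py performs looks roughly like the sketch below; the database file name, table name, and columns are assumptions for illustration, not the project's actual schema.

```python
# Illustrative sketch of SQLite initialization; schema details are assumptions.
import sqlite3


def init_db(path: str = "aigist.db") -> None:
    conn = sqlite3.connect(path)
    conn.execute(
        """
        CREATE TABLE IF NOT EXISTS gists (
            id TEXT PRIMARY KEY,
            description TEXT,
            public INTEGER NOT NULL DEFAULT 1,
            created_at TEXT
        )
        """
    )
    conn.commit()
    conn.close()


if __name__ == "__main__":
    init_db()
```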
Example JSON Payloads
- Create Gist:
  { "description": "Sample Gist", "public": true, "files": [ { "filename": "sample.txt", "content": "This is a sample gist content." } ] }
- Update Gist (an example PATCH call follows this list):
  { "description": "Updated Sample Gist", "files": [ { "filename": "sample.txt", "content": "This is the updated sample gist content." } ] }
Chat System
The chat system leverages LiteLLM to interpret user messages and perform actions based on the responses. The endpoint /chat/gist handles incoming chat requests, processes them through LiteLLM, and creates or updates gists according to the model's reply.
How It Works
- Receiving Chat Messages: The endpoint /chat/gist receives a POST request with a list of chat messages.
- Calling LiteLLM API: The chat_completion function sends these messages to the LiteLLM API using the GPT-4o model.
- Processing the Response: Based on the action field in the response, the endpoint either creates or updates a gist (a condensed sketch of this flow follows the list).
- Handling Errors: Any exceptions are caught and a 500 Internal Server Error is returned with the error details.
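The sketch below condenses this flow, assuming the model is prompted to reply with JSON that carries an action field; the function name and response structure are illustrative, not the project's actual implementation.

```python
# Illustrative sketch of the chat -> action flow; names and JSON shape are assumptions.
import json
import os

from litellm import completion


def decide_gist_action(messages: list[dict]) -> dict:
    """Send chat messages through LiteLLM and parse the action the model chose."""
    reply = completion(
        model=os.environ["LITELLM_MODEL"],
        messages=messages,
        api_base=os.environ.get("LITELLM_API_BASE"),
        api_key=os.environ.get("LITELLM_API_KEY"),
    )
    content = reply.choices[0].message.content
    decision = json.loads(content)  # e.g. {"action": "create_gist", "payload": {...}}

    if decision.get("action") not in {"create_gist", "update_gist"}:
        return {"detail": "no gist action requested"}
    return decision  # the endpoint would dispatch to its create/update gist logic here
```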
LiteLLM Features
LiteLLM offers numerous features that streamline interaction with various LLM providers:
- Unified Interface: Supports 100+ LLMs using the same Input/Output format, including OpenAI, Hugging Face, Anthropic, Cohere, and more.
- Error Handling and Retries: Automatic error handling and retry mechanism, switching to alternative providers if one fails.
- Streaming Support: Efficiently handle memory-intensive tasks by retrieving large model outputs in chunks.
- Open-Source and Community-Driven: Benefit from transparency and ongoing development by the open-source community.
- Reduced Complexity: Simplifies interactions with different provider APIs.
- Increased Flexibility: Allows experimentation with various LLMs.
- Improved Efficiency: Saves time with uniform code structure and automated error handling.
- Cost-Effectiveness: Optimizes costs by exploring different pricing models across providers.
For more details, refer to the LiteLLM documentation.
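As a small illustration of the streaming support mentioned above, LiteLLM yields response chunks incrementally when stream=True; the model name here is only an example fallback.

```python
# Illustrative sketch of LiteLLM streaming; prints tokens as they arrive.
import os

from litellm import completion

response = completion(
    model=os.environ.get("LITELLM_MODEL", "gpt-4o"),
    messages=[{"role": "user", "content": "Summarize what a GitHub gist is."}],
    stream=True,
)

for chunk in response:
    delta = chunk.choices[0].delta.content or ""
    print(delta, end="", flush=True)
```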
Download files
File details
Details for the file aigist-0.0.4.tar.gz.
File metadata
- Download URL: aigist-0.0.4.tar.gz
- Upload date:
- Size: 10.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.9.19
File hashes
Algorithm | Hash digest
---|---
SHA256 | 8f3ba2e51fa60c8a8a096abf8e29337ca95b71c4c17b7152fb05d33e2f4f9912
MD5 | 1d2e15d4e2b82eb67409804459d37bb4
BLAKE2b-256 | 55624d9d0675138d97b4345a18cf425a9be11e826c1899e674653210d7176f8d
File details
Details for the file aigist-0.0.4-py3-none-any.whl.
File metadata
- Download URL: aigist-0.0.4-py3-none-any.whl
- Upload date:
- Size: 11.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.9.19
File hashes
Algorithm | Hash digest
---|---
SHA256 | 714e133ce4028e13330e972501459b5879300927ee23d0b826ae3474e3e9ce7b
MD5 | bbb74b15f65381795cc103dc3c23d6f3
BLAKE2b-256 | cfc04f17babc98848c8d9e7f8fa4aa6da546f388a32974155261545eaeeb21dd