# MyText: A Minimal AI-Powered Text Rewriting Tool
## Overview
MyText is a lightweight, AI-powered text enhancement tool that rewrites, paraphrases, and adjusts tone using modern LLM providers. It offers a clean command-line interface and a minimal Python API, supports multiple providers (Google AI Studio, Cloudflare Workers AI, OpenRouter, Cerebras, Groq, NVIDIA, and GitHub), and automatically selects the first available provider based on your environment variables.
## Installation

### Source Code

- Download Version 0.7 or the latest source
- Install from the project root:

```
pip install .
```

### PyPI

- Check the Python Packaging User Guide

```
pip install mytext==0.7
```
## Usage

### CLI

#### Single Run

Executes a one-time text transformation using the provided options and exits immediately after producing the result.

```
mytext \
  --mode="paraphrase" \
  --tone="formal" \
  --text="Can you update me on the project timeline by the end of the day?"
```

#### Loop

Starts an interactive session that repeatedly accepts new text inputs while keeping the same configuration until the process is terminated.

```
mytext \
  --mode="paraphrase" \
  --tone="formal" \
  --loop
```
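The loop behaviour can be approximated in plain Python as a read-transform-print cycle (an illustrative sketch, not MyText's implementation; `transform` stands in for whatever rewriting call you wire in):

```python
def interactive_loop(transform, read=input, write=print):
    # Mirrors what `mytext --loop` does on the CLI (sketch only):
    # keep one fixed configuration and keep accepting new inputs.
    # An empty input ends the session.
    while True:
        text = read()
        if not text:
            break
        write(transform(text))
```

Here `read` and `write` are injectable only so the loop can be driven non-interactively; the CLI simply uses stdin and stdout.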
#### Arguments

| Argument | Description | Default |
|---|---|---|
| `--text` | Text to process (required unless `--loop` is used) | - |
| `--mode` | Text processing mode | `paraphrase` |
| `--tone` | Desired tone of the output text | `neutral` |
| `--provider` | AI provider selection | `auto` |
| `--loop` | Enable interactive loop mode | `false` |
| `--model` | Override the provider's LLM model | - |
| `--version` | Show application version | - |
| `--info` | Show application information | - |
ℹ️ Supported modes: `paraphrase`, `grammar`, `summarize`, `simplify`, `bulletize`, `shorten`, `emojify`

ℹ️ Supported tones: `neutral`, `formal`, `casual`, `friendly`, `professional`, `academic`, `creative`, `biblical`, `viking`, `zen`, `corporate`

ℹ️ Supported providers: `auto`, `ai-studio`, `cloudflare`, `openrouter`, `cerebras`, `groq`, `nvidia`, `github`
### Library

You can also use MyText directly from Python:

```python
from mytext import run_mytext
from mytext import Mode, Tone, Provider

auth = {"api_key": "YOUR_KEY"}

result = run_mytext(
    text="Let me know if you have any questions after reviewing the attached document.",
    auth=auth,
    mode=Mode.PARAPHRASE,
    tone=Tone.NEUTRAL,
    provider=Provider.AI_STUDIO,
)

print(result["status"], result["message"])
```
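The call returns a plain dict whose `status` and `message` keys appear in the example above. A minimal sketch of acting on it (treating `status` as a truthy success flag is an assumption, not documented behaviour):

```python
def handle_result(result):
    # `result` is the dict returned by run_mytext (see the example above).
    # Interpreting "status" as a boolean-ish success flag is an assumption.
    if result.get("status"):
        return result["message"]
    raise RuntimeError(f"MyText request failed: {result.get('message')}")
```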
#### Parameters

| Parameter | Description | Default |
|---|---|---|
| `text` | Input text to process | - |
| `auth` | Authentication parameters for the provider | - |
| `mode` | Text processing mode | `Mode.PARAPHRASE` |
| `tone` | Desired tone of the output text | `Tone.NEUTRAL` |
| `provider` | AI provider | `Provider.AI_STUDIO` |
| `model` | Override the provider's LLM model | `None` |
## Supported Providers

MyText automatically detects which providers are available based on environment variables.
Each provider has a default model. You may optionally override it using either the CLI `--model` argument or a `*_MODEL` environment variable.
| Provider | Required Environment Variables | Default Model | Optional Model Override |
|---|---|---|---|
| AI Studio | `AI_STUDIO_API_KEY` | `gemma-3-1b-it` | `AI_STUDIO_MODEL` |
| Cloudflare | `CLOUDFLARE_API_KEY`, `CLOUDFLARE_ACCOUNT_ID` | `meta/llama-3-8b-instruct` | `CLOUDFLARE_MODEL` |
| OpenRouter | `OPENROUTER_API_KEY` | `openai/gpt-oss-20b:free` | `OPENROUTER_MODEL` |
| Cerebras | `CEREBRAS_API_KEY` | `llama3.1-8b` | `CEREBRAS_MODEL` |
| Groq | `GROQ_API_KEY` | `openai/gpt-oss-20b` | `GROQ_MODEL` |
| NVIDIA | `NVIDIA_API_KEY` | `meta/llama-3.1-8b-instruct` | `NVIDIA_MODEL` |
| GitHub | `GITHUB_API_KEY` | `openai/gpt-4o-mini` | `GITHUB_MODEL` |
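As an illustration of the auto-detection rule above, provider selection can be sketched as a first-match scan over each provider's required variables (a hedged sketch, not MyText's actual code; the mapping is taken from the table above):

```python
import os

# Order matters: with provider "auto", earlier entries win.
PROVIDER_ENV_VARS = {
    "ai-studio": ["AI_STUDIO_API_KEY"],
    "cloudflare": ["CLOUDFLARE_API_KEY", "CLOUDFLARE_ACCOUNT_ID"],
    "openrouter": ["OPENROUTER_API_KEY"],
    "cerebras": ["CEREBRAS_API_KEY"],
    "groq": ["GROQ_API_KEY"],
    "nvidia": ["NVIDIA_API_KEY"],
    "github": ["GITHUB_API_KEY"],
}

def first_available_provider(env=os.environ):
    # Return the first provider whose required variables are all set.
    for name, required in PROVIDER_ENV_VARS.items():
        if all(env.get(var) for var in required):
            return name
    return None
```

With `--provider=auto`, MyText would then fall through this order until a fully configured provider is found.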
## Configuration Resolution Priority

MyText supports multiple configuration sources (CLI arguments, environment variables, and built-in defaults).
When resolving any configurable parameter (e.g., the model), MyText follows this priority order:

1. CLI argument (highest priority)
2. Corresponding environment variable
3. Built-in default value (lowest priority)
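The priority order above can be sketched in a few lines (illustrative only; `resolve_setting` is a hypothetical helper, not part of MyText's API):

```python
import os

def resolve_setting(cli_value, env_var, default, env=os.environ):
    # Priority: 1) explicit CLI argument, 2) environment variable,
    # 3) built-in default (lowest).
    if cli_value is not None:
        return cli_value
    value = env.get(env_var)
    return value if value else default
```

For example, the effective Groq model would resolve as `resolve_setting(args.model, "GROQ_MODEL", "openai/gpt-oss-20b")`.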
## Issues & Bug Reports

Just file an issue describing the problem. We'll check it as soon as possible!

- Please complete the issue template
## Show Your Support

### Star This Repo

Give a ⭐️ if this project helped you!

### Donate to Our Project
- Bitcoin: `1KtNLEEeUbTEK9PdN6Ya3ZAKXaqoKUuxCy`
- Ethereum: `0xcD4Db18B6664A9662123D4307B074aE968535388`
- Litecoin: `Ldnz5gMcEeV8BAdsyf8FstWDC6uyYR6pgZ`
- Doge: `DDUnKpFQbBqLpFVZ9DfuVysBdr249HxVDh`
- Tron: `TCZxzPZLcJHr2qR3uPUB1tXB6L3FDSSAx7`
- Ripple: `rN7ZuRG7HDGHR5nof8nu5LrsbmSB61V1qq`
- Binance Coin: `bnb1zglwcf0ac3d0s2f6ck5kgwvcru4tlctt4p5qef`
- Tether: `0xcD4Db18B6664A9662123D4307B074aE968535388`
- Dash: `Xd3Yn2qZJ7VE8nbKw2fS98aLxR5M6WUU3s`
- Stellar: `GALPOLPISRHIYHLQER2TLJRGUSZH52RYDK6C3HIU4PSMNAV65Q36EGNL`
- Zilliqa: `zil1knmz8zj88cf0exr2ry7nav9elehxfcgqu3c5e5`
- Coffeete
## Changelog

All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
### Unreleased

### 0.7 - 2026-04-24

#### Added

- `emojify` mode
- `biblical` tone
- `viking` tone
- `zen` tone
- `corporate` tone
- Tone hint

#### Changed

- `README.md` updated
- Cerebras default model changed to `llama3.1-8b`
- Instructions modified
### 0.6 - 2026-03-11

#### Added

- GitHub provider

#### Changed

- CLI functions moved to `cli.py`
- CLI messages updated
- CLI modified
- OpenRouter default model changed to `openai/gpt-oss-20b:free`
- Test system modified
- `README.md` updated
### 0.5 - 2026-02-18

#### Added

- `--provider` argument
- `--model` argument
- `_load_model_from_env` function

#### Changed

- `model` parameter added to `run_mytext` function
- AI Studio default model changed to `gemma-3-1b-it`
- OpenRouter default model changed to `google/gemma-3-27b-it:free`
- Test system modified
- `README.md` updated
### 0.4 - 2025-12-25

#### Added

- Groq provider
- NVIDIA provider
- `--loop` argument

#### Changed

- Test system modified
- `README.md` updated
### 0.3 - 2025-12-17

#### Added

- OpenRouter provider
- Cerebras provider

#### Changed

- Test system modified
- `README.md` updated
- AI Studio main model changed to `gemini-2.5-flash`
- AI Studio fallback model changed to `gemma-3-1b-it`
- Providers moved to `providers.py`
### 0.2 - 2025-12-05

#### Added

- Logo
- `summarize` mode
- `simplify` mode
- `bulletize` mode
- `shorten` mode

#### Changed

- `README.md` updated
- Cloudflare fallback model changed to `meta/llama-3.1-8b-instruct-fast`
- Model switching modified
### 0.1 - 2025-11-26

#### Added

- `run_mytext` function
- AI Studio provider
- Cloudflare provider
- `--mode` argument
- `--tone` argument
## Download files
### File details: mytext-0.7.tar.gz

#### File metadata

- Download URL: mytext-0.7.tar.gz
- Upload date:
- Size: 19.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.4

#### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `10b6fe7406ed2cbd2c3ab9d1dac067867559c207d3fecf1d299f8273d031fb41` |
| MD5 | `855feaef207333f423730da38b2e0044` |
| BLAKE2b-256 | `694da48918d712c736e797f655ae6d2c15bac0613751abee1a8b9935db0b0241` |
### File details: mytext-0.7-py3-none-any.whl

#### File metadata

- Download URL: mytext-0.7-py3-none-any.whl
- Upload date:
- Size: 14.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.4

#### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `d15789ec6dffe2c6f79257da937ac4879453f57d62533de6a110499582035158` |
| MD5 | `6ae7d8a65e15fd46fe6b90bd780a9e48` |
| BLAKE2b-256 | `6962547c5c6deac1d25d0054e04ffb42aabd64039934e04360a30c5cff19bfdb` |