python-tgpt

Interact with AI without an API key
```python
>>> import tgpt
>>> bot = tgpt.TGPT()
>>> bot.chat('Hello there')
" Hello! It's nice to meet you. Is there something I can help you with or would you like to chat?"
```
This project enables seamless interaction with the LLaMA AI model without requiring an API key.
The name python-tgpt draws inspiration from its parent project, tgpt, which is written in Golang. This Python adaptation lets users effortlessly engage with LLaMA (aliased Leo by Brave) for a smoother AI interaction experience.
Features
- 🗨️ Enhanced conversational chat experience
- 💾 Capability to save prompts and responses (Conversation)
- 🔄 Ability to load previous conversations
- ⌨️ Command-line interface
- 🐍 Python package
- 🌊 Stream and non-stream response
- 🚀 Ready to use (No API key required)
- ⛓️ Chained requests via proxy
- 🤖 Pass awesome-chatgpt prompts easily
Prerequisites
- Python >= 3.9 (optional if you use the pre-built binaries)
Installation and Usage
Installation
Download binaries for your system from here.
Alternatively, you can install the Python package itself. (Recommended)
Choose one of the following methods to get started.
- From PyPI:

  ```shell
  pip install --upgrade python-tgpt
  ```

- Directly from the source:

  ```shell
  pip install git+https://github.com/Simatwa/python-tgpt.git
  ```

- Clone and install:

  ```shell
  git clone https://github.com/Simatwa/python-tgpt.git
  cd python-tgpt
  pip install .
  ```
Usage
This package offers a convenient command-line interface.
- For a quick response:

  ```shell
  python -m tgpt generate "<Your prompt>"
  ```

- For interactive mode:

  ```shell
  python -m tgpt interactive "<Kickoff prompt (though not mandatory)>"
  ```
You can also simply use tgpt instead of python -m tgpt.
Starting from version 0.1.2, `generate` is the default command when you supply a prompt, and `interactive` takes over when you don't. Therefore, `tgpt "<Your prompt>"` will generate a response and exit, while `tgpt` on its own will fire up an interactive prompt.
Developer Docs
- Generate a quick response:

  ```python
  from tgpt import TGPT

  bot = TGPT()
  resp = bot.chat('<Your prompt>')
  print(resp)
  # Output : How may I help you.
  ```
- Get back the whole response:

  ```python
  from tgpt import TGPT

  bot = TGPT()
  resp = bot.ask('<Your Prompt>')
  print(resp)
  ```

  Output:

  ```python
  {'completion': "I'm so excited to share with you the incredible experiences...", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwJ2', 'exception': None}
  ```
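Since ask() returns a plain dictionary like the one above, the generated text can be pulled out by key. A minimal sketch, using that sample payload in place of a live call (so no network access or running bot is needed here):

```python
# Sample payload copied from the output above; in practice this would come
# from resp = bot.ask('<Your Prompt>').
resp = {
    'completion': "I'm so excited to share with you the incredible experiences...",
    'stop_reason': None,
    'truncated': False,
    'stop': None,
    'model': 'llama-2-13b-chat',
    'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwJ2',
    'exception': None,
}

# The generated text lives under 'completion'; 'truncated' indicates whether
# the model was cut off, and 'exception' carries any provider-side error.
if resp['exception'] is not None:
    raise RuntimeError(resp['exception'])
text = resp['completion']
print(text)
```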
Stream Response
Just add the parameter stream with the value True.
- Text generated only:

  ```python
  from tgpt import TGPT

  bot = TGPT()
  resp = bot.chat('<Your prompt>', stream=True)
  for value in resp:
      print(value)
  ```

  Output:

  ```
  How may
  How may I help
  How may I help you
  How may I help you today?
  ```
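As the output above shows, each streamed chunk contains the full text generated so far, not just the new words. For a typewriter-style display, print only the suffix that has not been shown yet. A small sketch, with the chunks simulated from the sample output above rather than taken from a live call:

```python
# Simulated stream: chat('<Your prompt>', stream=True) yields cumulative
# strings like these (copied from the sample output above).
chunks = [
    "How may",
    "How may I help",
    "How may I help you",
    "How may I help you today?",
]

printed = 0
for chunk in chunks:
    print(chunk[printed:], end="", flush=True)  # emit only the new suffix
    printed = len(chunk)
print()  # final newline
```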
- Whole response:

  ```python
  from tgpt import TGPT

  bot = TGPT()
  resp = bot.ask('<Your Prompt>', stream=True)
  for value in resp:
      print(value)
  ```

  Output:

  ```python
  {'completion': "I'm so", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
  {'completion': "I'm so excited to share with.", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
  {'completion': "I'm so excited to share with you the incredible ", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
  {'completion': "I'm so excited to share with you the incredible experiences...", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
  ```
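In stream mode each yielded dict again carries the cumulative completion, so the final chunk holds the complete text. A minimal consumer sketch, with the stream simulated from the sample output above (keys taken from that output; no live call is made):

```python
# Simulated stream of response dicts, values copied from the sample output
# above; in practice this would come from bot.ask('<Your Prompt>', stream=True).
stream = [
    {'completion': "I'm so", 'exception': None},
    {'completion': "I'm so excited to share with you the incredible ",
     'exception': None},
    {'completion': "I'm so excited to share with you the incredible experiences...",
     'exception': None},
]

final_text = None
for chunk in stream:
    if chunk.get('exception') is not None:  # surface provider-side errors
        raise RuntimeError(chunk['exception'])
    final_text = chunk['completion']        # later chunks supersede earlier ones

print(final_text)
```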
To obtain more tailored responses, consider utilizing optimizers via the optimizer parameter. Its value can be set to either code or system_command.

```python
from tgpt import TGPT

bot = TGPT()
resp = bot.ask('<Your Prompt>', optimizer='code')
print(resp)
```
Note: Starting from v0.1.0, the default mode of interaction is conversational. This mode enhances the interactive experience by associating previous prompts and responses, offering better control over the chat history and tailoring conversations for a more engaging experience.

You can still disable this mode:

```python
bot = tgpt.TGPT(is_conversation=False)
```

Use the --disable-conversation flag in the console to achieve the same functionality.
Acknowledgements
- tgpt
- You
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file python-tgpt-0.1.3.tar.gz.
File metadata
- Download URL: python-tgpt-0.1.3.tar.gz
- Upload date:
- Size: 18.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.18
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | e3560812434c6704d1c7d3118eb28895951f3bc9b98e465b9384b09fcee06503 |
| MD5 | 8e6aa0cd84670632a34e0ce798302638 |
| BLAKE2b-256 | b0fcbab8b964281d2462fdc199f084c2fd7be02af644dd3b3b7e84fe05508a04 |
File details
Details for the file python_tgpt-0.1.3-py3-none-any.whl.
File metadata
- Download URL: python_tgpt-0.1.3-py3-none-any.whl
- Upload date:
- Size: 18.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.18
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | c6440a43f184d552d29efb0578133b03d009d3581078f2b08c8965e81504355d |
| MD5 | 3e459848af65fb2e210a745950230bf6 |
| BLAKE2b-256 | c3a592bf35c34277adf9535ccdb7216653d4712c8422a4600aa5ae1f74b6dbb6 |