Interact with AI without API key
python-tgpt
AI for all
>>> import tgpt
>>> bot = tgpt.TGPT()
>>> bot.chat('Hello there')
" Hello! It's nice to meet you. Is there something I can help you with or would you like to chat?"
>>>
This project allows you to interact with AI (LLaMA) without an API key.
The name python-tgpt is inherited from its parent project tgpt, which is written in Go.
Prerequisite
Installation and usage
Installation
Pick any of the following ways to get started.
- From PyPI:
pip install python-tgpt
- Direct from source
pip install git+https://github.com/Simatwa/python-tgpt.git
- Clone and Install
git clone https://github.com/Simatwa/python-tgpt.git
cd python-tgpt
pip install .
Usage
This package features a ready-to-use command-line interface.
- Quick response
python -m tgpt generate "<Your prompt>"
- Interactive mode
python -m tgpt interactive "<Kickoff prompt (optional)>"
Instead of python -m tgpt, you can simply use tgpt.
As of version 0.0.6, generate is the default command, so the following also works: tgpt "<Your prompt>"
Developer Docs
- Generate a quick response
from tgpt import TGPT
bot = TGPT()
resp = bot.chat('<Your prompt>')
print(resp)
# Output : How may I help you.
- Get back the whole response
from tgpt import TGPT
bot = TGPT()
resp = bot.ask('<Your Prompt>')
print(resp)
# Output
"""
{'completion': "I'm so excited to share with you the incredible experiences...", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwJ2', 'exception': None}
"""
Stream Response
Just add the parameter stream with the value True.
- Generated text only
from tgpt import TGPT
bot = TGPT()
resp = bot.chat('<Your prompt>', stream=True)
for value in resp:
print(value)
# output
"""
How may
How may I help
How may I help you
How may I help you today?
"""
- Whole Response
from tgpt import TGPT
bot = TGPT()
resp = bot.ask('<Your Prompt>', stream=True)
for value in resp:
print(value)
# Output
"""
{'completion': "I'm so", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
{'completion': "I'm so excited to share with.", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
{'completion': "I'm so excited to share with you the incredible ", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
{'completion': "I'm so excited to share with you the incredible experiences...", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
"""
- To get better feedback, you can make use of optimizers using the parameter optimizer with the values code or system_command:
from tgpt import TGPT
bot = TGPT()
resp = bot.ask('<Your Prompt>', optimizer='code')
print(resp)
Note: As of v0.0.7, conversational chatting is supported (experimental):
bot = tgpt.TGPT(is_conversation=True)
At the console, just append the flag --conversation.
NB: This tends to fail after a relatively lengthy chat.
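For instance, a minimal conversational session could look like the sketch below; it assumes chat() keeps the preceding turns in context when is_conversation=True, per the experimental feature noted above.
import tgpt
# Experimental conversational mode (v0.0.7): earlier turns stay in context
bot = tgpt.TGPT(is_conversation=True)
print(bot.chat('My name is Alan.'))
print(bot.chat('What did I say my name was?'))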
Acknowledgements
- tgpt
- You