Interact with AI without API key
Project description
python-tgpt
AI for all
>>> import tgpt
>>> bot = tgpt.TGPT()
>>> bot.chat('Hello there')
" Hello! It's nice to meet you. Is there something I can help you with or would you like to chat?"
>>>
This project enables seamless interaction with the LLaMA AI model without requiring an API key.
The name python-tgpt draws inspiration from its parent project tgpt, which is written in Go. Through this Python adaptation, users can effortlessly engage with LLaMA's capabilities, fostering a smoother AI interaction experience.
Prerequisites
Installation and Usage
Installation
Choose one of the following methods to get started.
- From PyPI:
pip install --upgrade python-tgpt
- Directly from the source:
pip install git+https://github.com/Simatwa/python-tgpt.git
- Clone and Install:
git clone https://github.com/Simatwa/python-tgpt.git
cd python-tgpt
pip install .
Usage
This package offers a convenient command-line interface.
- For a quick response:
python -m tgpt generate "<Your prompt>"
- For interactive mode:
python -m tgpt interactive "<Kickoff prompt (though not mandatory)>"
You can also simply use tgpt instead of python -m tgpt.
Starting from version 0.0.6, generate is the default command. Therefore, something like this will still work: tgpt "<Your prompt>".
Developer Docs
- Generate a quick response
from tgpt import TGPT
bot = TGPT()
resp = bot.chat('<Your prompt>')
print(resp)
# Output : How may I help you.
- Get back whole response
from tgpt import TGPT
bot = TGPT()
resp = bot.ask('<Your Prompt>')
print(resp)
# Output
"""
{'completion': "I'm so excited to share with you the incredible experiences...", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwJ2', 'exception': None}
"""
Stream Response
Just add the parameter stream with the value True.
- Text Generated only
from tgpt import TGPT
bot = TGPT()
resp = bot.chat('<Your prompt>', stream=True)
for value in resp:
print(value)
# output
"""
How may
How may I help
How may I help you
How may I help you today?
"""
- Whole Response
from tgpt import TGPT
bot = TGPT()
resp = bot.ask('<Your Prompt>', stream=True)
for value in resp:
print(value)
# Output
"""
{'completion': "I'm so", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
{'completion': "I'm so excited to share with.", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
{'completion': "I'm so excited to share with you the incredible ", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
{'completion': "I'm so excited to share with you the incredible experiences...", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
"""
- To get better feedback, you can make use of optimizers using the parameter optimizer with values code or system_command:
from tgpt import TGPT
bot = TGPT()
resp = bot.ask('<Your Prompt>', optimizer='code')
print(resp)
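The same call works with the other optimizer value mentioned above, system_command; the prompt here is purely illustrative:
from tgpt import TGPT
bot = TGPT()
resp = bot.ask('<Your Prompt>', optimizer='system_command')
print(resp)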
Note: Starting from v0.0.7, we've introduced an experimental conversational chatting feature:
bot = tgpt.TGPT(is_conversation=True)
When interacting via the console, simply append the --conversation flag. Customize the history_offset parameter during the initialization of TGPT, and use the --history-offset flag at the console entry point for the same purpose.
This conversational mode opens up a more interactive and engaging experience, allowing greater control over the chat history's handling for a more tailored conversation.
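A minimal sketch of conversational use in a script. Note that the exact behaviour of history_offset is an assumption here (it is only named as a TGPT parameter above), and the value passed is purely illustrative:
import tgpt
# is_conversation=True keeps prior exchanges as context (experimental, v0.0.7+).
# history_offset is assumed to limit how much chat history is retained; value is illustrative.
bot = tgpt.TGPT(is_conversation=True, history_offset=10250)
print(bot.chat('My name is Smartwa.'))
print(bot.chat('What is my name?'))  # should be answered using the conversation history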
Acknowledgements
- tgpt
- You