
Interact with AI without an API key


python-tgpt


AI for all

>>> import tgpt
>>> bot = tgpt.TGPT()
>>> bot.chat('Hello there')
"  Hello! It's nice to meet you. Is there something I can help you with or would you like to chat?"
>>> 

This project allows you to interact with AI (LLaMA) without an API key.

The name python-tgpt is inherited from its parent project tgpt, which runs on Golang.

Prerequisite

Installation and usage

Installation

Pick any of the following ways to get started.

  1. From PyPI:
pip install python-tgpt
  2. Directly from source:
pip install git+https://github.com/Simatwa/python-tgpt.git
  3. Clone and install:
git clone https://github.com/Simatwa/python-tgpt.git
cd python-tgpt
pip install .

Usage

This package features a ready-to-use command-line interface.

  • Quick response: python -m tgpt generate "<Your prompt>"

  • Interactive mode: python -m tgpt interactive "<Kickoff prompt (optional)>"

Instead of python -m tgpt, you can simply use tgpt.

As of version 0.0.6, generate is the default command, so something like this will still work: tgpt "<Your prompt>"
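
For instance, the following invocations are equivalent ways of getting a one-off response, followed by starting an interactive session (the prompt text here is just illustrative):

python -m tgpt generate "Explain what a context manager is"
tgpt "Explain what a context manager is"
python -m tgpt interactive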

Developer Docs

  1. Generate a quick response
from tgpt import TGPT
bot = TGPT()
resp = bot.chat('<Your prompt>')
print(resp)
# Output: How may I help you.
  2. Get back the whole response
from tgpt import TGPT
bot = TGPT()
resp = bot.ask('<Your Prompt>')
print(resp)
# Output
"""
{'completion': "I'm so excited to share with you the incredible experiences...", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwJ2', 'exception': None}
"""

Stream Response

Just add the stream parameter with the value True.

  1. Generated text only
from tgpt import TGPT
bot = TGPT()
resp = bot.chat('<Your prompt>', stream=True)
for value in resp:
    print(value)
# output
"""
How may
How may I help 
How may I help you
How may I help you today?
"""
  2. Whole response
from tgpt import TGPT
bot = TGPT()
resp = bot.ask('<Your Prompt>', stream=True)
for value in resp:
    print(value)
# Output
"""
{'completion': "I'm so", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}

{'completion': "I'm so excited to share with.", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}

{'completion': "I'm so excited to share with you the incredible ", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}

{'completion': "I'm so excited to share with you the incredible experiences...", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
"""
  • To get better feedback, you can make use of optimizers by passing the optimizer parameter with the value code or system_command:
from tgpt import TGPT
bot = TGPT()
resp = bot.ask('<Your Prompt>', optimizer='code')
print(resp)

Note: As of v0.0.7, conversational chatting is supported (experimental): bot = tgpt.TGPT(is_conversation=True). On the console, just append the --conversation flag. NB: It tends to fail after relatively lengthy chats.
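
A minimal sketch of the conversational mode (the prompts are illustrative):

import tgpt

# keeps track of previous prompts and responses (experimental as of v0.0.7)
bot = tgpt.TGPT(is_conversation=True)
print(bot.chat('My favourite language is Python'))
print(bot.chat('What is my favourite language?'))  # should recall the earlier turn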

Acknowledgements

  1. tgpt
  2. You

