# python-tgpt

Interact with AI without an API key.
```python
>>> from pytgpt.leo import LEO
>>> bot = LEO()
>>> bot.chat('Hello there')
" Hello! It's nice to meet you. Is there something I can help you with or would you like to chat?"
>>>
```
This project enables seamless interaction with over 45 free LLMs without requiring an API Key.
The name python-tgpt draws inspiration from its parent project tgpt, which operates on Golang. Through this Python adaptation, users can effortlessly engage with a number of free LLMs available, fostering a smoother AI interaction experience.
## Features
- 🗨️ Enhanced conversational chat experience
- 💾 Capability to save prompts and responses (Conversation)
- 🔄 Ability to load previous conversations
- ⌨️ Command-line interface
- 🐍 Python package
- 🌊 Stream and non-stream response
- 🚀 Ready to use (No API key required)
- ⛓️ Chained requests via proxy
- 🤖 Pass awesome-chatgpt prompts easily
- 🧠 Multiple LLM providers - 45+
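The proxy feature typically expects a requests-style proxies mapping. A minimal, hypothetical sketch (the `proxies` keyword name is an assumption — check the class signature for the exact parameter):

```python
# Hypothetical sketch: a requests-style proxies mapping, as commonly used by
# HTTP clients. The exact keyword accepted by the provider classes is an
# assumption -- verify against the class signature before relying on it.
proxies = {
    "http": "http://127.0.0.1:8080",
    "https": "http://127.0.0.1:8080",
}

# bot = LEO(proxies=proxies)  # assumed parameter name, shown for illustration
print(proxies["https"])
```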
## Providers
These are simply the hosts of the LLMs, which include:
- Leo - Brave
- FakeOpen
- Koboldai
- OpenGPTs
- OpenAI (API key required)
- WebChatGPT - OpenAI (Session ID required)
- Bard - Google (Session ID required)
41+ other providers proudly offered by gpt4free:
- AiChatOnline
- Aura
- Bard
- Bing
- ChatBase
- ChatForAi
- Chatgpt4Online
- ChatgptAi
- [ChatgptDemo](https://chat.chatgptdemo.net)
- ChatgptNext
- Chatxyz
- DeepInfra
- FakeGpt
- FreeChatgpt
- GPTalk
- GeekGpt
- GeminiProChat
- Gpt6
- GptChatly
- GptForLove
- GptGo
- GptTalkRu
- Hashnode
- HuggingChat
- Koala
- Liaobots
- Llama2
- MyShell
- OnlineGpt
- OpenaiChat
- PerplexityAi
- Phind
- Pi
- Poe
- Raycast
- TalkAi
- Theb
- ThebApi
- You
- Yqcloud
All providers (including those not currently working):
AItianhu, AItianhuSpace, Acytoo, AiAsk, AiChatOnline, AiChatting, AiService, Aibn, Aichat, Ails, Aivvm, AsyncGeneratorProvider, AsyncProvider, Aura, Bard, BaseProvider, Berlin, Bestim, Bing, ChatAiGpt, ChatAnywhere, ChatBase, ChatForAi, Chatgpt4Online, ChatgptAi, ChatgptDemo, ChatgptDemoAi, ChatgptDuo, ChatgptFree, ChatgptLogin, ChatgptNext, ChatgptX, Chatxyz, CodeLinkAva, CreateImagesProvider, Cromicle, DeepInfra, DfeHub, EasyChat, Equing, FakeGpt, FastGpt, Forefront, FreeChatgpt, FreeGpt, GPTalk, GeekGpt, GeminiProChat, GetGpt, Gpt6, GptChatly, GptForLove, GptGo, GptGod, GptTalkRu, H2o, Hashnode, HuggingChat, Koala, Komo, Liaobots, Llama2, Lockchat, MikuChat, MyShell, Myshell, OnlineGpt, Opchatgpts, OpenAssistant, OpenaiChat, PerplexityAi, Phind, Pi, Poe, Raycast, RetryProvider, TalkAi, Theb, ThebApi, V50, Vercel, Vitalentum, Wewordle, Wuguokai, Ylokh, You, Yqcloud
## Prerequisites
- Python >= 3.9 (optional when using prebuilt binaries)
## Installation and Usage

### Installation

Download binaries for your system from here.

Alternatively, you can install the non-binary versions (recommended). Choose one of the following methods to get started.

- From PyPI:

  ```bash
  pip install --upgrade python-tgpt
  ```

- Directly from the source:

  ```bash
  pip install git+https://github.com/Simatwa/python-tgpt.git
  ```

- Clone and install:

  ```bash
  git clone https://github.com/Simatwa/python-tgpt.git
  cd python-tgpt
  pip install .
  ```
### Usage

This package offers a convenient command-line interface.

> **Note**: `Aura` is the default provider.

- For a quick response:

  ```bash
  python -m pytgpt generate "<Your prompt>"
  ```

- For interactive mode:

  ```bash
  python -m pytgpt interactive "<Kickoff prompt (though not mandatory)>"
  ```

Make use of the `--provider` flag followed by the provider name of your choice, e.g. `--provider koboldai`.

You can also simply use `pytgpt` instead of `python -m pytgpt`.

Starting from version 0.2.7, running `$ pytgpt` without any other command or option will automatically enter interactive mode. Otherwise, you'll need to explicitly declare the desired action, for example by running `$ pytgpt generate`.
## Developer Docs

- Generate a quick response

```python
from pytgpt.leo import LEO

bot = LEO()
resp = bot.chat('<Your prompt>')
print(resp)
# Output : How may I help you.
```

- Get back the whole response

```python
from pytgpt.leo import LEO

bot = LEO()
resp = bot.ask('<Your Prompt>')
print(resp)
# Output
"""
{'completion': "I'm so excited to share with you the incredible experiences...", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwJ2', 'exception': None}
"""
```
#### Stream Response

Just add the parameter `stream` with value `True`.

- Text generated only

```python
from pytgpt.leo import LEO

bot = LEO()
resp = bot.chat('<Your prompt>', stream=True)
for value in resp:
    print(value)
# output
"""
How may
How may I help
How may I help you
How may I help you today?
"""
```
- Whole response

```python
from pytgpt.leo import LEO

bot = LEO()
resp = bot.ask('<Your Prompt>', stream=True)
for value in resp:
    print(value)
# Output
"""
{'completion': "I'm so", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
{'completion': "I'm so excited to share with.", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
{'completion': "I'm so excited to share with you the incredible ", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
{'completion': "I'm so excited to share with you the incredible experiences...", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
"""
```
> **Note**: All providers share the same class methods.
#### Openai

```python
import pytgpt.openai as openai

bot = openai.OPENAI("<OPENAI-API-KEY>")
print(bot.chat("<Your-prompt>"))
```

#### Koboldai

```python
import pytgpt.koboldai as koboldai

bot = koboldai.KOBOLDAI()
print(bot.chat("<Your-prompt>"))
```

#### Fakeopen

```python
import pytgpt.fakeopen as fakeopen

bot = fakeopen.FAKEOPEN()
print(bot.chat("<Your-prompt>"))
```

#### Opengpt

```python
import pytgpt.opengpt as opengpt

bot = opengpt.OPENGPT()
print(bot.chat("<Your-prompt>"))
```

#### Bard

```python
import pytgpt.bard as bard

bot = bard.BARD('<Path-to-bard.google.com.cookies.json>')
print(bot.chat("<Your-prompt>"))
```

#### Gpt4free providers

```python
import pytgpt.gpt4free as gpt4free

bot = gpt4free.GPT4FREE(provider="Aura")
print(bot.chat("<Your-prompt>"))
```
To obtain more tailored responses, consider utilizing optimizers via the `optimizer` parameter. Its values can be set to either `code` or `system_command`.

```python
from pytgpt.leo import LEO

bot = LEO()
resp = bot.ask('<Your Prompt>', optimizer='code')
print(resp)
```
> **Note**: Commencing from v0.1.0, the default mode of interaction is conversational. This mode enhances the interactive experience, offering better control over the chat history. By associating previous prompts and responses, it tailors conversations for a more engaging experience.

You can still disable the mode:

```python
bot = koboldai.KOBOLDAI(is_conversation=False)
```

Utilize the `--disable-conversation` flag in the console to achieve the same functionality.
> **Warning**: Bard and WebChatGPT handle context automatically, so the `is_conversation` parameter is unnecessary and not required when initializing those classes. Also be informed that the majority of providers offered by gpt4free require Google Chrome in order to function.
## Advanced Usage of Placeholders

The generate functionality has been enhanced starting from v0.3.0 to enable comprehensive utilization of the `--with-copied` option and support for accepting piped inputs. This improvement introduces placeholders, offering dynamic values for more versatile interactions.

| Placeholder | Represents |
|---|---|
| `{{stream}}` | The piped input |
| `{{copied}}` | The last copied text |

This feature is particularly beneficial for intricate operations. For example:

```bash
git diff | pytgpt generate "Here is a diff file: {{stream}} Make a concise commit message from it, aligning with my commit message history: {{copied}}" -p fakeopen --with-copied --shell --new
```

In this illustration, `{{stream}}` denotes the result of the `git diff` operation, while `{{copied}}` signifies the content copied from the output of the `git log` command.
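The substitution itself is simple string templating. A rough stdlib-only illustration of how such placeholders can be filled (this is not the actual pytgpt implementation, and the sample values are invented):

```python
# Illustrative only: fill {{stream}} and {{copied}} in a prompt template.
prompt = "Here is a diff file: {{stream}} Make a commit message like: {{copied}}"

piped_input = "diff --git a/app.py b/app.py"  # what would arrive via stdin
copied_text = "fix: handle empty input"       # what would come from the clipboard

filled = prompt.replace("{{stream}}", piped_input).replace("{{copied}}", copied_text)
print(filled)
```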
For more usage info run `pytgpt --help`:

```
Usage: pytgpt [OPTIONS] COMMAND [ARGS]...

Options:
  -v, --version  Show the version and exit.
  -h, --help     Show this message and exit.

Commands:
  awesome      Perform CRUD operations on awesome-prompts
  generate     Generate a quick response with AI
  gpt4free     Discover gpt4free models, providers etc
  interactive  Chat with AI interactively (Default)
  utils        Utility endpoint for pytgpt
  webchatgpt   Reverse Engineered ChatGPT Web-Version
```