
Interact with AI without API key



python-tgpt

>>> import pytgpt.phind as phind
>>> bot = phind.PHIND()
>>> bot.chat('hello there')
'Hello! How can I assist you today?'

This project enables seamless interaction with over 45 free LLM providers without requiring an API Key.

The name python-tgpt draws inspiration from its parent project tgpt, which operates on Golang. Through this Python adaptation, users can effortlessly engage with a number of free LLMs available, fostering a smoother AI interaction experience.

Features

  • 🗨️ Enhanced conversational chat experience
  • 💾 Capability to save prompts and responses (Conversation)
  • 🔄 Ability to load previous conversations
  • ⌨️ Command-line interface
  • 🐍 Python package
  • 🌊 Stream and non-stream response
  • 🚀 Ready to use (No API key required)
  • ⛓️ Chained requests via proxy
  • 🤖 Pass awesome-chatgpt prompts easily
  • 🧠 Multiple LLM providers - 45+
  • 🎯 Customizable script generation and execution

Providers

These are simply the hosts of the LLMs, which include:

  1. Leo - Brave
  2. Koboldai
  3. OpenGPTs
  4. OpenAI (API key required)
  5. WebChatGPT - OpenAI (Session ID required)
  6. Bard - Google (Session ID required)
  7. Phind - default
  8. Llama2
  9. Blackboxai

41+ Other models proudly offered by gpt4free.

All models (including ones that are not working):

  1. AItianhu
  2. AItianhuSpace
  3. Acytoo
  4. AiAsk
  5. AiChatOnline
  6. AiChatting
  7. AiService
  8. Aibn
  9. Aichat
  10. Ails
  11. Aivvm
  12. AsyncGeneratorProvider
  13. AsyncProvider
  14. Aura
  15. Bard
  16. BaseProvider
  17. Berlin
  18. Bestim
  19. Bing
  20. ChatAiGpt
  21. ChatAnywhere
  22. ChatBase
  23. ChatForAi
  24. Chatgpt4Online
  25. ChatgptAi
  26. ChatgptDemo
  27. ChatgptDemoAi
  28. ChatgptDuo
  29. ChatgptFree
  30. ChatgptLogin
  31. ChatgptNext
  32. ChatgptX
  33. Chatxyz
  34. CodeLinkAva
  35. CreateImagesProvider
  36. Cromicle
  37. DeepInfra
  38. DfeHub
  39. EasyChat
  40. Equing
  41. FakeGpt
  42. FastGpt
  43. Forefront
  44. FreeChatgpt
  45. FreeGpt
  46. GPTalk
  47. GeekGpt
  48. GeminiProChat
  49. GetGpt
  50. Gpt6
  51. GptChatly
  52. GptForLove
  53. GptGo
  54. GptGod
  55. GptTalkRu
  56. H2o
  57. Hashnode
  58. HuggingChat
  59. Koala
  60. Komo
  61. Liaobots
  62. Llama2
  63. Lockchat
  64. MikuChat
  65. MyShell
  66. Myshell
  67. OnlineGpt
  68. Opchatgpts
  69. OpenAssistant
  70. OpenaiChat
  71. PerplexityAi
  72. Phind
  73. Pi
  74. Poe
  75. Raycast
  76. RetryProvider
  77. TalkAi
  78. Theb
  79. ThebApi
  80. V50
  81. Vercel
  82. Vitalentum
  83. Wewordle
  84. Wuguokai
  85. Ylokh
  86. You
  87. Yqcloud

Prerequisites

Installation and Usage

Installation

Download binaries for your system from here.

Alternatively, you can install the non-binary package from PyPI. (Recommended)

  1. Developers:

    pip install "python-tgpt"
    
  2. Commandline:

    pip install "python-tgpt[cli]"
    
  3. Full installation:

    pip install "python-tgpt[all]"
    

Usage

This package offers a convenient command-line interface.

Note : phind is the default provider.

  • For a quick response:

    python -m pytgpt generate "<Your prompt>"
    
  • For interactive mode:

    python -m pytgpt interactive "<Kickoff prompt (though not mandatory)>"
    

Use the --provider flag followed by the provider name of your choice, e.g. --provider koboldai.

To list all providers offered by gpt4free, use the following command: pytgpt gpt4free list providers

You can also simply use pytgpt instead of python -m pytgpt.

Starting from version 0.2.7, running $ pytgpt without any other command or option will automatically enter the interactive mode. Otherwise, you'll need to explicitly declare the desired action, for example, by running $ pytgpt generate.

Developer Docs

  1. Generate a quick response
from pytgpt.leo import LEO
bot = LEO()
resp = bot.chat('<Your prompt>')
print(resp)
# Output : How may I help you.
  2. Get back whole response
from pytgpt.leo import LEO
bot = LEO()
resp = bot.ask('<Your Prompt>')
print(resp)
# Output
"""
{'completion': "I'm so excited to share with you the incredible experiences...", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwJ2', 'exception': None}
"""

Stream Response

Just add the parameter stream with value True.

  1. Text Generated only
from pytgpt.leo import LEO
bot = LEO()
resp = bot.chat('<Your prompt>', stream=True)
for value in resp:
    print(value)
# output
"""
How may
How may I help 
How may I help you
How may I help you today?
"""
  2. Whole Response
from pytgpt.leo import LEO
bot = LEO()
resp = bot.ask('<Your Prompt>', stream=True)
for value in resp:
    print(value)
# Output
"""
{'completion': "I'm so", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}

{'completion': "I'm so excited to share with.", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}

{'completion': "I'm so excited to share with you the incredible ", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}

{'completion': "I'm so excited to share with you the incredible experiences...", 'stop_reason': None, 'truncated': False, 'stop': None, 'model': 'llama-2-13b-chat', 'log_id': 'cmpl-3NmRt5A5Djqo2jXtXLBwxx', 'exception': None}
"""
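As the output above shows, each streamed chunk contains the full text generated so far (it is cumulative). If you only want to print the newly added characters, you can track how much you have already seen. A minimal sketch using a simulated stream (no network call; `stream_deltas` is an illustrative helper, not part of pytgpt):

```python
def stream_deltas(chunks):
    """Yield only the newly generated suffix of each cumulative chunk."""
    seen = 0
    for chunk in chunks:
        yield chunk[seen:]
        seen = len(chunk)

# Simulated cumulative stream mirroring the output shown above.
simulated = ["How may", "How may I help", "How may I help you today?"]
print("".join(stream_deltas(simulated)))  # → How may I help you today?
```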

Note : All providers share a common set of class methods.

Openai
import pytgpt.openai as openai
bot = openai.OPENAI("<OPENAI-API-KEY>")
print(bot.chat("<Your-prompt>"))
Koboldai
import pytgpt.koboldai as koboldai
bot = koboldai.KOBOLDAI()
print(bot.chat("<Your-prompt>"))
Fakeopen
import pytgpt.fakeopen as fakeopen
bot = fakeopen.FAKEOPEN()
print(bot.chat("<Your-prompt>"))
Opengpt
import pytgpt.opengpt as opengpt
bot = opengpt.OPENGPT()
print(bot.chat("<Your-prompt>"))
Bard
import pytgpt.bard as bard
bot = bard.BARD('<Path-to-bard.google.com.cookies.json>')
print(bot.chat("<Your-prompt>"))
phind
import pytgpt.phind as phind
bot = phind.PHIND()
print(bot.chat("<Your-prompt>"))
Gpt4free providers
import pytgpt.gpt4free as gpt4free
bot = gpt4free.GPT4FREE(provider="Aura")
print(bot.chat("<Your-prompt>"))
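Because the providers expose the same chat/ask interface, code written against one provider can talk to any of them. A hypothetical, offline sketch (`chat_with` and `StubBot` are illustrative; `StubBot` stands in for a real provider such as phind.PHIND):

```python
def chat_with(bot, prompt):
    """Works with any pytgpt provider, since they all expose .chat()."""
    return bot.chat(prompt)

class StubBot:
    """Offline stand-in for a real provider (e.g. phind.PHIND)."""
    def chat(self, prompt):
        return f"echo: {prompt}"

print(chat_with(StubBot(), "hello there"))  # → echo: hello there
```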

To obtain more tailored responses, consider utilizing optimizers using the optimizer parameter. Its values can be set to either code or system_command.

from pytgpt.leo import LEO
bot = LEO()
resp = bot.ask('<Your Prompt>', optimizer='code')
print(resp)

Note: Commencing from v0.1.0, the default mode of interaction is conversational. This mode enhances the interactive experience, offering better control over the chat history. By associating previous prompts and responses, it tailors conversations for a more engaging experience.

You can still disable the mode:

bot = koboldai.KOBOLDAI(is_conversation=False)

Utilize the --disable-conversation flag in the console to achieve the same functionality.
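To illustrate what conversational mode does conceptually, here is a toy history tracker that chains previous prompts and responses into the next prompt. This is not pytgpt's actual implementation, only a sketch of the idea:

```python
class MiniConversation:
    """Toy illustration of conversational chaining; not pytgpt's internals."""

    def __init__(self, is_conversation=True):
        self.is_conversation = is_conversation
        self.history = []  # list of (prompt, response) pairs

    def build_prompt(self, prompt):
        """Prepend prior exchanges so the model sees the chat history."""
        if not self.is_conversation or not self.history:
            return f"User: {prompt}"
        context = "\n".join(f"User: {p}\nAI: {r}" for p, r in self.history)
        return f"{context}\nUser: {prompt}"

    def record(self, prompt, response):
        if self.is_conversation:
            self.history.append((prompt, response))

conv = MiniConversation()
conv.record("hello", "Hi there!")
print(conv.build_prompt("how are you?"))
```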

Warning : Bard handles conversation context on its own, so the is_conversation parameter is unnecessary and not required when initializing the class. Also be aware that the majority of providers offered by gpt4free require Google Chrome in order to function.

Advanced Usage of Placeholders

The generate functionality has been enhanced starting from v0.3.0 to enable comprehensive utilization of the --with-copied option and support for accepting piped inputs. This improvement introduces placeholders, offering dynamic values for more versatile interactions.

Placeholder Represents
{{stream}} The piped input
{{copied}} The last copied text

This feature is particularly beneficial for intricate operations. For example:

$ git diff | pytgpt generate "Here is a diff file: {{stream}} Make a concise commit message from it, aligning with my commit message history: {{copied}}" -p fakeopen --shell --new

In this illustration, {{stream}} denotes the result of the $ git diff operation, while {{copied}} signifies the content copied from the output of the $ git log command.
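Conceptually, the substitution amounts to plain string replacement: {{stream}} is filled from the piped input and {{copied}} from the clipboard. A hypothetical helper showing the idea (`fill_placeholders` is illustrative, not pytgpt's API):

```python
def fill_placeholders(prompt, stream_text="", copied_text=""):
    """Substitute the two documented placeholders into a prompt."""
    return (prompt.replace("{{stream}}", stream_text)
                  .replace("{{copied}}", copied_text))

template = "Here is a diff file: {{stream}} Align with my history: {{copied}}"
print(fill_placeholders(template, "<git diff output>", "<copied git log>"))
```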

Introducing RawDog

RawDog is a feature that exploits Python's versatile capabilities to command and control your system as per your needs. You can do virtually anything with it, since it generates and executes Python code driven by your prompts! To take a bite of RawDog, simply append the flag --rawdog (shortcut -rd) in generate/interactive mode. This introduces a never-seen-before feature in the tgpt ecosystem. Thanks to AbanteAI/rawdog for the idea.

This can be useful in some ways. For instance :

$ pytgpt generate -n -q "Visualize the disk usage using pie chart" --rawdog

This will pop up a window visualizing the system disk usage.
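Under the hood, a RawDog-style flow boils down to extracting a fenced Python block from the model's reply and executing it. A simplified, unsandboxed sketch (`extract_code` is hypothetical, not pytgpt's API; a real tool should sandbox or review generated code before running it):

```python
import re

def extract_code(reply):
    """Pull the first fenced Python block out of an LLM reply (simplified)."""
    match = re.search(r"`{3}(?:python)?\n(.*?)`{3}", reply, re.DOTALL)
    return match.group(1) if match else reply

fence = "`" * 3  # triple backtick, built indirectly to keep this example readable
reply = f"Sure, here you go:\n{fence}python\nresult = 2 + 2\n{fence}"
namespace = {}
exec(extract_code(reply), namespace)  # executes generated code; sandbox in practice
print(namespace["result"])  # → 4
```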

For more usage info run $ pytgpt --help

Usage: pytgpt [OPTIONS] COMMAND [ARGS]...

Options:
  -v, --version  Show the version and exit.
  -h, --help     Show this message and exit.

Commands:
  awesome      Perform CRUD operations on awesome-prompts
  generate     Generate a quick response with AI
  gpt4free     Discover gpt4free models, providers etc
  interactive  Chat with AI interactively (Default)
  utils        Utility endpoint for pytgpt
  webchatgpt   Reverse Engineered ChatGPT Web-Version

CHANGELOG

Acknowledgements

  1. tgpt
  2. gpt4free
  3. You

Download files

Source Distribution: python-tgpt-0.4.2.tar.gz (48.8 kB)

Built Distribution: python_tgpt-0.4.2-py3-none-any.whl (56.2 kB)
