
OpenAI GPT, driven by Telegram


TeLLMgramBot

The basic goal of this project is to create a bridge between a Telegram Bot and a Large Language Model (LLM), such as OpenAI's GPT models.

  • To use this library, you must have a Telegram account with a username, not just a phone number. If you don't have one, create one online.
  • If added to a Telegram group, the bot must be an administrator in order to respond to a user calling out its name, initials, or nickname.

Telegram Bot + LLM Encapsulation

  • The Telegram interface handles special commands, including basic "chatty" prompts and responses that don't require the LLM, like "Hello".
  • The more dynamic conversation gets handed off to the LLM to manage prompts and responses, and Telegram acts as the interaction broker.
  • To have the bot analyze a URL, pass it in [square brackets] and mention how the bot should interpret it.
    • Example: "What do you think of this article? [https://some_site/article]"
    • URL analysis uses a separate GPT model, preferably GPT-5 or GPT-4o, whose higher token limit supports more URL content.
  • Tokens measure the length of all conversation messages between the Telegram bot assistant and the user. This is useful to:
    • Ensure the length does not exceed the model's limit. If it does, the oldest messages are pruned to fit within the limit.
    • Remember 50% of the past conversation when starting up TeLLMgramBot again.
  • Users can also clear their conversation history for privacy.
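
The pruning rule above can be sketched as follows. This is an illustrative example, not TeLLMgramBot's actual internals; the message structure and per-message token counts are assumptions.

```python
# Illustrative sketch of token-limit pruning (not the library's real code).
# Each message is assumed to carry a precomputed token count.

def prune_history(messages, token_limit):
    """Drop the oldest messages until the total token count fits the limit."""
    pruned = list(messages)
    total = sum(m["tokens"] for m in pruned)
    while pruned and total > token_limit:
        total -= pruned.pop(0)["tokens"]  # oldest message goes first
    return pruned

history = [
    {"role": "user",      "content": "Hello",           "tokens": 5},
    {"role": "assistant", "content": "Hi there!",       "tokens": 7},
    {"role": "user",      "content": "Tell me a story", "tokens": 6},
]
print(prune_history(history, token_limit=14))  # drops "Hello" to fit
```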

Why Telegram?

Using Telegram as the interface not only solves the problem of exposing an interface, but gives you boatloads of interactivity over a standard command-line interface, or over building a website with input boxes and submit buttons to handle everything:

  1. Telegram already lets you paste in verbose, multiline messages.
  2. Telegram already lets you paste in pictures, videos, links, etc.
  3. Telegram already lets you react with emojis, stickers, etc.

Directories

When initializing TeLLMgramBot, the following directories get created:

  • configs - Contains bot configuration files.
    • config.yaml (can be a different name)
      • This file sets main OpenAI parameters like naming and GPT models to process.
      • The url_model parameter is used to read URL content, as opposed to the chat_model that the bot normally uses to interact with the user.
      • An empty token_limit defaults to the maximum number of tokens supported by the chat_model (e.g. 128000 for gpt-4o-mini).
    • tokenGPT.yaml
      • This important YAML file contains token size parameters for supported OpenAI models.
      • On first run, gpt-5, gpt-5-mini, gpt-5-nano, gpt-4o, and gpt-4o-mini are populated, but the user can add more models with their token size parameters as needed.
  • prompts - Contains prompt files for how the bot interacts with any user.
    • test_personality.prmpt (can be a different name)
      • This is a sample prompt file as a basis to test this library.
      • The user can create more prompt files as needed for different personalities. See OpenAI Playground to test some ideas.
    • url_analysis.prmpt
      • This is a crucial prompt file for analyzing URL content in [brackets] with a different model (such as gpt-4o or gpt-4.1).
  • errorlogs
    • Contains a tellmgrambot_error.log file to investigate if problems occur during the interaction.
    • The user is also notified to contact the owner when an error occurs.
  • sessionlogs
    • Stores every conversation between the Telegram bot assistant and each user.
    • If a user types /forget, all session log files between the bot and that user are removed.
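
As a rough illustration, a config.yaml might look like the sketch below. Only chat_model, url_model, and token_limit are named in this description; check the repository for the exact schema and any other fields.

```yaml
# Hypothetical config.yaml sketch -- only chat_model, url_model, and
# token_limit are confirmed by this description.
chat_model: gpt-4o-mini   # model used for normal conversation
url_model: gpt-4o         # model used to analyze URL content in [brackets]
token_limit:              # empty = maximum supported by chat_model (128000)
```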

Environment Variables

TeLLMgramBot also creates or uses the following environment variables. They can be pre-loaded, which is especially useful on platforms like Home Assistant that configure persistent storage differently:

  1. TELLMGRAMBOT_CONFIGS_PATH
  2. TELLMGRAMBOT_PROMPTS_PATH
  3. TELLMGRAMBOT_ERRORLOGS_PATH
  4. TELLMGRAMBOT_SESSIONLOGS_PATH

If none of these are defined, initialization uses the top-level execution directory.
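
For example, on a system with a persistent storage volume, these paths could be pre-loaded before the bot initializes. The variable names come from the list above; the /config/tellmgrambot root is just an example location.

```python
# Pre-load persistent storage paths before initializing TeLLMgramBot.
# The variable names come from the list above; the root path is an example.
import os

base = "/config/tellmgrambot"  # example persistent-storage root
os.environ["TELLMGRAMBOT_CONFIGS_PATH"]     = f"{base}/configs"
os.environ["TELLMGRAMBOT_PROMPTS_PATH"]     = f"{base}/prompts"
os.environ["TELLMGRAMBOT_ERRORLOGS_PATH"]   = f"{base}/errorlogs"
os.environ["TELLMGRAMBOT_SESSIONLOGS_PATH"] = f"{base}/sessionlogs"
```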

API Keys

To operate TeLLMgramBot, three API keys are required:

  • OpenAI - Drives the actual GPT AI.
  • Telegram - Offers a Bot API through BotFather for the messaging platform.
  • VirusTotal - Performs safety checks on URLs.

There are two ways to populate each API key: environment variables or .key files.

Environment Variables

TeLLMgramBot uses the following environment variables that can be pre-loaded with the three API keys respectively:

  1. TELLMGRAMBOT_OPENAI_API_KEY
  2. TELLMGRAMBOT_TELEGRAM_API_KEY
  3. TELLMGRAMBOT_VIRUSTOTAL_API_KEY

During spin-up, a user can set os.environ[env_var] for each of those variables, as in the following example:

import os

# Fetch the keys from your own secret store (placeholder function)
my_keys = Some_Vault_Fetch_Function()

os.environ['TELLMGRAMBOT_OPENAI_API_KEY']     = my_keys['GPTKey']
os.environ['TELLMGRAMBOT_TELEGRAM_API_KEY']   = my_keys['BotFatherToken']
os.environ['TELLMGRAMBOT_VIRUSTOTAL_API_KEY'] = my_keys['VirusTotalToken']

This means the user can implement whatever key vault they want to fetch the keys at runtime, without needing files stored in the directory.

API Key Files

The other route is API key files, created either in the base path during execution or in the path given by the environment variable TELLMGRAMBOT_KEYS_PATH. By default, three files are created for the user to enter each API key:

  1. openai.key
  2. telegram.key
  3. virustotal.key

Each file's API key then populates its respective environment variable if that variable is not already defined.
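
The fallback just described can be sketched as follows; the loading logic is an illustration of the mechanism, not the library's source code.

```python
# Sketch of the key-file fallback: read each .key file and set the matching
# environment variable only if it is not already defined. The variable and
# file names come from this section; the logic itself is illustrative.
import os

KEY_FILES = {
    "openai.key":     "TELLMGRAMBOT_OPENAI_API_KEY",
    "telegram.key":   "TELLMGRAMBOT_TELEGRAM_API_KEY",
    "virustotal.key": "TELLMGRAMBOT_VIRUSTOTAL_API_KEY",
}

def load_key_files(keys_path="."):
    for filename, env_var in KEY_FILES.items():
        path = os.path.join(keys_path, filename)
        if env_var not in os.environ and os.path.isfile(path):
            with open(path) as f:
                key = f.read().strip()
            if key:  # only set if the file actually contains a key
                os.environ[env_var] = key
```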

Bot Setup

This library includes an example script, test_local.py, which uses files from the configs and prompts folders for TeLLMgramBot to process. The bot communicates with OpenAI via the Responses API, which replaces the older Chat Completions endpoint.

  1. Ensure the previous sections are followed, with the proper API keys and your Telegram bot set up.
  2. Install this library via pip (pip install TeLLMgramBot), then import it into your project.
  3. Instantiate the bot by passing in the configuration pieces below. Note that the Telegram bot's full name and username auto-populate before startup.
    telegram_bot = TeLLMgramBot.TelegramBot(
        bot_owner      = <Bot owner's Telegram username>,
        bot_nickname   = <Bot nickname like 'Botty'>,
        bot_initials   = <Bot initials like 'FB'>,
        chat_model     = <Conversation model like 'gpt-4o-mini'>,
        url_model      = <URL analysis model like 'gpt-4o'>,
        token_limit    = <Maximum token count set, by default chat_model max>,
        persona_temp   = <LLM factual to creative value [0-2], by default 1.0>,
        persona_prompt = <System prompt summarizing bot personality>
    )
    
  4. Turn on TeLLMgramBot by calling:
    telegram_bot.start_polling()
    
    Once you see TeLLMgramBot polling..., the bot is online in Telegram.
  5. Converse! Type /help for all available commands.

Download files

Download the file for your platform.

Source Distribution

tellmgrambot-2.3.0.tar.gz (24.0 kB)


Built Distribution


tellmgrambot-2.3.0-py3-none-any.whl (24.9 kB)


File details

Details for the file tellmgrambot-2.3.0.tar.gz.

File metadata

  • Download URL: tellmgrambot-2.3.0.tar.gz
  • Upload date:
  • Size: 24.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.9.25

File hashes

Hashes for tellmgrambot-2.3.0.tar.gz
  • SHA256: 7102cd33f6f5bde2a7f7e29e72f9b7cbf3887aaa15da5ee6ae2f16673c5ef4b4
  • MD5: 3adf7166cf1182b4cf49c7a3b49deb18
  • BLAKE2b-256: 1fe15b53684c6c910e66422eca1a64fa02599414ce441e55adae10a9acefc262


File details

Details for the file tellmgrambot-2.3.0-py3-none-any.whl.

File metadata

  • Download URL: tellmgrambot-2.3.0-py3-none-any.whl
  • Upload date:
  • Size: 24.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.9.25

File hashes

Hashes for tellmgrambot-2.3.0-py3-none-any.whl
  • SHA256: b1ba20234b00c532f61708eda889f44ee945f2d6ab5aa6db0ad9c572cc4fcd96
  • MD5: 9817aec95d638e929adc9fcd69866b97
  • BLAKE2b-256: e377e01b4dae11f9c6616d65b75bf24cc0ba2f32917c378067f201dde9008642

