
Packaging tools for personal use

Project description

hwhkit

Main features

  • Connection
    • MQTT
  • LLM

Connection

Sync MQTT

import time

from hwhkit.connection.mqtt.client import MQTTClientManager


def main():
    client_id = "test_mqtt_client"
    keys = "./mqtt_key_pairs.yaml"
    manager = MQTTClientManager(mqtt_config=keys)
    manager.create_client(client_id=client_id, broker="broker.emqx.io", port=1883)
    manager.start_all_clients()

    # Register a handler for messages on "topic_key".
    @manager.subscribe(topic="topic_key")
    def handle_message(client, message: str):
        print(f"Received message from {client._client_id}: {message}")
        manager.publish(client_id, "topic_key", f"Response from {client._client_id}")

    try:
        # Publish a message every two seconds until interrupted.
        while True:
            time.sleep(2)
            manager.publish(client_id=client_id, topic="topic_key", message="Hello from test_mqtt_client")
    except KeyboardInterrupt:
        print("Exiting...")


if __name__ == '__main__':
    main()

Async MQTT

import asyncio
from hwhkit.connection.mqtt.async_client import MQTTClientManager, MQTTConfig
from hwhkit.utils import logger


async def main():
    configs = [
        MQTTConfig(
            client_id="client1",
            broker="broker.emqx.io",
            port=1883,
            # username="user1",
            # password="pass1"
        ),
    ]

    keys = "./mqtt_key_pairs.yaml"
    async with MQTTClientManager(mqtt_config=keys) as manager:
        for config in configs:
            await manager.add_client(config)

        @manager.topic_handler("topic_key")
        async def topic_key(client, topic, message):
            logger.info(f"Received message on {topic} from {client}: {message}")

        try:
            await manager.run()
        except KeyboardInterrupt:
            logger.info("Shutting down...")

if __name__ == "__main__":
    asyncio.run(main())

LLM

Three steps to use models

Step 1: llm_config.yaml

Notes:

  1. A_custom_model_name is the identifier you pass to models.get_model_instance()
  2. A_custom_model_name.name must be the name of a model actually supported by the configured company
models:
  openai:
    A_custom_model_name:
      name: "gpt-4o"
      short_name: "OIG4"
      company: "openai"
      max_input_token: 8100
      max_output_token: 2048
      top_p: 0.5
      top_k: 1
      temperature: 0.5
      input_token_fee_pm: 30.0
      output_token_fee_pm: 60.0
      train_token_fee_pm: 0.0
      keys:
        - name: "openai_key1"
        - name: "openai_key2"

  siliconflow:
    qw-72b-p:
      name: "Qwen/QVQ-72B-Preview"
      short_name: "QW-72B-P"
      company: "siliconflow"
      max_input_token: 8100
      max_output_token: 2048
      top_p: 0.5
      top_k: 1
      temperature: 0.5
      input_token_fee_pm: 30.0
      output_token_fee_pm: 60.0
      train_token_fee_pm: 0.0
      keys:
        - name: "siliconflow_1"
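To make the first note concrete, here is a minimal sketch of how a custom model name maps to the underlying provider model. It uses plain dicts mirroring the YAML above; `resolve` is a hypothetical helper for illustration, not part of the hwhkit API:

```python
# Illustrative only: the dict mirrors the llm_config.yaml structure above,
# and `resolve` is a hypothetical helper (not hwhkit's real loader).
CONFIG = {
    "models": {
        "openai": {
            "A_custom_model_name": {"name": "gpt-4o", "short_name": "OIG4"},
        },
        "siliconflow": {
            "qw-72b-p": {"name": "Qwen/QVQ-72B-Preview", "short_name": "QW-72B-P"},
        },
    }
}

def resolve(custom_name: str) -> dict:
    """Find a model entry by its custom name, searching every company."""
    for company_models in CONFIG["models"].values():
        if custom_name in company_models:
            return company_models[custom_name]
    raise KeyError(f"unknown model: {custom_name}")

print(resolve("A_custom_model_name")["name"])  # gpt-4o
```

The custom name is the lookup key; the provider's real model name lives only in the `name` field.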
Step 2: llm_keys.yaml

  1. Each key name listed under keys in llm_config.yaml must have a matching entry in llm_keys.yaml
keys:
  openai_key1: "xx"
  openai_key2: "xx"
  siliconflow_1: "xx"
  anthropic_key1: "your_anthropic_api_key_1"
  anthropic_key2: "your_anthropic_api_key_2"
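The correspondence rule above can be sketched as a simple set check. The data is hard-coded to match the two YAML files, and `check_keys` is a hypothetical helper, not part of hwhkit:

```python
# Sketch of the rule: every key name referenced under `keys:` in
# llm_config.yaml must exist in llm_keys.yaml. Hypothetical helper only.
config_key_names = {"openai_key1", "openai_key2", "siliconflow_1"}  # from llm_config.yaml
llm_keys = {  # from llm_keys.yaml
    "openai_key1": "xx",
    "openai_key2": "xx",
    "siliconflow_1": "xx",
    "anthropic_key1": "your_anthropic_api_key_1",
    "anthropic_key2": "your_anthropic_api_key_2",
}

def check_keys(referenced: set, available: dict) -> set:
    """Return key names referenced in the config but missing from the keys file."""
    return referenced - available.keys()

print(check_keys(config_key_names, llm_keys))  # set() when everything matches
```

Extra entries in llm_keys.yaml (such as the anthropic keys here) are harmless; only missing ones break the config.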
Step 3: load models
import asyncio

from hwhkit.llm.config import load_models_from_yaml


async def main():
    models = load_models_from_yaml(config_file="llm_config.yaml", keys_file="llm_keys.yaml")
    print(models.list_models())

    # get_model_instance() takes the custom model name defined in
    # llm_config.yaml, not the provider's model name.
    resp = await models.get_model_instance("A_custom_model_name").chat("Who are you?")
    print(resp)


if __name__ == '__main__':
    asyncio.run(main())
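The fee fields in llm_config.yaml lend themselves to a quick cost estimate. A minimal sketch, assuming the `_pm` suffix means "per million tokens" (an assumption not confirmed by the hwhkit docs; `estimate_cost` is a hypothetical helper):

```python
# Back-of-envelope request cost from the fee fields in llm_config.yaml.
# Assumption: "_pm" means "per million tokens" -- verify against the
# hwhkit source before relying on this for real billing.
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_fee_pm: float, output_fee_pm: float) -> float:
    """Estimated cost of one request, in the fee's currency unit."""
    return (input_tokens * input_fee_pm + output_tokens * output_fee_pm) / 1_000_000

# A maximal request against the gpt-4o entry above:
# 8100 input tokens at 30.0 per million, 2048 output tokens at 60.0 per million.
print(estimate_cost(8_100, 2_048, 30.0, 60.0))  # 0.36588
```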



Download files

Download the file for your platform.

Source Distribution

hwhkit-1.0.8.tar.gz (14.4 kB)

Uploaded Source

Built Distribution


hwhkit-1.0.8-py3-none-any.whl (26.8 kB)

Uploaded Python 3

File details

Details for the file hwhkit-1.0.8.tar.gz.

File metadata

  • Download URL: hwhkit-1.0.8.tar.gz
  • Upload date:
  • Size: 14.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.10.10

File hashes

Hashes for hwhkit-1.0.8.tar.gz

  • SHA256: 2e532e2356850247bf053e32deff226649c9d38a84034797c705708f9fcf9602
  • MD5: 435b405eb25e802db2fc046c79177532
  • BLAKE2b-256: fa8642fa244fc136027f6dd37bc64794ba7efa4edd044c834dcc931f56399e9d


File details

Details for the file hwhkit-1.0.8-py3-none-any.whl.

File metadata

  • Download URL: hwhkit-1.0.8-py3-none-any.whl
  • Upload date:
  • Size: 26.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.10.10

File hashes

Hashes for hwhkit-1.0.8-py3-none-any.whl

  • SHA256: 234d2c3b0a5a189484cc32a40fbeac8e7967c72c1fcc7a8cc59aa162a9eb7f17
  • MD5: fa77cd9c993108b24d11a5d04941901d
  • BLAKE2b-256: a8a521e1a17121299827995c998f77507ca236a06a40841d1ab5a29c448578ba

