
A minimal distributed task module

Project description

Job Hive


A lightweight distributed task queue built on Redis

🚀 Features

  • Task push and full execution-lifecycle management
  • Redis-backed queue implementation (with password authentication)
  • Context-manager support for simplified resource management
  • Batch task submission (the examples also show single-task pushes)
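The push/work lifecycle these features describe can be sketched with a toy in-memory queue. This is illustrative only, not job_hive's implementation, which persists jobs in Redis:

```python
from collections import deque

class MiniQueue:
    """Toy FIFO queue showing the push/work lifecycle (not job_hive internals)."""

    def __init__(self):
        self._jobs = deque()
        self._next_id = 0
        self.results = {}

    def push(self, func, *args):
        # Enqueue a callable with its arguments and hand back a job id.
        job_id = self._next_id
        self._next_id += 1
        self._jobs.append((job_id, func, args))
        return job_id

    def work(self):
        # Drain the queue, recording each job's return value by id.
        while self._jobs:
            job_id, func, args = self._jobs.popleft()
            self.results[job_id] = func(*args)

q = MiniQueue()
ids = [q.push(str.upper, f"hello {i}") for i in range(3)]
q.work()
print(q.results[ids[0]])  # → HELLO 0
```

A real broker additionally has to serialize jobs and survive worker restarts, which is what the Redis backend provides.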

📦 Installation

Requires Python 3.10+. Version 0.1.3 added a simple Kafka backend alongside the original Redis one; because of its limitations, Kafka support was removed in version 0.1.8.

With Redis:

pip install job_hive[redis]

With Kafka (versions before 0.1.8 only):

pip install job_hive[kafka]

🛠️ Usage

from job_hive import HiveWork
from job_hive.queue import RedisQueue

with HiveWork(queue=RedisQueue(
        name="test",
        host="your_redis_host",
        password="your_password"
)) as work:
    # Push tasks to the task pool through the work object
    jobs = [work.push(print, f"hello {i}") for i in range(5)]
    for job_id in jobs:
        print(f"Job ID: {job_id}")
    # Start worker mode to consume tasks
    work.work(result_ttl=86400)  # result_ttl sets how long results are kept; defaults to 24 hours
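result_ttl bounds how long a finished job's result stays in the backend. The mechanism is the familiar expiring-key pattern; a toy version of the idea (hypothetical ResultStore, not job_hive code, which can delegate expiry to Redis key TTLs):

```python
import time

class ResultStore:
    """Toy TTL store illustrating result_ttl (not job_hive internals)."""

    def __init__(self):
        self._data = {}  # key -> (value, expires_at)

    def set(self, key, value, ttl):
        # Stamp each result with an absolute expiry time.
        self._data[key] = (value, time.monotonic() + ttl)

    def get(self, key):
        # Expired (or missing) entries read back as None and are evicted.
        value, expires_at = self._data.get(key, (None, 0.0))
        if time.monotonic() >= expires_at:
            self._data.pop(key, None)
            return None
        return value

store = ResultStore()
store.set("job-1", "done", ttl=0.05)
assert store.get("job-1") == "done"   # still fresh
time.sleep(0.1)
assert store.get("job-1") is None     # expired after the TTL
```

With result_ttl=86400 the result of each job remains readable for 24 hours and is then discarded.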

Group example

Group submission has been optimized to improve performance when a large number of tasks is produced at once.

from job_hive.queue import RedisQueue
from job_hive import HiveWork
from job_hive import Group

work = HiveWork(queue=RedisQueue(name="test", host='192.168.11.157', password='yunhai'))


@work.delay_task()
def hello(index):
    print('you are', index)
    raise Exception('test')


if __name__ == '__main__':
    group = Group(
        hello(1),
        hello(2),
        hello(3),
        hello(4),
        hello(5),
    )
    work.group_commit(group)
    work.work(result_ttl=30)
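The speedup from Group comes from batching: many jobs are committed to the broker in one round trip instead of one trip per job. A simulation of the effect (hypothetical FakeBroker, not job_hive code):

```python
class FakeBroker:
    """Counts round trips to show why batched commits are cheaper."""

    def __init__(self):
        self.round_trips = 0
        self.jobs = []

    def send(self, batch):
        # One network round trip regardless of batch size.
        self.round_trips += 1
        self.jobs.extend(batch)

# Individual pushes: one round trip per job.
broker = FakeBroker()
for i in range(100):
    broker.send([f"job-{i}"])
assert broker.round_trips == 100

# Group commit: the same 100 jobs in a single round trip.
broker = FakeBroker()
broker.send([f"job-{i}" for i in range(100)])
assert broker.round_trips == 1
```

With a real broker each round trip adds network latency, so collapsing 100 sends into one is roughly a 100x reduction in that overhead.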

⚙️ Configuration

from job_hive.queue import RedisQueue

RedisQueue(
    name="queue_name",  # required
    host="localhost",   # defaults to localhost
    port=6379,          # default Redis port
    password=None,      # optional password
    db=0                # database number, defaults to 0
)

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

📄 License

MIT License



Download files


Source Distribution

job_hive-0.1.8.tar.gz (7.3 kB view details)


Built Distribution


job_hive-0.1.8-py3-none-any.whl (10.2 kB view details)


File details

Details for the file job_hive-0.1.8.tar.gz.

File metadata

  • Download URL: job_hive-0.1.8.tar.gz
  • Upload date:
  • Size: 7.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.4 CPython/3.10.18 Linux/6.11.0-1018-azure

File hashes

Hashes for job_hive-0.1.8.tar.gz:

  • SHA256: e4c0fa87dd9a78957c34e90c269517719934cbe76bbcc1fd15da9feefae232d9
  • MD5: 1cb06102ca4a2758e95c59babe4c384d
  • BLAKE2b-256: 17e3cf52e10e5a0cd94861f8cb7766d968eced1a38ab5b56ef83269582b0de0c


File details

Details for the file job_hive-0.1.8-py3-none-any.whl.

File metadata

  • Download URL: job_hive-0.1.8-py3-none-any.whl
  • Upload date:
  • Size: 10.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.4 CPython/3.10.18 Linux/6.11.0-1018-azure

File hashes

Hashes for job_hive-0.1.8-py3-none-any.whl:

  • SHA256: 3b1ff8376fb9104b4457bcdb25150f6f146a80bf4691e698c3b9422cceb41782
  • MD5: 5328c34c1c237c20a46c60959153962c
  • BLAKE2b-256: fb05b90cdccfd166ec754b22c30a4d7a0ecb5e26a0a556fdf853ae83c0a9ef51

