
Advanced Python utilities for time and data processing

Project description

Changelog

[0.7.0] - wechatauto package

Changed

  • Added new features; this release may not be fully backward compatible with 0.6.7
  • 1. Searches for videos matching a keyword, opens them in order, looks for comments containing any keyword in comment_key, replies with a random message from comment_list, follows the commenter, then moves on to the next user. The handle_single_comment callback now returns only comments that match comment_key, not all comments
  • 2. Clicks are now performed in the background; 0.6.7 still clicked in the foreground
from bruce_li_tc.wechatauto.wechat_video_automator.bruce_uiauto.bruce_uiautomation import WeChatVideoCrawler, handle_single_comment
# 1. Create a crawler instance
crawler = WeChatVideoCrawler()
# 2. Initialize (launch WeChat, locate the window, etc.)
crawler.initialize(back_click_model=True)
# 3. Run the crawl in a single call
crawler.set_comment_callback(handle_single_comment)
video_list_data = crawler.crawl("keyword", comment_key=["comment keyword 1", "comment keyword 2", "comment keyword 3"], skip_comment=True, comment_list=["reply 1", "reply 2", "reply 3"])
print("video_list_data", video_list_data)
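The keyword matching and random-reply selection described above can be sketched in plain Python. Note that matches_keys and pick_reply are hypothetical helper names for illustration only; they are not part of the package, which performs this logic internally:

```python
import random

def matches_keys(comment_text, comment_key):
    # True if the comment contains any of the configured keywords,
    # mirroring how comments are filtered against comment_key
    return any(key in comment_text for key in comment_key)

def pick_reply(comment_list):
    # Choose one reply at random from the configured comment_list
    return random.choice(comment_list)

comment_key = ["price", "how to buy"]
comment_list = ["Thanks for asking!", "See the link in the bio."]

comment = "What is the price of this?"
if matches_keys(comment, comment_key):
    reply = pick_reply(comment_list)
```

Only comments that pass this filter are replied to and surfaced through the callback; everything else is skipped.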

[0.6.7] - wechatauto package

Added

  • 1. Searches for videos matching a keyword, opens them in order, and collects all comments; the handle_single_comment callback lets you process each comment yourself
from bruce_li_tc.wechatauto.wechat_video_automator.bruce_uiauto.bruce_uiautomation import WeChatVideoCrawler, handle_single_comment
# 1. Create a crawler instance
crawler = WeChatVideoCrawler()
# 2. Initialize (launch WeChat, locate the window, etc.)
crawler.initialize(scroll_video_comment_time=0.5)
# 3. Run the crawl in a single call
crawler.set_comment_callback(handle_single_comment)
video_list_data = crawler.crawl("keyword", skip_comment=True)
print("video_list_data", video_list_data)
# Returns the video list data
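A custom callback can replace handle_single_comment. The exact signature the crawler expects is not documented here, so the sketch below assumes it is called once per comment with a dict-like record (my_comment_handler and the "text"/"user" keys are assumptions for illustration):

```python
def my_comment_handler(comment):
    # Hypothetical custom callback: assumes the crawler passes one
    # comment record (a dict) per call; adapt to the real payload shape.
    text = comment.get("text", "")
    user = comment.get("user", "")
    print(f"{user}: {text}")
    return text  # e.g. collect or forward the comment elsewhere

# Registered the same way as the built-in handler:
# crawler.set_comment_callback(my_comment_handler)
```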
  • 2. Searches for videos matching a keyword, opens them in order, and fetches only the video details, without collecting comments
from bruce_li_tc.wechatauto.wechat_video_automator.bruce_uiauto.bruce_uiautomation import WeChatVideoCrawler
# 1. Create a crawler instance
crawler = WeChatVideoCrawler()
# 2. Initialize (launch WeChat, locate the window, etc.)
crawler.initialize(scroll_video_comment_time=0.5)
# 3. Run the crawl in a single call
video_list_data = crawler.crawl("keyword")
print("video_list_data", video_list_data)
# Returns the video list data

Changed

  • Optimized the performance of a feature.

Fixed

  • Fixed a specific issue. ...

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

bruce_li_tc-0.7.0.tar.gz (114.0 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

bruce_li_tc-0.7.0-py3-none-any.whl (137.7 kB view details)

Uploaded Python 3

File details

Details for the file bruce_li_tc-0.7.0.tar.gz.

File metadata

  • Download URL: bruce_li_tc-0.7.0.tar.gz
  • Upload date:
  • Size: 114.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.9

File hashes

Hashes for bruce_li_tc-0.7.0.tar.gz
Algorithm Hash digest
SHA256 3bed512b9f0fb9854be6e7d84cf1b3900c0927a76e52b6b1d8c1ccd0aa20890e
MD5 bb34b14771e1261fa5eeee91ef3f17f9
BLAKE2b-256 7bf46eeeb85f5a12bdc8050b91c118079b5bd62773b32257ecdc6b472c5a2bed


File details

Details for the file bruce_li_tc-0.7.0-py3-none-any.whl.

File metadata

  • Download URL: bruce_li_tc-0.7.0-py3-none-any.whl
  • Upload date:
  • Size: 137.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.9

File hashes

Hashes for bruce_li_tc-0.7.0-py3-none-any.whl
Algorithm Hash digest
SHA256 f0e7703c5e32066b461ea0019691472cc2ef817fcfb3bef32e706d66f5afffe5
MD5 a9282cd4bb33320b3c013fd9a3b5e8d9
BLAKE2b-256 0c7b55f0ea3bc93325c2369f2c099249bb718322c174dd61bc0df2bfe0c2b92e

