
Advanced Python utilities for time and data processing

Project description

Changelog

[0.7.1] - wechatauto package

Fixed

  • Fixed a package-import problem; version 0.7.0 was unusable.

[0.7.0] - wechatauto package

Changed

  • Added new features; these may not be fully compatible with the older 0.6.7 release.
  • 1. Searches for videos matching a keyword, opens them in order, then looks for comments containing any of the comment_key keywords, replies with a random phrase from comment_list, and follows the commenter before moving on to the next user. The handle_single_comment callback now receives only comments that match comment_key, not every comment.
  • 2. Clicks are now performed in the background; 0.6.7 still clicked in the foreground.
from bruce_li_tc.wechatauto.wechat_video_automator.bruce_uiauto.bruce_uiautomation import WeChatVideoCrawler, handle_single_comment

# 1. Create a crawler manager instance
crawler = WeChatVideoCrawler()
# 2. Initialize (start WeChat, locate the window, etc.)
crawler.initialize(back_click_model=True)
# 3. Run the crawl - a single call does it all
crawler.set_comment_callback(handle_single_comment)
video_list_data = crawler.crawl("keyword", comment_key=["comment keyword 1", "comment keyword 2", "comment keyword 3"], skip_comment=True, comment_list=["reply phrase 1", "reply phrase 2", "reply phrase 3"])
print("video_list_data", video_list_data)
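Instead of the built-in handle_single_comment, you can pass your own callback to set_comment_callback. The exact comment structure the library passes is not documented in this changelog, so the dict shape below (author/text keys) is only an assumption for illustration:

```python
# Hypothetical custom callback; the comment's real fields may differ from
# the assumed {"author": ..., "text": ...} shape used here.
def my_comment_handler(comment):
    """Process one comment that matched a comment_key keyword."""
    author = comment.get("author", "")
    text = comment.get("text", "")
    print(f"matched comment from {author}: {text}")
    return comment

# Stand-in comment dict to illustrate the assumed shape:
sample = {"author": "user1", "text": "looking for a tutorial"}
my_comment_handler(sample)
```

With the real crawler you would register it via crawler.set_comment_callback(my_comment_handler) before calling crawl.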

[0.6.7] - wechatauto package

Added

  • 1. Searches for videos matching a keyword, opens them in order, and collects all of their comments; the handle_single_comment callback lets you post-process each comment yourself.
from bruce_li_tc.wechatauto.wechat_video_automator.bruce_uiauto.bruce_uiautomation import WeChatVideoCrawler, handle_single_comment

# 1. Create a crawler manager instance
crawler = WeChatVideoCrawler()
# 2. Initialize (start WeChat, locate the window, etc.)
crawler.initialize(scroll_video_comment_time=0.5)
# 3. Run the crawl - a single call does it all
crawler.set_comment_callback(handle_single_comment)
video_list_data = crawler.crawl("keyword", skip_comment=True)
print("video_list_data", video_list_data)
# Returns the video list data
  • 2. Searches for videos matching a keyword, opens them in order, and fetches only the video details without collecting comments.
from bruce_li_tc.wechatauto.wechat_video_automator.bruce_uiauto.bruce_uiautomation import WeChatVideoCrawler

# 1. Create a crawler manager instance
crawler = WeChatVideoCrawler()
# 2. Initialize (start WeChat, locate the window, etc.)
crawler.initialize(scroll_video_comment_time=0.5)
# 3. Run the crawl - a single call does it all
video_list_data = crawler.crawl("keyword")
print("video_list_data", video_list_data)
# Returns the video list data

Changed

  • Improved the performance of a certain feature.

Fixed

  • Fixed a specific issue. ...

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

bruce_li_tc-0.7.1.tar.gz (114.1 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

bruce_li_tc-0.7.1-py3-none-any.whl (137.8 kB)

Uploaded Python 3

File details

Details for the file bruce_li_tc-0.7.1.tar.gz.

File metadata

  • Download URL: bruce_li_tc-0.7.1.tar.gz
  • Upload date:
  • Size: 114.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.9

File hashes

Hashes for bruce_li_tc-0.7.1.tar.gz:

  • SHA256: a51c1db869ebbc4aa43eb116500fd21fbe3bf77202206d6b2e6e0adaf4a20059
  • MD5: 1791aff1c9f940fd4df123decbb49616
  • BLAKE2b-256: f44703457142fff261c75e3774787baaa2301d9d84f22926a4e78b5c622e8f5c

See more details on using hashes here.
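One way to use these hashes is to check a downloaded file against the published SHA-256 digest before installing. A minimal sketch with the standard library (the verification call is commented out since it needs the actual downloaded file):

```python
import hashlib

def sha256_of(path, chunk_size=8192):
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Digest published above for the sdist:
expected = "a51c1db869ebbc4aa43eb116500fd21fbe3bf77202206d6b2e6e0adaf4a20059"
# if sha256_of("bruce_li_tc-0.7.1.tar.gz") != expected:
#     raise ValueError("hash mismatch - download may be corrupted or tampered with")
```

pip can also enforce this automatically via `--require-hashes` with a pinned requirements file.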

File details

Details for the file bruce_li_tc-0.7.1-py3-none-any.whl.

File metadata

  • Download URL: bruce_li_tc-0.7.1-py3-none-any.whl
  • Upload date:
  • Size: 137.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.9

File hashes

Hashes for bruce_li_tc-0.7.1-py3-none-any.whl:

  • SHA256: 82912f0ca748ec170a3113a79b731339879ea0884c07be49021f2f13f9af63c0
  • MD5: 5798812b78f53ed01d69ba10ce5bcf64
  • BLAKE2b-256: 16c7a5f946d69038332cb6cc2053bdd9fde85ac9d71509193ef834f682774a2c

See more details on using hashes here.
