scrapyd-api
GitHub: https://github.com/mouday/scrapyd-api
Gitee: https://gitee.com/mouday/scrapyd-api
PyPI: https://pypi.org/project/scrapyd-api
Installation
pip install scrapyd-api
Usage example
# -*- coding: utf-8 -*-
from pprint import pprint
from scrapyd_api import ScrapydClient
client = ScrapydClient()
pprint(client.daemon_status())
"""
{'finished': 67,
'node_name': 'localhost',
'pending': 0,
'running': 0,
'status': 'ok',
'total': 67}
"""
Introduction
Note: this package is written against scrapyd 1.2.1. If your Scrapyd version differs significantly, calls may fail.
ScrapydAPI returns the original Scrapyd API responses unchanged, which makes it convenient to build on.
API documentation: https://scrapyd.readthedocs.io/en/stable/api.html
class ScrapydAPI:
    add_version
    cancel
    delete_project
    delete_version
    list_jobs
    list_projects
    list_spiders
    list_versions
    schedule
    daemon_status
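Each of these wrapper methods corresponds to one of Scrapyd's documented JSON endpoints (see the API documentation linked above). As a rough sketch of that mapping, assuming a Scrapyd server on the default port 6800 (the `build_request` helper below is purely illustrative and not part of this package):

```python
# Illustrative sketch: how ScrapydAPI method names map onto Scrapyd's
# documented HTTP endpoints (https://scrapyd.readthedocs.io/en/stable/api.html).

# wrapper method -> (HTTP method, endpoint path)
ENDPOINTS = {
    "daemon_status":  ("GET", "daemonstatus.json"),
    "list_projects":  ("GET", "listprojects.json"),
    "list_spiders":   ("GET", "listspiders.json"),
    "list_versions":  ("GET", "listversions.json"),
    "list_jobs":      ("GET", "listjobs.json"),
    "schedule":       ("POST", "schedule.json"),
    "cancel":         ("POST", "cancel.json"),
    "add_version":    ("POST", "addversion.json"),
    "delete_version": ("POST", "delversion.json"),
    "delete_project": ("POST", "delproject.json"),
}

def build_request(method_name, base_url="http://localhost:6800"):
    """Return (http_method, full_url) for a wrapper method name."""
    http_method, path = ENDPOINTS[method_name]
    return http_method, f"{base_url}/{path}"
```

For example, `build_request("schedule")` yields `("POST", "http://localhost:6800/schedule.json")`; the actual `schedule` call also carries the project and spider names as form parameters, per the Scrapyd documentation.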
The ScrapydClient class inherits from ScrapydAPI and extends and enhances it:
class ScrapydClient(ScrapydAPI):
    # Enhanced data endpoints
    daemon_status            # adds a `total` field to the response
    add_version              # `version` defaults to the current 10-digit timestamp
    list_spiders             # return value changed from list of strings to list of dicts
    list_projects            # return value changed from list of strings to list of dicts
    list_versions            # return value changed from list of strings to list of dicts

    # Additional data endpoints
    job_status               # query the status of a job
    list_versions_format     # format version numbers as datetimes ('%Y-%m-%d %H:%M:%S')
    list_jobs_merge          # merged job list
    cancel_all_project_job   # cancel jobs across all projects
    cancel_all_job           # cancel all jobs of a given project

    # Additional log endpoints
    logs                     # logs: list of projects
    project_logs             # logs: list of spiders for a project
    spider_logs              # logs: list of jobs for a spider
    job_log                  # log of a single job
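Two of the enhancements above are easy to illustrate with plain stdlib code: the default `version` is the current Unix timestamp truncated to 10 digits, and `list_versions_format` renders such timestamps as '%Y-%m-%d %H:%M:%S' strings. A minimal sketch under those assumptions (the function names here are illustrative, not the package's internals):

```python
import time
from datetime import datetime

def default_version(now=None):
    """10-digit Unix timestamp string, as used for the default `version`."""
    ts = int(now if now is not None else time.time())
    return str(ts)

def format_version(version):
    """Render a 10-digit timestamp version as '%Y-%m-%d %H:%M:%S' (local time)."""
    return datetime.fromtimestamp(int(version)).strftime("%Y-%m-%d %H:%M:%S")
```

For example, `format_version(default_version())` gives a human-readable datetime for the version just generated; the exact string depends on the local timezone.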