Building a modular crawler template system based on Jinja2.
- Background and motivation for the modular crawler template system
- Two methods for parsing HTML tables
- A comparison of three ways to strip HTML tags and extract all text
- Working around sites that require executing JavaScript to compute cookies
- Notes on an unusual style of POST request
- Handling VIEWSTATE-based (ASP.NET) websites
I originally planned to write only about the topics listed above, but in the end I decided to put everything into this project, so it has become a summary of my one-month internship. It collects many of the helper functions I wrote for my own work, including HTML-table parsers and text-processing utilities, along with write-ups of solutions to the more unusual problems I ran into.
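As an illustration of the tag-stripping comparison mentioned above, here are two of the usual approaches sketched side by side (the exact helpers shipped in this package may differ): a naive regex, and a real HTML parser from the standard library.

```python
import re
from html.parser import HTMLParser

fragment = '<div><p>Hello <b>world</b></p></div>'

# Method 1: naive regex - fast, but breaks on '<' inside
# attribute values or <script> bodies.
text_regex = re.sub(r'<[^<>]+>', '', fragment)

# Method 2: a real HTML parser collects only the text nodes.
class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

extractor = TextExtractor()
extractor.feed(fragment)
text_parsed = ''.join(extractor.parts)

print(text_regex)   # Hello world
print(text_parsed)  # Hello world
```

The parser-based method is the safer default; the regex wins only on speed for clean, machine-generated markup.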
- Installation:
pip install -U spider-renderer
- Minimal template file examples:
header.tmpl
'''Rendered on {{datetime}}'''
import re

import scrapy


class NewspiderSpider(scrapy.Spider):
    name = '{{spider}}'
    source = '{{source}}'
    url = '{{home_url}}'
    author = '{{author}}'
    all_page = {{all_page}}
requests.tmpl
def start_requests(self):
    url = '{{page_url}}'
    all_page = self.all_page or 10
    for page in range(1, all_page):
        yield scrapy.Request(url % page, callback=self.parse)
parser.tmpl
{% include "header.tmpl" %}
{% include "requests.tmpl" %}

def parse(self, response):
    response.string = re.sub('[\r\n\t\v\f]', '', response.text)
    rows = re.findall(r'''{{regex}}''', response.string)
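Conceptually, rendering works the way any Jinja2 `{% include %}` chain does: the top-level template pulls in the others, and the keyword arguments fill the placeholders. A minimal sketch with trimmed inline templates (the real ones live on disk in the templates folder, and genspider's actual internals may differ):

```python
from jinja2 import Environment, DictLoader

# Trimmed inline stand-ins for the template files shown above.
templates = {
    'header.tmpl': "name = '{{spider}}'\n",
    'requests.tmpl': "all_page = {{all_page}}\n",
    'parser.tmpl': '{% include "header.tmpl" %}{% include "requests.tmpl" %}',
}

env = Environment(loader=DictLoader(templates))

# Placeholders are filled from keyword arguments, exactly as
# genspider passes **kwargs through to the template.
code = env.get_template('parser.tmpl').render(spider='fonts_spider',
                                              all_page=20)
print(code)
```

With templates on disk, `FileSystemLoader(templates_folder)` replaces the `DictLoader` and the rest is unchanged.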
- Example generator script:
import os
import os.path
from renderer import genspider
basepath = os.path.abspath(os.path.dirname(__file__))
dst = os.path.join(basepath, 'spiders')
templates_folder = os.path.join(basepath, 'templates')
if not os.path.isdir(dst):
    os.mkdir(dst)
templatefile = 'parser.tmpl'
spider = 'fonts_spider'
home_url = '''
http://fonts.mobanwang.com/fangzheng/
'''.strip()
page_url = '''
http://fonts.mobanwang.com/fangzheng/List_%d.html
'''.strip()
regex = r'''
href=['"](\S+?html?)['"][^<>]*?title=['"]
'''.strip()
kwargs = {
    'all_page': 20,
    'page_url': page_url,
    'regex': regex,
    'templates_folder': templates_folder,
    'author': 'White Turing',
}
genspider(home_url, templatefile, dst, spider, **kwargs)
This example uses none of Jinja2's more advanced syntax, but in practice you can add conditionals to make the templates cover a wider range of sites.
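A sketch of the kind of conditional that remark suggests: one requests template serving both GET-paginated and POST-driven sites by branching on a variable (`method` here is an invented name, not part of this package):

```python
from jinja2 import Template

# Hypothetical variant of requests.tmpl that branches on 'method'.
tmpl = Template('''
def start_requests(self):
{% if method == 'post' %}
    yield scrapy.FormRequest(self.url, formdata={'page': '1'},
                             callback=self.parse)
{% else %}
    for page in range(1, self.all_page + 1):
        yield scrapy.Request(self.url % page, callback=self.parse)
{% endif %}
''')

print(tmpl.render(method='post'))  # emits the FormRequest branch
print(tmpl.render(method='get'))   # emits the paginated GET branch
```

The same render call then produces either spider body, so one template file covers both request styles.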