
PyCrawl

A simple crawling utility for Python

Description

This project enables site crawling and data extraction with XPath and CSS selectors. It can also submit forms, including text fields, file uploads, and checkboxes.

Requirements

  • Python 3

Usage

Description of Instance Methods

| Name       | Description                                       |
|------------|---------------------------------------------------|
| send       | Set the value you want to submit to the form.     |
| submit     | Submit the form.                                  |
| css        | Get nodes by CSS selector.                        |
| xpath      | Get nodes by XPath.                               |
| attr       | Get a node's attribute.                           |
| inner_text | Get a node's inner text.                          |
| outer_text | Get a node's outer text.                          |

Simple Example

import pycrawl

url = 'http://www.example.com/'
doc = pycrawl.PyCrawl(url)

# access another url
doc.get('another url')

# get current url
doc.url

# get current site's html
doc.html

# get <table> tags as dict
doc.tables
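The `doc.tables` accessor returns the page's `<table>` data as a dict. As a rough illustration of that idea (this is a standard-library sketch, not PyCrawl's actual implementation, and the exact dict layout PyCrawl produces may differ), two-column rows can be collected as key/value pairs:

```python
from html.parser import HTMLParser

# Sketch: collect two-column <table> rows as key/value pairs.
# Hypothetical helper, for illustration only.
class TableDict(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.cells = []
        self.rows = []

    def handle_starttag(self, tag, attrs):
        if tag in ('td', 'th'):
            self.in_cell = True
        elif tag == 'tr':
            self.cells = []

    def handle_endtag(self, tag):
        if tag in ('td', 'th'):
            self.in_cell = False
        elif tag == 'tr' and self.cells:
            self.rows.append(self.cells)

    def handle_data(self, data):
        if self.in_cell:
            self.cells.append(data.strip())

html = ('<table>'
        '<tr><td>name</td><td>PyCrawl</td></tr>'
        '<tr><td>version</td><td>1.1.0</td></tr>'
        '</table>')
parser = TableDict()
parser.feed(html)
table = {row[0]: row[1] for row in parser.rows}
print(table)  # {'name': 'PyCrawl', 'version': '1.1.0'}
```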

Scraping Example

# search for nodes by css selector
# tag   : css('name')
# class : css('.name')
# id    : css('#name')
doc.css('div')
doc.css('.main-text')
doc.css('#tadjs')

# search for nodes by xpath
doc.xpath('//*[@id="top"]/div[1]')

# other examples
doc.css('div').css('a')[2].attr('href') # => string object
doc.css('p').inner_text() # => string object
# The "[0]" index can be omitted when accessing the first matched node
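Expressions like `//*[@id="top"]/div[1]` are standard XPath. The same subset can be tried with the standard library's `xml.etree.ElementTree` (PyCrawl presumably uses a fuller HTML/XPath engine; this sketch only shows what the expression selects):

```python
import xml.etree.ElementTree as ET

# A toy document resembling the structure queried above.
html = """
<html>
  <body>
    <div id='top'>
      <div>first</div>
      <div>second</div>
    </div>
  </body>
</html>
"""
root = ET.fromstring(html)

# Same idea as doc.xpath('//*[@id="top"]/div[1]'):
# any element with id="top", then its first <div> child.
node = root.find(".//*[@id='top']/div[1]")
print(node.text)  # first
```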

Submitting Form Example

  1. Specify the target node's attribute.
  2. Specify a value (int or str), check (bool), or file_name (str).
  3. Call submit() with the form's attribute specified.

# login
doc.send(id='id attribute', value='value to send')
doc.send(id='id attribute', value='value to send')
doc.submit(id='id attribute') # submit

# post file
doc.send(id='id attribute', file_name='target file name')

# checkbox
doc.send(id='id attribute', check=True)  # check
doc.send(id='id attribute', check=False) # uncheck

# examples of specifying other attributes
doc.send(name='name attribute', value='hello')
doc.send(class_='class attribute', value=100)
# when specifying the class attribute, write "class_="
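Under the hood, submitting a form amounts to sending the collected fields as an encoded HTTP request body. A minimal standard-library sketch of that step (field names here are hypothetical; PyCrawl derives them from the page's actual `<form>`):

```python
from urllib.parse import urlencode

# Hypothetical form fields; a checked checkbox posts its value.
fields = {
    'username': 'hello',
    'remember_me': 'on',
}

# This is the body a POST submission would carry.
body = urlencode(fields)
print(body)  # username=hello&remember_me=on
```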

Installation

$ pip install pycrawl

Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/AjxLab/PyCrawl.
