

A simple crawling utility for Python


This project enables site crawling and data extraction with XPath and CSS selectors. It can also submit forms, including text fields, file uploads, and checkboxes.


Requirements

  • Python 3


Description of Instance Methods

Name         Description
send         Set the value you want to submit to the form.
submit       Submit the form.
css          Get nodes by CSS selector.
xpath        Get nodes by XPath.
attr         Get the node's attribute.
inner_text   Get the node's inner text.
outer_text   Get the node's outer text.

Simple Example

import pycrawl

url = 'https://example.com'  # any page to crawl
doc = pycrawl.PyCrawl(url)

# access another url
doc.get('another url')

# get current url

# get current site's html

# get <table> tags as dict
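For illustration, here is one way a `<table>` tag could be turned into a dict using only the standard library — a sketch of the idea, not pycrawl's actual implementation (`TableToDict` and `table_to_dict` are names invented for this example):

```python
from html.parser import HTMLParser

class TableToDict(HTMLParser):
    """Collect the cell texts of each <tr> in a table."""
    def __init__(self):
        super().__init__()
        self.rows = []       # list of [cell, cell, ...] per <tr>
        self._cells = None   # cells of the row being parsed
        self._text = None    # text fragments of the cell being parsed

    def handle_starttag(self, tag, attrs):
        if tag == 'tr':
            self._cells = []
        elif tag in ('td', 'th'):
            self._text = []

    def handle_data(self, data):
        if self._text is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag in ('td', 'th') and self._cells is not None:
            self._cells.append(''.join(self._text).strip())
            self._text = None
        elif tag == 'tr' and self._cells:
            self.rows.append(self._cells)
            self._cells = None

def table_to_dict(html):
    """Map the first cell of each row to the second cell."""
    parser = TableToDict()
    parser.feed(html)
    return {row[0]: row[1] for row in parser.rows if len(row) >= 2}

html = ('<table><tr><td>name</td><td>pycrawl</td></tr>'
        '<tr><td>version</td><td>1.1.0</td></tr></table>')
print(table_to_dict(html))  # {'name': 'pycrawl', 'version': '1.1.0'}
```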

Scraping Example

# search for nodes by css selector
# tag   : css('name')
# class : css('.name')
# id    : css('#name')

# search for nodes by xpath
doc.xpath('//div')

# other examples
doc.css('div').css('a')[2].attr('href') # => string object
doc.css('p').inner_text() # => string object
# the "[0]" index can be omitted when you want the first match
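The chaining shown above can be modeled with a toy node class — purely an illustration of the call pattern, not pycrawl's internals (`Node` and `NodeList` are invented names): each lookup returns a collection that itself supports further lookups, indexing, and an implicit first match.

```python
class NodeList(list):
    """A list of nodes that forwards lookups to its first element."""
    def css(self, selector):
        return self[0].css(selector)   # implicit first match
    def attr(self, name):
        return self[0].attr(name)
    def inner_text(self):
        return self[0].inner_text()

class Node:
    def __init__(self, tag, attrs=None, text='', children=()):
        self.tag, self.attrs = tag, dict(attrs or {})
        self.text, self.children = text, list(children)
    def _walk(self):
        for child in self.children:
            yield child
            yield from child._walk()
    def css(self, selector):
        # toy matching: bare tag names only
        return NodeList(n for n in self._walk() if n.tag == selector)
    def attr(self, name):
        return self.attrs.get(name)
    def inner_text(self):
        return self.text

# <html><div><a href="/a">first</a><a href="/b">second</a></div></html>
root = Node('html', children=[Node('div', children=[
    Node('a', {'href': '/a'}, 'first'),
    Node('a', {'href': '/b'}, 'second'),
])])
print(root.css('div').css('a')[1].attr('href'))  # /b
print(root.css('a').inner_text())                # first (implicit first match)
```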

Submitting Form Example

  1. Specify the target node's attribute.
  2. Specify value (int or str), check (bool), or file_name (str).
  3. Call submit() with the form's attribute specified.

# login: set each field, then submit
doc.send(id='id attribute', value='value to send') # e.g. user name field
doc.send(id='id attribute', value='value to send') # e.g. password field
doc.submit(id='id attribute') # submit the form

# post file
doc.send(id='id attribute', file_name='target file name')

# checkbox
doc.send(id='id attribute', check=True)  # check
doc.send(id='id attribute', check=False) # uncheck

# examples of specifying other attributes
doc.send(name='name attribute', value='hello')
doc.send(class_='class attribute', value=100)
# when specifying the class attribute, write "class_=" ("class" is a reserved word in Python)
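The `class_` spelling is needed because `class` is a Python keyword and cannot be used as a keyword argument. A common convention for mapping such arguments back to HTML attribute names is to strip one trailing underscore — sketched below as an illustration, not pycrawl's actual code (`to_html_attrs` is an invented name):

```python
import keyword

def to_html_attrs(**kwargs):
    """Map Python-safe keyword arguments to HTML attribute names."""
    attrs = {}
    for name, value in kwargs.items():
        # class_ -> class, for_ -> for, ... (only when the bare name is a keyword)
        if name.endswith('_') and keyword.iskeyword(name[:-1]):
            name = name[:-1]
        attrs[name] = value
    return attrs

print(to_html_attrs(class_='btn', name='login'))
# {'class': 'btn', 'name': 'login'}
```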


Installation

$ pip install pycrawl


Bug reports and pull requests are welcome on GitHub.

Download files

Built Distributions

  • pycrawl-1.1.0-py3.7.egg (7.3 kB)
  • pycrawl-1.1.0-py3-none-any.whl (5.1 kB)

No source distribution files are available for this release.
