
Web Scraping Framework

Project Description

What is Grab?

Grab is a Python web scraping framework. It provides a number of helpful methods for performing network requests, scraping websites, and processing the scraped content:

  • Automatic cookies (session) support
  • HTTP and SOCKS proxy with/without authorization
  • Keep-Alive support
  • IDN support
  • Tools to work with web forms
  • Easy multipart file uploading
  • Flexible customization of HTTP requests (see the sketch after this list)
  • Automatic charset detection
  • Powerful API to extract data from the DOM tree of HTML documents with XPath queries
  • Asynchronous API to make thousands of simultaneous queries. This part of the library is called Spider; see the list of Spider features below.
  • Python 3 ready
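
Most of the request-customization features above are configured through Grab's setup() call before the request is sent. The following is a minimal sketch: the option names (proxy, proxy_type, user_agent, timeout) are assumptions based on the Grab configuration documentation and should be checked against your installed version, and the proxy address and user agent string are placeholders.

from grab import Grab

g = Grab()

# Configure the request before sending it; these option names are
# assumptions -- verify them against the Grab docs for your version.
g.setup(
    proxy='127.0.0.1:8118',            # placeholder proxy address
    proxy_type='socks5',
    user_agent='Mozilla/5.0 (example)',
    timeout=30,
)

g.go('http://example.com/')

# Extract data from the parsed DOM tree with an XPath query
print(g.doc.select('//title').text())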

Spider is a framework for writing website scrapers. Features:

  • Rules and conventions to organize the request/parse logic in separate blocks of code
  • Multiple parallel network requests
  • Automatic processing of network errors (failed tasks go back to the task queue)
  • Network requests are created and responses parsed with the Grab API (see above)
  • HTTP proxy support
  • Caching network results in permanent storage
  • Different backends for the task queue (in-memory, Redis, MongoDB)
  • Tools to debug and collect statistics

Grab Example

import logging

from grab import Grab

logging.basicConfig(level=logging.DEBUG)

g = Grab()

# Open the login page, fill in the form fields and submit the form
g.go('https://github.com/login')
g.doc.set_input('login', '****')
g.doc.set_input('password', '****')
g.doc.submit()

# Save the resulting page for debugging
g.doc.save('/tmp/x.html')

# Assert that the sign-out button exists, i.e. the login succeeded
g.doc('//ul[@id="user-links"]//button[contains(@class, "signout")]').assert_exists()

home_url = g.doc('//a[contains(@class, "header-nav-link name")]/@href').text()
repo_url = home_url + '?tab=repositories'

g.go(repo_url)

# Print each repository name with its absolute URL
for elem in g.doc.select('//h3[@class="repo-list-name"]/a'):
    print('%s: %s' % (elem.text(),
                      g.make_url_absolute(elem.attr('href'))))

Grab::Spider Example

import logging

from grab.spider import Spider, Task

logging.basicConfig(level=logging.DEBUG)


class ExampleSpider(Spider):
    def task_generator(self):
        # Seed the task queue with one search request per language
        for lang in 'python', 'ruby', 'perl':
            url = 'https://www.google.com/search?q=%s' % lang
            yield Task('search', url=url, lang=lang)

    def task_search(self, grab, task):
        # Handler for 'search' tasks: print the cite of the first result
        print('%s: %s' % (task.lang,
                          grab.doc('//div[@class="s"]//cite').text()))


bot = ExampleSpider(thread_number=2)
bot.run()
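
Task handlers can themselves yield new Task objects to schedule follow-up requests, which is how multi-step crawl logic is split into separate blocks of code. The sketch below reuses only the Spider/Task API shown above; the start URL, task names and XPath expressions are illustrative placeholders, not part of any real site.

import logging

from grab.spider import Spider, Task

logging.basicConfig(level=logging.DEBUG)


class FollowLinksSpider(Spider):
    def task_generator(self):
        # Seed the task queue with a single start page (placeholder URL)
        yield Task('index', url='http://example.com/')

    def task_index(self, grab, task):
        # Parse the start page and schedule a follow-up task per link
        for elem in grab.doc.select('//a[@href]'):
            yield Task('page', url=grab.make_url_absolute(elem.attr('href')))

    def task_page(self, grab, task):
        # Handler for the follow-up 'page' tasks
        print(task.url, grab.doc.select('//title').text())


bot = FollowLinksSpider(thread_number=2)
bot.run()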

Installation

$ pip install -U grab

See details about installing Grab on different platforms here: http://docs.grablib.org/en/latest/usage/installation.html

Documentation and Help

Documentation: http://docs.grablib.org/en/latest/

Mailing list (mostly Russian): http://groups.google.com/group/python-grab/

Contribution

To report a bug, please use the GitHub issue tracker: https://github.com/lorien/grab/issues

If you want to develop a new feature for Grab, please use the issue tracker to describe what you want to do, or contact me at lorien@lorien.name

Release History

0.6.x: 0.6.38 (this version), 0.6.37, 0.6.36, 0.6.35, 0.6.34, 0.6.33, 0.6.32, 0.6.31, 0.6.30, 0.6.29, 0.6.28, 0.6.27, 0.6.26, 0.6.25, 0.6.24, 0.6.23, 0.6.22, 0.6.21, 0.6.20, 0.6.19, 0.6.18, 0.6.17, 0.6.16, 0.6.15, 0.6.14, 0.6.13, 0.6.12, 0.6.11, 0.6.10, 0.6.9, 0.6.8, 0.6.7, 0.6.6, 0.6.5, 0.6.4, 0.6.3, 0.6.2, 0.6.1, 0.6.0
0.5.x: 0.5.5, 0.5.4, 0.5.3, 0.5.2, 0.5.1, 0.5.0
0.4.x: 0.4.13, 0.4.12, 0.4.11, 0.4.10, 0.4.9, 0.4.8, 0.4.7, 0.4.5, 0.4.4, 0.4.3, 0.4.2, 0.4.1, 0.4.0
0.3.x: 0.3.33, 0.3.32, 0.3.31, 0.3.30, 0.3.29, 0.3.28, 0.3.27, 0.3.26, 0.3.25, 0.3.24, 0.3.23, 0.3.22, 0.3.21, 0.3.20, 0.3.19, 0.3.18, 0.3.17, 0.3.16, 0.3.15, 0.3.14, 0.3.13, 0.3.12, 0.3.11, 0.3.10, 0.3.9, 0.3.8, 0.3.7, 0.3.6, 0.3.4, 0.3.3, 0.3.2, 0.3.1, 0.3
0.2.x: 0.2.20, 0.2.19, 0.2.18, 0.2.17, 0.2.16, 0.2.15, 0.2.12, 0.2.11, 0.2.10, 0.2.9, 0.2.8, 0.2.7, 0.2.6, 0.2.5, 0.2.4, 0.2.3, 0.2.2, 0.2.1, 0.2.0
0.1.x: 0.1.7, 0.1.6, 0.1.5, 0.1.4, 0.1.3, 0.1.2, 0.1.1
Download Files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

File name: grab-0.6.38.tar.gz (1.1 MB)
File type: Source
Upload date: May 17, 2017
