
Tools to help build extraction models with Scrapy spiders.

Project description

e-models

Suite of tools to assist in building extraction models with Scrapy spiders.

Installation:

$ pip install e-models

scrapyutils module

The scrapyutils module provides two classes: one extends scrapy.http.TextResponse and the other extends scrapy.loader.ItemLoader. The extensions provide methods that:

  1. Allow extracting item data in the text (markdown) domain instead of the HTML source domain.
  2. Support the main purpose of this approach: generating datasets suitable for training transformer models for text extraction (also known as extractive question answering, EQA).
  3. As a secondary objective, provide an alternative to XPath and CSS selectors for extracting data from the HTML source, one that may be more readable for humans.

Usage:

Instead of subclassing your item loaders from scrapy.loader.ItemLoader, subclass them from emodels.scrapyutils.ExtractItemLoader. This change does not affect how the item loaders work, and it enables the features described above. In addition, in order to save the collected extraction data, you need to set the environment variable EMODELS_SAVE_EXTRACT_ITEMS to 1. The collected extraction data will be stored at <user home folder>/.datasets/items/<item class name>/<sequence number>.jl.gz. The base folder <user home folder>/.datasets is the default; you can customize it via the environment variable EMODELS_DIR.
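
For example, a minimal sketch of how this could look in a spider. JobItem, JobItemLoader, MySpider, the URL and the CSS selectors below are hypothetical placeholders; only emodels.scrapyutils.ExtractItemLoader comes from this package, and it is used through the standard scrapy.loader.ItemLoader API:

import scrapy
from emodels.scrapyutils import ExtractItemLoader


class JobItem(scrapy.Item):
    # hypothetical item class, for illustration only
    title = scrapy.Field()
    employer = scrapy.Field()


class JobItemLoader(ExtractItemLoader):
    default_item_class = JobItem


class MySpider(scrapy.Spider):
    name = "myspider"
    start_urls = ["https://example.com/jobs"]  # placeholder URL

    def parse(self, response):
        # Used exactly like a regular ItemLoader; when
        # EMODELS_SAVE_EXTRACT_ITEMS=1 is set, the ExtractItemLoader base
        # additionally records the extraction data described above.
        loader = JobItemLoader(response=response)
        loader.add_css("title", "h1::text")
        loader.add_css("employer", ".employer::text")
        yield loader.load_item()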

So, in order to keep the dataset clean and well ordered, only enable saving of extracted items once you are sure you have the correct extraction selectors. Then run locally:

EMODELS_SAVE_EXTRACT_ITEMS=1 scrapy crawl myspider
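
Once the crawl finishes, the collected samples can be inspected with a short script. This is a minimal sketch assuming the default EMODELS_DIR, the hypothetical JobItem item class name from the example above, and that the .jl.gz files are gzip-compressed JSON lines, as the extension suggests:

import gzip
import json
from pathlib import Path

# default base folder; override with the EMODELS_DIR environment variable
dataset_dir = Path.home() / ".datasets" / "items" / "JobItem"  # hypothetical item class name

for path in sorted(dataset_dir.glob("*.jl.gz")):
    with gzip.open(path, "rt", encoding="utf-8") as fp:
        for line in fp:
            sample = json.loads(line)  # one extraction sample per JSON line
            print(path.name, sample)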

In addition, in order to keep your dataset well organized, you should use the same item class name for the same item schema, even across multiple projects, and avoid reusing a name for items with a different schema. However, in general you will use the extraction data from all item classes at the same time to train a transformer model, as this is how transformers learn to generalize.

In the end you will have a transformer model suited to extract any kind of item, because it is trained not to extract "data from item x" but to recognize and extract fields. So even if you didn't train the transformer on a specific item class, it will do well as long as it was trained to extract the same fields from other item classes. You only need to ask the right question. For example, given an HTML page as context, you can ask the model: which is the phone number? You don't need to specify what kind of entity (a business? a person? an organization?) you expect to find there.
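
To make the question-answering idea concrete, here is an illustrative sketch that is not part of e-models: a generic extractive QA call with the Hugging Face transformers library, assuming a model (placeholder path below) already fine-tuned on the collected extraction datasets:

from transformers import pipeline

# placeholder path to a model fine-tuned on the collected extraction data
qa = pipeline("question-answering", model="path/to/your-finetuned-model")

# text (markdown) extracted from an html page; made-up example context
page_text = "Acme Corp. Contact us at +1 555 0100, Mon-Fri 9am-5pm."

answer = qa(question="Which is the phone number?", context=page_text)
print(answer["answer"])  # the extracted span, e.g. the phone number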

(WIP...)

Project details


Release history

This version

1.2.1

Download files

Download the file for your platform.

Source Distribution

e-models-1.2.1.tar.gz (23.2 kB)

Uploaded Source

Built Distribution


e_models-1.2.1-py3-none-any.whl (22.7 kB)

Uploaded Python 3

File details

Details for the file e-models-1.2.1.tar.gz.

File metadata

  • Download URL: e-models-1.2.1.tar.gz
  • Upload date:
  • Size: 23.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.8

File hashes

Hashes for e-models-1.2.1.tar.gz
  • SHA256: 9ea1a670189372b8ab02b1d82b5c19ddc782699a368d4f0f528239e52f6153cd
  • MD5: 1cf930e61b278468ccd14521a85d80ec
  • BLAKE2b-256: f8684e7908ab0ddd80fc39843d7975ccd80a562e5a709a898b906709e6bdaf1f


File details

Details for the file e_models-1.2.1-py3-none-any.whl.

File metadata

  • Download URL: e_models-1.2.1-py3-none-any.whl
  • Upload date:
  • Size: 22.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.8

File hashes

Hashes for e_models-1.2.1-py3-none-any.whl
  • SHA256: 95c6adf6d1ae717c23aae8dabbb844f62d72c035777c1c119fa685f8c2069f8d
  • MD5: 2795aed2b2f07dd1a5983cc4e514750a
  • BLAKE2b-256: dd2b7e6616b63165d4267145521c3f253355b19b6859ece3d8667292f4501303

