
Provides processors for the itemloaders package, commonly used with scrapy.


Scrapy Processors


Scrapy Processors is a collection of Processor classes designed to work with the itemloaders package, commonly used with the Scrapy web-scraping framework.

These processors extend or replace the processors provided by the itemloaders package.

Additionally, the provided Processor and ProcessorCollection classes can be subclassed to create custom processors.
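The idea can be sketched in plain Python: a value-by-value processor transforms each extracted value independently, while an iterable processor operates on the whole list of values at once. The class names below are illustrative stand-ins, not the package's actual API; see the documentation sections linked below for the real built-ins.

```python
# Illustrative sketch of the two processor styles used by
# itemloaders-style loaders. Class names here are hypothetical.

class StripText:
    """Value-by-value style: clean each extracted string."""

    def __call__(self, values):
        # Apply the same transformation to every value independently.
        return [v.strip() for v in values]


class TakeFirstNonEmpty:
    """Iterable style: reduce the whole list to a single value."""

    def __call__(self, values):
        # Operate on the list as a whole, returning the first truthy value.
        for v in values:
            if v:
                return v
        return None


raw = ["  Scrapy Processors  ", "", "second"]
cleaned = StripText()(raw)            # ['Scrapy Processors', '', 'second']
first = TakeFirstNonEmpty()(cleaned)  # 'Scrapy Processors'
```

Chaining a value-by-value cleaner into an iterable reducer like this mirrors the input/output processor pattern that itemloaders uses.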

Installation

To install Scrapy Processors, simply use pip:

$ pip install scrapy-processors

Built-in Processors

Built-in Processor Collections

Table of Contents

What's a Processor?

What's Context?

Subclassing Processor and ProcessorCollection

Built-in ProcessorCollection Subclasses

Built-in value-by-value processors

Built-in iterable processors

Opening an Issue

If you encounter a problem with the project or have a feature request, you can open an issue to let us know.

To open an issue, please follow these steps:

  1. Go to the Issues tab on the GitHub repository page.
  2. Click on the "New Issue" button.
  3. Provide a descriptive title for the issue.
  4. In the issue description, provide detailed information about the problem you are experiencing or the feature you are requesting.
  5. If applicable, include steps to reproduce the problem or any relevant code examples.
  6. Add appropriate labels to categorize the issue (e.g., bug, enhancement, documentation).
  7. Click on the "Submit new issue" button to create the issue.

Once you have opened an issue, our team will review it and provide assistance or discuss the requested feature.

Note: Before opening a new issue, please search the existing issues to see if a similar issue has already been reported. This helps avoid duplicates and allows us to focus on resolving existing problems.

Contributing

Thank you for considering contributing to this project! We welcome your contributions to help make this project better.

To contribute to this project, please follow these steps:

  1. Fork the repository by clicking on the "Fork" button at the top of the repository page. This will create a copy of the repository in your GitHub account.

  2. Clone the forked repository to your local machine using Git:

    $ git clone https://github.com/your-username/scrapy-processors.git
    
  3. Create a new branch for your changes:

    $ git checkout -b feature
    
  4. Make your desired changes to the codebase.

  5. Commit your changes with descriptive commit messages:

    $ git commit -m "Add new feature"
    
  6. Push your changes to your forked repository:

    $ git push origin feature
    
  7. Open a pull request (PR) from your forked repository to the original repository's master branch.

  8. Provide a clear and descriptive title for your PR and explain the changes you have made.

  9. Wait for the project maintainers to review your PR. You may need to make additional changes based on their feedback.

  10. Once your PR is approved, it will be merged into the main codebase. Congratulations on your contribution!

If you have any questions or need further assistance, feel free to open an issue or reach out to the project maintainers.

Happy contributing!

License

This project is licensed under the MIT License. See the LICENSE file for more details.
