
Provides processors for the itemloaders package, commonly used with Scrapy.

Project description

Scrapy Processors


Scrapy Processors is a collection of Processor classes meant to work with the itemloaders package, commonly used with the Scrapy web scraping framework.

These processors are meant to extend or replace the processors provided by the itemloaders package.

Additionally, the provided Processor and ProcessorCollection classes can be extended to create custom processors.
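
For reference, here is a minimal sketch of where such processors plug into an ItemLoader. It uses only the itemloaders API; the ProductLoader class and the "name" field are invented for illustration, and scrapy-processors' own classes are intended to fill the same <field>_in / <field>_out slots.

    # Minimal sketch using the itemloaders API directly; ProductLoader and the
    # "name" field are illustrative only, not part of scrapy-processors.
    from itemloaders import ItemLoader
    from itemloaders.processors import MapCompose, TakeFirst

    class ProductLoader(ItemLoader):
        # Per the itemloaders convention, <field>_in runs on each extracted
        # value and <field>_out runs on the collected list of values.
        name_in = MapCompose(str.strip)
        name_out = TakeFirst()

    loader = ProductLoader()
    loader.add_value("name", ["  Widget  ", "  Gadget  "])
    print(loader.load_item())  # {'name': 'Widget'}

Processors from this package are meant to drop into the same positions, in place of or alongside MapCompose, TakeFirst, and the other itemloaders built-ins.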

Installation

To install Scrapy Processors, simply use pip:

$ pip install scrapy-processors

Built-in Processors

Built-in Processor Collections


Opening an Issue

If you encounter a problem with the project or have a feature request, you can open an issue to let us know.

To open an issue, please follow these steps:

  1. Go to the Issues tab on the GitHub repository page.
  2. Click on the "New Issue" button.
  3. Provide a descriptive title for the issue.
  4. In the issue description, provide detailed information about the problem you are experiencing or the feature you are requesting.
  5. If applicable, include steps to reproduce the problem or any relevant code examples.
  6. Add appropriate labels to categorize the issue (e.g., bug, enhancement, documentation).
  7. Click on the "Submit new issue" button to create the issue.

Once you have opened an issue, our team will review it and provide assistance or discuss the requested feature.

Note: Before opening a new issue, please search the existing issues to see if a similar issue has already been reported. This helps avoid duplicates and allows us to focus on resolving existing problems.

Contributing

Thank you for considering contributing to this project! We welcome your contributions to help make this project better.

To contribute to this project, please follow these steps:

  1. Fork the repository by clicking on the "Fork" button at the top of the repository page. This will create a copy of the repository in your GitHub account.

  2. Clone the forked repository to your local machine using Git:

    $ git clone https://github.com/your-username/scrapy-processors.git
    
  3. Create a new branch for your changes:

    $ git checkout -b feature
    
  4. Make your desired changes to the codebase.

  5. Commit your changes with descriptive commit messages:

    $ git commit -m "Add new feature"
    
  6. Push your changes to your forked repository:

    $ git push origin feature
    
  7. Open a pull request (PR) from your forked repository to the original repository's master branch.

  8. Provide a clear and descriptive title for your PR and explain the changes you have made.

  9. Wait for the project maintainers to review your PR. You may need to make additional changes based on their feedback.

  10. Once your PR is approved, it will be merged into the main codebase. Congratulations on your contribution!

If you have any questions or need further assistance, feel free to open an issue or reach out to the project maintainers.

Happy contributing!

License

This project is licensed under the MIT License. See the LICENSE file for more details.

Download files

Download the file for your platform. If you're not sure which to choose, see the Python Packaging User Guide for more on installing packages.

Source Distribution

scrapy_processors-2.0.5.tar.gz (28.3 kB)

Built Distribution

scrapy_processors-2.0.5-py3-none-any.whl (28.5 kB)

File details

Details for the file scrapy_processors-2.0.5.tar.gz.

File metadata

  • Download URL: scrapy_processors-2.0.5.tar.gz
  • Upload date:
  • Size: 28.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.2 CPython/3.12.3 Darwin/23.4.0

File hashes

Hashes for scrapy_processors-2.0.5.tar.gz

  • SHA256: 295c5662e91f4f728a46f1896ca078a593d605bccdb080f71a07462f9d39bf6e
  • MD5: 82b5f49fff4e27c81a27b0b08f3089b4
  • BLAKE2b-256: 1f81522bc18ef90ce789404fb0b51b506951979a4e8e574fffc82d31ffc0a691

See the pip documentation for more details on using hashes.
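
As a rough sketch, the SHA256 digest above can be checked against a locally downloaded copy of the sdist with Python's standard hashlib module; the local filename below is an assumption about where the archive was saved.

    # Compare a downloaded archive against the SHA256 digest published above.
    # The local filename is an assumption; adjust it to your download location.
    import hashlib

    expected = "295c5662e91f4f728a46f1896ca078a593d605bccdb080f71a07462f9d39bf6e"

    with open("scrapy_processors-2.0.5.tar.gz", "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    print("OK" if digest == expected else "MISMATCH")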

File details

Details for the file scrapy_processors-2.0.5-py3-none-any.whl.

File metadata

File hashes

Hashes for scrapy_processors-2.0.5-py3-none-any.whl

  • SHA256: 1f07170778b105a09d85540ed9a813c231db4bf6c34aa1e656c5e5eeb23af89c
  • MD5: e91d3f71c697df91bdeb95f3d0b8ac4b
  • BLAKE2b-256: 5634304ce17d8f068550dc16c89f7d64b2cc5c221828742db43458e1bfefdbd1

See the pip documentation for more details on using hashes.
