AI web scraping workflow.

Project description

Scraipe

A scraping and analysis framework. Under active development.

Features

  • Versatile Scraping: Use custom scrapers to handle Telegram messages, news articles, and other link types through configurable ingress rules.
  • LLM Analysis: Process text using OpenAI models with built-in Pydantic validation.
  • Workflow Management: Combine scraping and analysis in a single fault-tolerant workflow, ideal for Jupyter notebooks.
  • High Performance: IO-bound tasks run asynchronously under the hood while you work through a simple synchronous API.
  • Modular: Extend the framework with new scrapers or analyzers as your data sources evolve.
  • Customizable Ingress: Easily define rules to dynamically route different links to their appropriate scrapers.
  • Detailed Logging: Monitor scraping and analysis operations through detailed error reporting for improved debugging and transparency.
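To give a feel for how ingress rules can route links to scrapers, here is a minimal stdlib-only sketch. Note this is an illustration of the idea, not Scraipe's actual API: the rule list, the `route` function, and the scraper functions are all hypothetical names invented for this example.

```python
import re

# Hypothetical scrapers; in Scraipe these would be scraper objects.
def scrape_telegram(url: str) -> str: return f"telegram:{url}"
def scrape_news(url: str) -> str: return f"news:{url}"
def scrape_default(url: str) -> str: return f"text:{url}"

# Each rule pairs a URL pattern with the scraper that should handle it.
INGRESS_RULES = [
    (re.compile(r"^https://t\.me/"), scrape_telegram),
    (re.compile(r"news"), scrape_news),
]

def route(url: str) -> str:
    """Dispatch a URL to the first matching scraper, else the default."""
    for pattern, scraper in INGRESS_RULES:
        if pattern.search(url):
            return scraper(url)
    return scrape_default(url)

print(route("https://t.me/some_channel/42"))  # handled by the Telegram scraper
print(route("https://example.com/page"))      # falls through to the default
```

The same pattern-to-handler idea generalizes: adding support for a new source is just appending a rule, which is what the "Customizable Ingress" bullet above describes.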

Help

See the documentation for details.

Installation

Ensure you are using Python >= 3.10. Install Scraipe along with all built-in scrapers and analyzers:

pip install scraipe[extended]

Alternatively, install the core library with:

pip install scraipe

Example

 # Import components from scraipe
 from scraipe.defaults import TextScraper
 from scraipe.defaults import TextStatsAnalyzer
 from scraipe import Workflow

 # Initialize the scraper and analyzer
 scraper = TextScraper()
 analyzer = TextStatsAnalyzer()

 # Create the workflow instance
 workflow = Workflow(scraper, analyzer)

 # List urls to scrape
 urls = [
     "https://example.com",
     "https://rickandmortyapi.com/api/character/1",
     "https://ckaestne.github.io/seai/"
 ]

 # Run the workflow
 workflow.scrape(urls)
 workflow.analyze()

 # Print the results
 results = workflow.export()
 print(results)
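The `TextStatsAnalyzer` in the example reports simple statistics about the scraped text. As a rough stdlib-only illustration of the kind of statistics such an analyzer might compute (this is not Scraipe's implementation, and the field names are invented for the sketch):

```python
# Illustrative text statistics, similar in spirit to what a
# text-stats analyzer reports for each scraped document.
def text_stats(text: str) -> dict:
    words = text.split()
    return {
        "char_count": len(text),
        "word_count": len(words),
        "line_count": text.count("\n") + 1 if text else 0,
    }

print(text_stats("Scraipe scrapes.\nThen it analyzes."))
```

In the real workflow, per-URL results like these are what `workflow.export()` collects after `analyze()` runs.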

Contributing

Contributions are welcome. Please open an issue or submit a pull request for improvements.

Run `poetry install --with dev,docs --extras extended` to install all dependencies for the project.

Maintainer

This project is maintained by nibs.


Download files

Download the file for your platform.

Source Distribution

scraipe-0.1.55.tar.gz (28.3 kB)

Built Distribution

scraipe-0.1.55-py3-none-any.whl (38.2 kB)

File details

Details for the file scraipe-0.1.55.tar.gz.

File metadata

  • Download URL: scraipe-0.1.55.tar.gz
  • Upload date:
  • Size: 28.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.2 CPython/3.12.9 Linux/5.15.167.4-microsoft-standard-WSL2

File hashes

Hashes for scraipe-0.1.55.tar.gz:

  • SHA256: f6da4a8290ad9f140b52ed4b08a65c71232ab8c085294a61eac89e0697db7b35
  • MD5: d723e4e90e198c6cc7d7e09cfb3988c7
  • BLAKE2b-256: 77c76c6378fb978145f7a9d6bd1a42661f7ef6150f7ff3b25a1429efb84be051
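After downloading an artifact, the published digests can be verified locally with a short stdlib sketch (the `sha256_of` helper is just a name chosen for this example):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files never load fully into memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the SHA256 published above, e.g.:
# assert sha256_of("scraipe-0.1.55.tar.gz") == (
#     "f6da4a8290ad9f140b52ed4b08a65c71232ab8c085294a61eac89e0697db7b35"
# )
```

Swapping `hashlib.sha256` for `hashlib.md5` or `hashlib.blake2b` checks the other digests listed here.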

File details

Details for the file scraipe-0.1.55-py3-none-any.whl.

File metadata

  • Download URL: scraipe-0.1.55-py3-none-any.whl
  • Upload date:
  • Size: 38.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.2 CPython/3.12.9 Linux/5.15.167.4-microsoft-standard-WSL2

File hashes

Hashes for scraipe-0.1.55-py3-none-any.whl:

  • SHA256: 9c705c2c38cb2ac2fe534708a765931417ee2f2e36d37dd8b4ec5e2a7587ecaa
  • MD5: 3b4ea6bd5701529bde26709d83a39feb
  • BLAKE2b-256: af6b1f7d31cd7ff23606f06a5236bea37dae60c6167b6706f00d597e5cd10be3
