Common Crawl import support for Meshagent datasets

Meshagent Common Crawl

Import Common Crawl captures into a Meshagent room dataset.

from meshagent.commoncrawl import import_domain_from_commoncrawl

# "room" is an already-connected Meshagent room client.
result = await import_domain_from_commoncrawl(
    room,
    index="CC-MAIN-2025-08",
    domain="example.com",
    table="pages",
    url_filter=r"https?://(www\.)?example\.com/docs/.*",
)

To test it end to end with the meshagent room connect command:

meshagent room connect --room=my-room --identity=commoncrawl -- \
  python meshagent-sdk/meshagent-commoncrawl/examples/crawl.py \
  http://www.meshagent.com --table=sample --namespace=crawls --limit=10

The example defaults to --scope=host, so https://www.example.com imports only captures from www.example.com. Use --scope=domain when you explicitly want sibling subdomains too, for example when a large site stores useful content outside www.

The sample command writes progress to stderr while it imports. TTY output uses a single updating line; redirected output uses plain log lines. Pass --silent to suppress progress output. Columnar scans emit periodic heartbeat updates while waiting for DataFusion batches. WARC reads run concurrently by default and report queued records, downloaded bytes, and request counts; use --scan-partitions to tune DataFusion scan parallelism and --concurrency, --warc-retries, and --warc-retry-delay to tune object reads.

The importer uses Common Crawl's columnar index by default through DataFusion. Basic imports generate a SQL query that selects one latest HTML capture per URL from the requested host or domain, excluding robots.txt. Advanced callers can pass columnar_sql= to control the URL selection directly; the query must return url plus WARC pointer columns (filename/offset/length or the Common Crawl names warc_filename/warc_record_offset/warc_record_length). The example CLI exposes this as --sql.
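As a sketch of the advanced path, the snippet below passes a custom query via columnar_sql=. The documented contract is only that the query return url plus the WARC pointer columns (filename/offset/length, or the Common Crawl names warc_filename/warc_record_offset/warc_record_length); the table name ccindex and the predicate columns used here are assumptions about the Common Crawl columnar schema, not part of the SDK's API:

```python
# Hedged sketch: select only PDF captures instead of the default
# "latest HTML capture per URL" query. Column names in the WHERE clause
# (url_host_registered_domain, content_mime_detected) are assumptions.
custom_sql = """
SELECT url,
       warc_filename,
       warc_record_offset,
       warc_record_length
FROM ccindex
WHERE url_host_registered_domain = 'example.com'
  AND content_mime_detected = 'application/pdf'
"""

async def import_pdfs(room):
    # Same call shape as a basic import, but with columnar_sql= supplying
    # the URL selection directly. "room" is a connected room client.
    from meshagent.commoncrawl import import_domain_from_commoncrawl
    return await import_domain_from_commoncrawl(
        room,
        index="CC-MAIN-2025-08",
        domain="example.com",
        table="pdfs",
        columnar_sql=custom_sql,
    )
```

The example CLI exposes the same hook as --sql, so the query can be iterated on from the command line before being wired into library code.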

Common Crawl's CDX API is rate limited and not a good fit for broad filtering. The SDK still contains the polite CDX reader for compatibility, using https://index.commoncrawl.org, a Meshagent User-Agent, serialized/paced requests, and clearer HTTP 503 guidance.

By default, records are merged on url with the columns url, date, content_type, and text. Pass an async extract= callback to derive custom columns from the WARC record and decoded content bytes. Return None from the callback to skip the record. Pass an async progress= callback to observe import progress from library code.
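A minimal sketch of the callback contract described above: extract= receives the WARC record and the decoded content bytes and returns a dict of columns, or None to skip; progress= observes import progress. The record attribute (record.url) and the shape of the progress payload are assumptions for illustration:

```python
# Hedged sketch of custom extraction. Only keeps pages mentioning a term
# and derives a custom "mentions" column; returns None to skip the rest.
async def extract(record, content: bytes):
    text = content.decode("utf-8", errors="replace")
    if "meshagent" not in text.lower():
        return None  # skip records that don't mention the term
    return {
        "url": record.url,  # assumed attribute on the WARC record object
        "mentions": text.lower().count("meshagent"),
    }

async def progress(update):
    # Payload shape is an assumption; log whatever the importer reports.
    print(update)
```

Both callbacks would be passed alongside the other keyword arguments of import_domain_from_commoncrawl (extract=extract, progress=progress).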


