Common Crawl import support for Meshagent datasets

Project description

Meshagent Common Crawl

Import Common Crawl captures into a Meshagent room dataset.

from meshagent.commoncrawl import import_domain_from_commoncrawl

# room is an already-connected Meshagent room client
result = await import_domain_from_commoncrawl(
    room,
    index="CC-MAIN-2025-08",  # Common Crawl crawl to read from
    domain="example.com",
    table="pages",  # destination table in the room dataset
    url_filter=r"https?://(www\.)?example\.com/docs/.*",
)

To test it through meshagent room connect:

meshagent room connect --room=my-room --identity=commoncrawl -- \
  python meshagent-sdk/meshagent-commoncrawl/examples/crawl.py \
  http://www.meshagent.com --table=sample --namespace=crawls --limit=10

The example defaults to --scope=host, so https://www.example.com imports only captures from www.example.com. Use --scope=domain when you explicitly want sibling subdomains too, for example when a large site stores useful content outside www.
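The host/domain distinction can be sketched as a small matcher. This is an illustrative approximation, not the SDK's actual filter: the `capture_matches` helper is hypothetical, and its "domain" branch uses a naive two-label suffix that mishandles multi-label public suffixes such as co.uk.

```python
from urllib.parse import urlparse

def capture_matches(capture_url: str, seed_url: str, scope: str = "host") -> bool:
    # Illustrative sketch only -- not the SDK's actual filter.
    seed_host = urlparse(seed_url).hostname or ""
    cap_host = urlparse(capture_url).hostname or ""
    if scope == "host":
        return cap_host == seed_host  # exact host only (the default)
    # scope == "domain": accept the registrable domain and its subdomains
    # (naive two-label suffix; wrong for public suffixes like co.uk)
    seed_domain = ".".join(seed_host.split(".")[-2:])
    return cap_host == seed_domain or cap_host.endswith("." + seed_domain)
```

With a seed of https://www.example.com, host scope accepts only www.example.com captures, while domain scope also accepts docs.example.com and the bare apex.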

The sample command writes progress to stderr while it imports:

  • TTY output uses a single updating line; redirected output uses plain log lines. Pass --silent to suppress progress output.
  • Columnar scans emit periodic heartbeat updates while waiting for DataFusion batches.
  • WARC reads run concurrently by default and report queued records, downloaded bytes, and request counts.
  • Use --scan-partitions to tune DataFusion scan parallelism, and --concurrency, --warc-retries, and --warc-retry-delay to tune object reads.

The importer uses Common Crawl's columnar index by default through DataFusion. Basic imports generate a SQL query that selects one latest HTML capture per URL from the requested host or domain, excluding robots.txt. Advanced callers can pass columnar_sql= to control the URL selection directly; the query must return url plus WARC pointer columns (filename/offset/length or the Common Crawl names warc_filename/warc_record_offset/warc_record_length). The example CLI exposes this as --sql.
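A custom selection query under that column contract might look like the following sketch. The FROM-clause table name (`ccindex` here) is an assumption about what the importer registers with DataFusion; the WHERE columns follow Common Crawl's published columnar schema.

```python
# Sketch of a columnar_sql= query satisfying the documented contract:
# it must return url plus WARC pointer columns.
columnar_sql = """
SELECT url,
       warc_filename,
       warc_record_offset,
       warc_record_length
FROM ccindex
WHERE url_host_registered_domain = 'example.com'
  AND content_mime_detected = 'text/html'
  AND fetch_status = 200
"""

# Sanity-check that the required columns appear in the query text.
required = {"url", "warc_filename", "warc_record_offset", "warc_record_length"}
assert required.issubset(set(columnar_sql.replace(",", " ").split()))
```

A query like this would be passed as `columnar_sql=` to the importer, or via --sql in the example CLI.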

Common Crawl's CDX API is rate limited and not a good fit for broad filtering. For compatibility, the SDK still includes a polite CDX reader that uses https://index.commoncrawl.org, a Meshagent User-Agent, serialized and paced requests, and clearer guidance when the API returns HTTP 503.

By default, records are merged on url with the columns url, date, content_type, and text. Pass an async extract= callback to derive custom columns from the WARC record and decoded content bytes. Return None from the callback to skip the record. Pass an async progress= callback to observe import progress from library code.

Project details


Download files

Source Distribution

meshagent_commoncrawl-0.39.7.tar.gz (22.6 kB)

Uploaded Source

Built Distribution

meshagent_commoncrawl-0.39.7-py3-none-any.whl (17.0 kB)

Uploaded Python 3

File details

Details for the file meshagent_commoncrawl-0.39.7.tar.gz.

File metadata

  • Download URL: meshagent_commoncrawl-0.39.7.tar.gz
  • Upload date:
  • Size: 22.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for meshagent_commoncrawl-0.39.7.tar.gz:

  • SHA256: f74729b59b8a766e031d037f02a2badf7df5b841cd3c8cfa042c1b4ed167d397
  • MD5: 4db97f2a036e676844c281dd69d2e735
  • BLAKE2b-256: d8cc8ee39b00511f3f4f9b0155654fd29736f317568e31c67c25d1006bc35a73

File details

Details for the file meshagent_commoncrawl-0.39.7-py3-none-any.whl.

File metadata

File hashes

Hashes for meshagent_commoncrawl-0.39.7-py3-none-any.whl:

  • SHA256: 1065325dcd4a961ed9c90afc06061b884f9d0aa44236253a9d8a8a1350864461
  • MD5: 07785593ba0177f147310747fae11f0a
  • BLAKE2b-256: 54816b365fe6685b52db1247375288aee4eb1a26f6672608d5bbc1c499328d21
