
Common Crawl import support for Meshagent datasets

Project description

Meshagent Common Crawl

Import Common Crawl captures into a Meshagent room dataset.

from meshagent.commoncrawl import import_domain_from_commoncrawl

# Run inside an async context with a connected Meshagent room.
result = await import_domain_from_commoncrawl(
    room,
    index="CC-MAIN-2025-08",
    domain="example.com",
    table="pages",
    url_filter=r"https?://(www\.)?example\.com/docs/.*",
)

To test it end to end through meshagent room connect:

meshagent room connect --room=my-room --identity=commoncrawl -- \
  python meshagent-sdk/meshagent-commoncrawl/examples/crawl.py \
  http://www.meshagent.com --table=sample --namespace=crawls --limit=10

The example defaults to --scope=host, so https://www.example.com imports only captures from www.example.com. Use --scope=domain when you explicitly want sibling subdomains too, for example when a large site stores useful content outside www.
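The host/domain distinction can be sketched with a small hostname check. This is an illustration only, not the importer's actual matching code, and the naive registered-domain split below mishandles multi-label suffixes such as co.uk:

```python
from urllib.parse import urlsplit

def in_scope(capture_url: str, seed_host: str, scope: str) -> bool:
    """Illustrative scoping check, not the importer's real logic."""
    host = (urlsplit(capture_url).hostname or "").lower()
    if scope == "host":
        # --scope=host: only captures from the exact seed hostname
        return host == seed_host
    # --scope=domain: the registered domain plus any subdomain.
    # Naively take the last two labels; production code should consult
    # the public suffix list to handle domains like example.co.uk.
    root = ".".join(seed_host.lower().split(".")[-2:])
    return host == root or host.endswith("." + root)

in_scope("https://www.example.com/docs/a", "www.example.com", "host")   # True
in_scope("https://blog.example.com/post", "www.example.com", "host")    # False
in_scope("https://blog.example.com/post", "www.example.com", "domain")  # True
```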

The sample command writes progress to stderr while it imports: TTY output uses a single updating line, while redirected output uses plain log lines. Pass --silent to suppress progress output. Columnar scans emit periodic heartbeat updates while waiting for DataFusion batches. WARC reads run concurrently by default and report queued records, downloaded bytes, and request counts; use --scan-partitions to tune DataFusion scan parallelism, and --concurrency, --warc-retries, and --warc-retry-delay to tune object reads.

The importer uses Common Crawl's columnar index by default, queried through DataFusion. Basic imports generate a SQL query that selects the latest HTML capture per URL from the requested host or domain, excluding robots.txt. Advanced callers can pass columnar_sql= to control URL selection directly; the query must return url plus the WARC pointer columns (filename/offset/length, or the Common Crawl names warc_filename/warc_record_offset/warc_record_length). The example CLI exposes this as --sql.
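For illustration, a custom query might look like the sketch below. The ccindex table name and the extra predicate columns (url_host_registered_domain, fetch_status) follow the Common Crawl columnar index schema and are assumptions here, not part of the Meshagent API; only the required output columns come from the description above:

```python
# Hypothetical value for columnar_sql=; adapt the WHERE clause to your needs.
columnar_sql = """
    SELECT url,
           warc_filename,
           warc_record_offset,
           warc_record_length
    FROM ccindex
    WHERE url_host_registered_domain = 'example.com'
      AND fetch_status = 200
"""

# The importer requires url plus the WARC pointer columns.
required = {"url", "warc_filename", "warc_record_offset", "warc_record_length"}
assert all(name in columnar_sql for name in required)
```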

Common Crawl's CDX API is rate limited and not a good fit for broad filtering. The SDK still ships a polite CDX reader for compatibility: it uses https://index.commoncrawl.org, sends a Meshagent User-Agent, serializes and paces its requests, and surfaces clearer guidance on HTTP 503 responses.

By default, records are merged on url with the columns url, date, content_type, and text. Pass an async extract= callback to derive custom columns from the WARC record and decoded content bytes. Return None from the callback to skip the record. Pass an async progress= callback to observe import progress from library code.
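The two callbacks can be sketched as below. The parameter shapes are assumptions inferred from the description above (extract receives the WARC record and the decoded content bytes and returns a dict of columns or None; the progress signature is entirely hypothetical), not taken from the Meshagent API reference:

```python
import asyncio

async def extract(record, content: bytes):
    """Hypothetical extract callback: derive custom columns or skip."""
    text = content.decode("utf-8", errors="replace")
    if "<title>" not in text.lower():
        return None  # returning None skips the record
    title = text.split("<title>", 1)[1].split("</title>", 1)[0]
    return {"title": title.strip(), "bytes": len(content)}

async def progress(done: int, total: int):
    """Hypothetical progress callback: observe import progress."""
    print(f"imported {done}/{total} records")

row = asyncio.run(extract(None, b"<html><title> Docs </title></html>"))
# row == {"title": "Docs", "bytes": 34}
```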

Project details


Download files

Download the file for your platform.

Source Distribution

meshagent_commoncrawl-0.39.9.tar.gz (22.6 kB)

Uploaded Source

Built Distribution


meshagent_commoncrawl-0.39.9-py3-none-any.whl (17.0 kB)

Uploaded Python 3

File details

Details for the file meshagent_commoncrawl-0.39.9.tar.gz.

File metadata

  • Download URL: meshagent_commoncrawl-0.39.9.tar.gz
  • Upload date:
  • Size: 22.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for meshagent_commoncrawl-0.39.9.tar.gz

  • SHA256: afdfc9d16ccda06bb71b3420d63c4213ab1166836e7ae03658ebf31372bb5c93
  • MD5: cbd98c87470b5f36b212233b98e8ccd5
  • BLAKE2b-256: fca882d5a8e9e2e5ebfa99793a217c926771ed86b420b078cf94c6cc88ea3833


File details

Details for the file meshagent_commoncrawl-0.39.9-py3-none-any.whl.

File hashes

Hashes for meshagent_commoncrawl-0.39.9-py3-none-any.whl

  • SHA256: 7002af5d1a829438f7710f0c09461c88b9f9ececd28e236f3cbfe206d0937ba4
  • MD5: 31de132d03eb7f2f534a6edf187bdda5
  • BLAKE2b-256: d6313e740f48b400fc27f69052700315645a50d75e1f85e217092767f194bf6a

