Official Python client for Amazon Scraper API (https://amazonscraperapi.com)

amazonscraperapi-sdk


Official Python SDK for Amazon Scraper API. Pay only for successful (2xx) responses — non-2xx never cost you anything. Pricing starts at $0.90 per 1,000 requests on pay-as-you-go, drops to a flat $0.50 per 1,000 requests on Custom plans. 1,000 free requests on signup, no card. Drop into any Python project to fetch structured Amazon product data, run keyword searches, or queue async batches with webhook callbacks.

Benchmark (live production, 2026-04)

Measured on our own infrastructure against a 30-query mixed international set:

| Metric | Value |
|---|---|
| Median latency (product, US) | ~2.6 s |
| P95 latency | ~6 s |
| P99 latency | ~10.5 s |
| Price / 1,000 Amazon products | $0.50 flat |
| Concurrent threads (entry paid plan) | 50 |
| Marketplaces supported | 20+ |
| Billing unit | per successful (2xx) response |

Install

pip install amazonscraperapi-sdk

Requires Python >= 3.9. Built on httpx.

Quick start - single product

from amazonscraperapi import AmazonScraperAPI

asa = AmazonScraperAPI(api_key="asa_live_...")

product = asa.product(query="B09HN3Q81F", domain="com")

print(product["title"])
# "Apple AirPods Pro (2nd Generation)..."
print(product["price"]["current"])
# 199.00
print(product["rating"]["average"], product["rating"]["count"])
# 4.7 58214

Example output (trimmed)

{
    "asin": "B09HN3Q81F",
    "title": "Apple AirPods Pro (2nd Generation)...",
    "brand": "Apple",
    "price": {"current": 199.00, "currency": "USD", "was": 249.00, "savings_pct": 20},
    "rating": {"average": 4.7, "count": 58214, "distribution": {"5": 0.81, "4": 0.12}},
    "availability": "In Stock",
    "buybox": {"seller": "Amazon.com", "ships_from": "Amazon.com", "prime": true},
    "images": ["https://m.media-amazon.com/images/I/...jpg"],
    "bullets": ["Active Noise Cancellation...", "Adaptive Audio..."],
    "variants": [{"asin": "B0BDHB9Y8H", "name": "USB-C", "price": 249.00}],
    "categories": ["Electronics", "Headphones", "Earbud Headphones"],
    "_meta": {"tier": "direct", "duration_ms": 2634, "marketplace": "amazon.com"}
}
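The savings_pct field can be cross-checked against was and current. A minimal sketch (the API already returns the rounded value, so this is only a sanity check):

```python
def savings_pct(was: float, current: float) -> int:
    """Percent saved off the list price, rounded to the nearest integer."""
    return round((was - current) / was * 100)

print(savings_pct(249.00, 199.00))
# 20, matching the payload above
```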

Keyword search

results = asa.search(
    query="wireless headphones",
    domain="co.uk",
    sort_by="avg_customer_review",
    pages=1,
)

for r in results["results"]:
    print(r["position"], r["asin"], r["title"], r["price"].get("current"))

Async batch (up to 1,000 ASINs with webhook callback)

batch = asa.create_batch(
    endpoint="amazon.product",
    items=[
        {"query": "B09HN3Q81F", "domain": "com"},
        {"query": "B000ALVUM6", "domain": "de", "language": "de_DE"},
        # ... up to 1,000
    ],
    webhook_url="https://your.server/webhooks/asa",
)

print("batch id:", batch["id"])
# SAVE THIS. Webhook signing secret is returned only once:
print("webhook secret:", batch["webhook_signature_secret"])

# Alternative: poll
status = asa.get_batch(batch["id"])
print(f"{status['processed_count']}/{status['total_count']}")
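If you poll instead of using webhooks, back off between calls. The schedule below is a sketch; the 2 s base and 30 s cap are illustrative defaults, not an API requirement:

```python
import itertools

def poll_delays(base: float = 2.0, cap: float = 30.0):
    """Yield capped exponential delays: 2, 4, 8, 16, 30, 30, ..."""
    for attempt in itertools.count():
        yield min(base * 2 ** attempt, cap)

# Hypothetical polling loop using the batch objects from the example above:
# import time
# for delay in poll_delays():
#     status = asa.get_batch(batch["id"])
#     if status["processed_count"] >= status["total_count"]:
#         break
#     time.sleep(delay)
```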

Verifying webhook signatures

from amazonscraperapi import verify_webhook_signature
from fastapi import FastAPI, Request, HTTPException
import json, os

app = FastAPI()

@app.post("/webhooks/asa")
async def asa_webhook(request: Request):
    raw = await request.body()
    signature = request.headers.get("X-ASA-Signature")
    # Reject both missing and invalid signatures
    if not signature or not verify_webhook_signature(signature, raw, secret=os.environ["WEBHOOK_SECRET"]):
        raise HTTPException(401, "invalid signature")
    payload = json.loads(raw)
    # process payload["results"]
    return {"ok": True}
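If the SDK is not available on your webhook host, the check can be reproduced by hand. This assumes (unverified) that X-ASA-Signature is a hex-encoded HMAC-SHA256 of the raw request body; confirm the exact scheme against the official docs before relying on it:

```python
import hashlib
import hmac

def manual_verify(signature: str, raw_body: bytes, secret: str) -> bool:
    """Constant-time check of a hex HMAC-SHA256 signature over the raw body."""
    expected = hmac.new(secret.encode(), raw_body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature or "")
```

hmac.compare_digest avoids leaking the match position through timing, which is why it is preferred over == for signature checks.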

What the API solves for you

Building a production-grade Amazon scraper in-house is a 2-4 week engineering project plus permanent maintenance. This SDK wraps Amazon Scraper API, which has already solved:

| Pain point | What we handle |
|---|---|
| Amazon CAPTCHAs / robot pages | Auto-detected, retried through a heavier proxy tier (datacenter, residential, premium) |
| Brittle CSS selectors | Extractors update as Amazon changes layouts; your code stays the same |
| 20+ marketplaces | amazon.de, .co.uk, .co.jp, .com.br, and more, each with marketplace-specific parsing quirks |
| Country-matched residential IPs | amazon.de auto-routes through German IPs; pass country="DE" to override |
| Rotating proxies + anti-fingerprinting | TLS fingerprints, headers, cookies handled server-side |
| Rate-limit retries | Transparent exponential backoff |
| Structured JSON output | Title, price, rating, reviews, variants, seller, images; parsed and typed |
| Batch/async jobs | Up to 1,000 ASINs per submission, webhook-delivered on completion |

Time saved: a greenfield Python Amazon scraper built to this spec takes roughly 80 engineer-hours (including anti-bot handling and marketplace variants); integrating this SDK takes about 10 minutes.

Error handling

All failures follow a stable shape so you can match on code:

from amazonscraperapi import AmazonScraperAPI, AsaError
import time

asa = AmazonScraperAPI(api_key="asa_live_...")

try:
    product = asa.product(query="INVALID_ASIN", domain="com")
except AsaError as e:
    if e.code == "INSUFFICIENT_CREDITS":
        pass  # top up
    elif e.code == "RATE_LIMITED":
        time.sleep(e.retry_after)
    elif e.code in ("target_unreachable", "amazon-robot-or-human"):
        # non-2xx, you were not charged. safe to retry.
        pass
    else:
        raise

| HTTP | Code | When you see it | Recommended action |
|---|---|---|---|
| 400 | INVALID_PARAMS | Missing query, unsupported domain, bad sort_by | Fix request; don't retry |
| 401 | INVALID_API_KEY | Missing, malformed, revoked | Verify ASA_API_KEY; rotate if leaked |
| 402 | INSUFFICIENT_CREDITS | Balance empty | Top up; renews each billing cycle |
| 429 | RATE_LIMITED | Over 120 req/60s per user | Honor Retry-After; retry |
| 429 | CONCURRENCY_LIMIT | Over plan's parallel cap | Reduce parallelism or upgrade; X-Concurrency-* headers guide backoff |
| 502 | target_unreachable | Amazon down / all proxy tiers blocked | Retry after 30 s (already retried through 3 tiers on our side) |
| 502 | amazon-robot-or-human | Amazon challenge not resolvable | Retry; often transient; not charged |
| 502 | extraction_failed | Layout we can't parse | Report with X-Request-Id; not charged |
| 503 | SERVICE_OVERLOADED | Global circuit breaker | Honor Retry-After: 60; rare |
| 500 | INTERNAL_ERROR | Our bug | Report with X-Request-Id |
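The recommended actions above collapse into a small retry predicate. The set of retryable codes follows the table; the attempt cutoff is an illustrative default, not SDK behavior:

```python
# Codes the table marks as safe to retry; everything else should fail fast.
RETRYABLE = {
    "RATE_LIMITED",
    "SERVICE_OVERLOADED",
    "target_unreachable",
    "amazon-robot-or-human",
}

def should_retry(code: str, attempt: int, max_attempts: int = 4) -> bool:
    """True if the error code is transient and attempts remain."""
    return code in RETRYABLE and attempt < max_attempts
```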

Flat-credit promise: non-2xx responses are free. A basic request costs 5 credits, which works out to $0.90 per 1,000 basic requests on pay-as-you-go. Planned JS-rendered calls will cost 15 credits. Every response carries an X-Request-Id header for traceability; quote it in any support ticket.

Get an API key

Sign up at app.amazonscraperapi.com: 1,000 free requests on signup, no credit card required.

License

MIT
