
ScrapeGraph Python SDK for the ScrapeGraph API

Project description

🌐 ScrapeGraph Python SDK


Official Python SDK for the ScrapeGraph API - Smart web scraping powered by AI.

📦 Installation

pip install scrapegraph-py

🚀 Features

  • 🤖 AI-powered web scraping and search
  • 🔄 Both sync and async clients
  • 📊 Structured output with Pydantic schemas
  • 🔍 Detailed logging
  • ⚡ Automatic retries
  • 🔐 Secure authentication

🎯 Quick Start

from scrapegraph_py import Client

client = Client(api_key="your-api-key-here")

Note: You can set the SGAI_API_KEY environment variable and initialize the client without parameters: client = Client()
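
For example, a minimal sketch of initializing from the environment (the variable is set from Python here purely for illustration; in practice you would export it in your shell or a .env file):

import os
from scrapegraph_py import Client

# Illustration only: SGAI_API_KEY would normally be exported in your shell or .env file
os.environ["SGAI_API_KEY"] = "your-api-key-here"

client = Client()  # reads the API key from the SGAI_API_KEY environment variable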

📚 Available Endpoints

🤖 SmartScraper

Extract structured data from any webpage or HTML content using AI.

from scrapegraph_py import Client

client = Client(api_key="your-api-key-here")

# Using a URL
response = client.smartscraper(
    website_url="https://example.com",
    user_prompt="Extract the main heading and description"
)

# Or using HTML content
html_content = """
<html>
    <body>
        <h1>Company Name</h1>
        <p>We are a technology company focused on AI solutions.</p>
    </body>
</html>
"""

response = client.smartscraper(
    website_html=html_content,
    user_prompt="Extract the company description"
)

print(response)

Output Schema (Optional)

from pydantic import BaseModel, Field
from scrapegraph_py import Client

client = Client(api_key="your-api-key-here")

class WebsiteData(BaseModel):
    title: str = Field(description="The page title")
    description: str = Field(description="The meta description")

response = client.smartscraper(
    website_url="https://example.com",
    user_prompt="Extract the title and description",
    output_schema=WebsiteData
)
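
To work with validated data instead of the raw response, you can load the result back into the schema. This is a sketch that assumes the extracted fields come back as a dict under the response's "result" key (an assumption; adjust it to the actual response shape):

# Assumption: response is a dict whose "result" entry matches WebsiteData
data = WebsiteData(**response["result"])
print(data.title)
print(data.description)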

🔍 SearchScraper

Perform AI-powered web searches with structured results and reference URLs.

from scrapegraph_py import Client

client = Client(api_key="your-api-key-here")

response = client.searchscraper(
    user_prompt="What is the latest version of Python and its main features?"
)

print(f"Answer: {response['result']}")
print(f"Sources: {response['reference_urls']}")

Output Schema (Optional)

from pydantic import BaseModel, Field
from scrapegraph_py import Client

client = Client(api_key="your-api-key-here")

class PythonVersionInfo(BaseModel):
    version: str = Field(description="The latest Python version number")
    release_date: str = Field(description="When this version was released")
    major_features: list[str] = Field(description="List of main features")

response = client.searchscraper(
    user_prompt="What is the latest version of Python and its main features?",
    output_schema=PythonVersionInfo
)

📝 Markdownify

Convert any webpage into clean, formatted markdown.

from scrapegraph_py import Client

client = Client(api_key="your-api-key-here")

response = client.markdownify(
    website_url="https://example.com"
)

print(response)
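
To keep the converted page, write the markdown to a file. A minimal sketch, assuming the markdown text is exposed under a "result" key like the other endpoints' responses (an assumption; adapt to the actual response shape):

# Assumption: the markdown string lives under response["result"]
markdown_text = response["result"]
with open("example.md", "w", encoding="utf-8") as f:
    f.write(markdown_text)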

⚡ Async Support

All endpoints support async operations:

import asyncio
from scrapegraph_py import AsyncClient

async def main():
    async with AsyncClient() as client:
        response = await client.smartscraper(
            website_url="https://example.com",
            user_prompt="Extract the main content"
        )
        print(response)

asyncio.run(main())
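
Since the client is asynchronous, several requests can also run concurrently with asyncio.gather. A sketch using the same smartscraper call (the URLs are placeholders):

import asyncio
from scrapegraph_py import AsyncClient

async def scrape_many(urls):
    async with AsyncClient() as client:
        # Start one smartscraper request per URL and await them together
        tasks = [
            client.smartscraper(
                website_url=url,
                user_prompt="Extract the main heading and description",
            )
            for url in urls
        ]
        return await asyncio.gather(*tasks)

results = asyncio.run(scrape_many(["https://example.com", "https://example.org"]))
for result in results:
    print(result)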

📖 Documentation

For detailed documentation, visit docs.scrapegraphai.com

🛠️ Development

For information about setting up the development environment and contributing to the project, see our Contributing Guide.

💬 Support & Feedback

  • 📧 Email: support@scrapegraphai.com
  • 💻 GitHub Issues: Create an issue
  • 🌟 Feature Requests: Request a feature
  • ⭐ API Feedback: You can also submit feedback programmatically using the feedback endpoint:
    from scrapegraph_py import Client
    
    client = Client(api_key="your-api-key-here")
    
    client.submit_feedback(
        request_id="your-request-id",
        rating=5,
        feedback_text="Great results!"
    )
    

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.


Made with ❤️ by ScrapeGraph AI


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

scrapegraph_py-1.9.0b7.tar.gz (112.6 kB)

Uploaded Source

Built Distribution

scrapegraph_py-1.9.0b7-py3-none-any.whl (15.5 kB)

Uploaded Python 3

File details

Details for the file scrapegraph_py-1.9.0b7.tar.gz.

File metadata

  • Download URL: scrapegraph_py-1.9.0b7.tar.gz
  • Upload date:
  • Size: 112.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.3

File hashes

Hashes for scrapegraph_py-1.9.0b7.tar.gz

  • SHA256: a50cb277a601370925711e25721982f9f83aec2bd484539d6444ad74b664e66a
  • MD5: 610abb3e348be009d47776058aefc568
  • BLAKE2b-256: 9358141d9362b1a235bf25801a441bacf1dab6a19d1953a5b7b8a7eb94aa93f9

See more details on using hashes here.
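
For example, a downloaded archive can be checked against the SHA256 digest above with Python's standard hashlib (the local filename is assumed to match the published artifact):

import hashlib

EXPECTED_SHA256 = "a50cb277a601370925711e25721982f9f83aec2bd484539d6444ad74b664e66a"

# Hash the downloaded source distribution and compare it to the digest published above
with open("scrapegraph_py-1.9.0b7.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED_SHA256 else "MISMATCH")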

File details

Details for the file scrapegraph_py-1.9.0b7-py3-none-any.whl.

File metadata

File hashes

Hashes for scrapegraph_py-1.9.0b7-py3-none-any.whl

  • SHA256: 9b7c416d29ebc5079a978dd5fcde41c77f61190cf535f1b848b7c4836b18b3e8
  • MD5: 8f0feeef6c1e2f6dc600eda11f89f321
  • BLAKE2b-256: a76a16bb07a079f70c00b79879cec354471ed7e32b7424f433b30b764da363a3

See more details on using hashes here.
