ScrapeGraph Python SDK for API

Project description

🌐 ScrapeGraph Python SDK

Official Python SDK for the ScrapeGraph API - Smart web scraping powered by AI.

📦 Installation

pip install scrapegraph-py

🚀 Features

  • 🤖 AI-powered web scraping
  • 🔄 Both sync and async clients
  • 📊 Structured output with Pydantic schemas
  • 🔍 Detailed logging (see the sketch after this list)
  • ⚡ Automatic retries
  • 🔐 Secure authentication
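
For the logging feature, a minimal sketch, assuming the SDK emits its logs through Python's standard logging module (this configuration is an illustration, not part of the SDK's own API):

import logging

from scrapegraph_py import Client

# Assumption: the SDK logs via the standard logging module, so raising the
# log level to DEBUG surfaces its detailed request/response logs.
logging.basicConfig(level=logging.DEBUG)

client = Client(api_key="your-api-key-here")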

🎯 Quick Start

from scrapegraph_py import Client

client = Client(api_key="your-api-key-here")

Note: You can set the SGAI_API_KEY environment variable and initialize the client without parameters: client = Client()
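
For example, a minimal sketch of environment-based initialization (setting the variable from Python here only for illustration; in practice you would usually export it in your shell):

import os

from scrapegraph_py import Client

# Assumption: the client reads SGAI_API_KEY from the environment when no
# api_key argument is passed.
os.environ["SGAI_API_KEY"] = "your-api-key-here"

client = Client()  # no api_key argument needed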

📚 Available Endpoints

🔍 SmartScraper

Scrapes any webpage using AI to extract specific information.

from scrapegraph_py import Client

client = Client(api_key="your-api-key-here")

# Basic usage
response = client.smartscraper(
    website_url="https://example.com",
    user_prompt="Extract the main heading and description"
)

print(response)

Output Schema (Optional)

from pydantic import BaseModel, Field
from scrapegraph_py import Client

client = Client(api_key="your-api-key-here")

class WebsiteData(BaseModel):
    title: str = Field(description="The page title")
    description: str = Field(description="The meta description")

response = client.smartscraper(
    website_url="https://example.com",
    user_prompt="Extract the title and description",
    output_schema=WebsiteData
)

📝 Markdownify

Converts any webpage into clean, formatted markdown.

from scrapegraph_py import Client

client = Client(api_key="your-api-key-here")

response = client.markdownify(
    website_url="https://example.com"
)

print(response)

💻 LocalScraper

Extracts information from HTML content using AI.

from scrapegraph_py import Client

client = Client(api_key="your-api-key-here")

html_content = """
<html>
    <body>
        <h1>Company Name</h1>
        <p>We are a technology company focused on AI solutions.</p>
        <div class="contact">
            <p>Email: contact@example.com</p>
        </div>
    </body>
</html>
"""

response = client.localscraper(
    user_prompt="Extract the company description",
    website_html=html_content
)

print(response)

⚡ Async Support

All endpoints support async operations:

import asyncio
from scrapegraph_py import AsyncClient

async def main():
    async with AsyncClient() as client:
        response = await client.smartscraper(
            website_url="https://example.com",
            user_prompt="Extract the main content"
        )
        print(response)

asyncio.run(main())
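
Because the async client can issue requests concurrently, several pages can be scraped within one event loop. A minimal sketch, assuming AsyncClient exposes the same smartscraper signature as the sync client (the URLs and prompt below are placeholders):

import asyncio
from scrapegraph_py import AsyncClient

async def main():
    urls = [
        "https://example.com",
        "https://example.org",
    ]
    async with AsyncClient() as client:
        # Launch one smartscraper request per URL and await them together.
        tasks = [
            client.smartscraper(
                website_url=url,
                user_prompt="Extract the main heading",
            )
            for url in urls
        ]
        results = await asyncio.gather(*tasks)
        for url, result in zip(urls, results):
            print(url, result)

asyncio.run(main())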

📖 Documentation

For detailed documentation, visit docs.scrapegraphai.com

🛠️ Development

For information about setting up the development environment and contributing to the project, see our Contributing Guide.

💬 Support & Feedback

  • 📧 Email: support@scrapegraphai.com
  • 💻 GitHub Issues: Create an issue
  • 🌟 Feature Requests: Request a feature
  • ⭐ API Feedback: You can also submit feedback programmatically using the feedback endpoint:
    from scrapegraph_py import Client
    
    client = Client(api_key="your-api-key-here")
    
    client.submit_feedback(
        request_id="your-request-id",
        rating=5,
        feedback_text="Great results!"
    )
    

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

Made with ❤️ by ScrapeGraph AI

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

scrapegraph_py-1.10.0.tar.gz (110.3 kB)

Uploaded Source

Built Distribution

scrapegraph_py-1.10.0-py3-none-any.whl (14.5 kB)

Uploaded Python 3

File details

Details for the file scrapegraph_py-1.10.0.tar.gz.

File metadata

  • Download URL: scrapegraph_py-1.10.0.tar.gz
  • Upload date:
  • Size: 110.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.3

File hashes

Hashes for scrapegraph_py-1.10.0.tar.gz:

  • SHA256: 7ceb6ab0e8518caab6e6f74ab9f17e7e825a9d83b4af63f52a9d59e776bcfbb7
  • MD5: 2a09cb1e0785552ff0990ba2351ed06b
  • BLAKE2b-256: 00f9fe860c708abf9df3acccbaa223fb0aa99f0c01eaadc52c0edeaf98f905bb

See more details on using hashes here.

File details

Details for the file scrapegraph_py-1.10.0-py3-none-any.whl.

File metadata

File hashes

Hashes for scrapegraph_py-1.10.0-py3-none-any.whl:

  • SHA256: 92b9dfcd2516c76507097d23ae64f594b8ab43d80a0ff2043f43d896b95c19ff
  • MD5: be72a35523d692d0cc63fa8b421b08d6
  • BLAKE2b-256: b3c12525fc1a736cba848c8863cfa569d34ebe3e22d6dadb3d18a4b3b02b72cf

See more details on using hashes here.
