
An SEO tool that analyzes the structure of a site, crawls the site, counts words in the body of each page, and warns of any technical SEO issues.

Project description

Python SEO and GEO Analyzer


A modern SEO and GEO (Generative Engine Optimization, better described as AI Search Optimization) analysis tool that combines technical optimization with authentic human value. Beyond traditional site crawling and structure analysis, it uses AI to evaluate a site's expertise signals, conversational engagement, and cross-platform presence. It helps you maintain strong technical foundations while ensuring your site demonstrates genuine authority and value to real users.

The AI features were heavily influenced by the clickbait-titled SEL article A 13-point roadmap for thriving in the age of AI search.

Note About Python

I've written quite a bit about the speed of Python and how there are very specific use cases where it isn't the best choice. I feel like crawling websites is definitely one of those cases. I wrote this tool in Python around 2010 to solve a very specific need: crawling some small HTML-only websites for startups I was working at. I'm excited to see how much it has grown and how many people are using it. I feel like Python SEO Analyzer is acceptable for most smaller use cases, but if you are looking for something faster, I've built a much faster and more comprehensive tool, Black SEO Analyzer.

Installation

PIP

pip install pyseoanalyzer

Docker

Using the Pre-built Image from Docker Hub

The easiest way to use the Docker image is to pull it directly from Docker Hub.

# Pull the latest image
docker pull sethblack/python-seo-analyzer:latest

# Run the analyzer (replace example.com with the target URL)
# The --rm flag automatically removes the container when it exits
docker run --rm sethblack/python-seo-analyzer http://example.com/

# Run with specific arguments (e.g., sitemap and HTML output)
# Note: If the sitemap is local, you'll need to mount it (see mounting example below)
docker run --rm sethblack/python-seo-analyzer http://example.com/ --sitemap /path/inside/container/sitemap.xml --output-format html

# Run with AI analysis (requires ANTHROPIC_API_KEY)
# Replace "your_api_key_here" with your actual Anthropic API key
docker run --rm -e ANTHROPIC_API_KEY="your_api_key_here" sethblack/python-seo-analyzer http://example.com/ --run-llm-analysis

# Save HTML output to your local machine
# This mounts the current directory (.) into /app/output inside the container.
# The output file 'results.html' will be saved in your current directory.
# The tool outputs JSON by default to stdout, so we redirect it for HTML.
# Since the ENTRYPOINT handles the command, we redirect the container's stdout.
# We need a shell inside the container to handle the redirection.
docker run --rm -v "$(pwd):/app/output" sethblack/python-seo-analyzer /bin/sh -c "seoanalyze http://example.com/ --output-format html > /app/output/results.html"
# Note for Windows CMD users: Use %cd% instead of $(pwd)
# docker run --rm -v "%cd%:/app/output" sethblack/python-seo-analyzer /bin/sh -c "seoanalyze http://example.com/ --output-format html > /app/output/results.html"
# Note for Windows PowerShell users: Use ${pwd} instead of $(pwd)
# docker run --rm -v "${pwd}:/app/output" sethblack/python-seo-analyzer /bin/sh -c "seoanalyze http://example.com/ --output-format html > /app/output/results.html"


# Mount a local sitemap file
# This mounts 'local-sitemap.xml' from the current directory to '/app/sitemap.xml' inside the container
docker run --rm -v "$(pwd)/local-sitemap.xml:/app/sitemap.xml" sethblack/python-seo-analyzer http://example.com/ --sitemap /app/sitemap.xml
# Adjust paths and Windows commands as needed (see volume mounting example above)

Building the Image Locally

You can also build the Docker image yourself from the source code. Make sure you have Docker installed and running.

# Clone the repository (if you haven't already)
# git clone https://github.com/sethblack/python-seo-analyzer.git
# cd python-seo-analyzer

# Build the Docker image (tag it as 'my-seo-analyzer' for easy reference)
docker build -t my-seo-analyzer .

# Run the locally built image
docker run --rm my-seo-analyzer http://example.com/

# Run with AI analysis using the locally built image
docker run --rm -e ANTHROPIC_API_KEY="your_api_key_here" my-seo-analyzer http://example.com/ --run-llm-analysis

# Run with HTML output saved locally using the built image
docker run --rm -v "$(pwd):/app/output" my-seo-analyzer /bin/sh -c "seoanalyze http://example.com/ --output-format html > /app/output/results.html"
# Adjust Windows commands as needed (see volume mounting example above)

Command-line Usage

If you run without a sitemap, crawling starts at the homepage.

seoanalyze http://www.domain.com/

Or you can specify the path to a sitemap to seed the list of URLs to scan.

seoanalyze http://www.domain.com/ --sitemap path/to/sitemap.xml
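For reference, seeding a crawl from a sitemap boils down to pulling the <loc> URLs out of the sitemap XML. A minimal sketch using only the standard library (the tool's internal parsing may differ):

```python
import xml.etree.ElementTree as ET

# Sitemaps live in this XML namespace per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_from_sitemap(xml_text: str) -> list[str]:
    """Extract the <loc> URLs from a sitemap XML document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc") if loc.text]

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.domain.com/</loc></url>
  <url><loc>http://www.domain.com/about/</loc></url>
</urlset>"""

print(urls_from_sitemap(sitemap))
```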

HTML output can be generated from the analysis instead of JSON.

seoanalyze http://www.domain.com/ --output-format html

API

The analyze function returns a dictionary with the results of the crawl.

from pyseoanalyzer import analyze

output = analyze(site, sitemap)

print(output)
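Since the result is a plain dictionary, it is easy to post-process. The sketch below uses a hypothetical result shape (the keys "pages", "errors", "total_time", and "warnings" are illustrative assumptions, not a guaranteed schema):

```python
import json

# Hypothetical shape of the analyze() result, for illustration only --
# the real keys may differ between versions of the tool.
output = {
    "pages": [
        {
            "url": "http://www.domain.com/",
            "word_count": 350,
            "warnings": ["missing meta description"],
        },
    ],
    "errors": [],
    "total_time": 1.2,
}

# Pretty-print the whole result...
print(json.dumps(output, indent=2))

# ...or pull out just the pages that raised warnings.
flagged = [page["url"] for page in output["pages"] if page["warnings"]]
print(flagged)
```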

In order to analyze heading tags (h1-h6) and additional tags as well, the following options can be passed to the analyze function:

from pyseoanalyzer import analyze

output = analyze(site, sitemap, analyze_headings=True, analyze_extra_tags=True)

print(output)

By default, the analyze function follows and analyzes all internal links as well, which can be time-consuming. To analyze only the provided URL, pass the following option to the analyze function:

from pyseoanalyzer import analyze

output = analyze(site, sitemap, follow_links=False)

print(output)

Alternatively, you can run the analysis as a script from the seoanalyzer folder.

python -m seoanalyzer https://www.sethserver.com/ -f html > results.html

AI Optimization

The first pass of AI optimization features use Anthropic's claude-3-sonnet-20240229 model to evaluate the content of the site. You will need to have an API key from Anthropic to use this feature. The API key needs to be set as the environment variable ANTHROPIC_API_KEY. I recommend using a .env file to set this variable. Once the API key is set, the AI optimization features can be enabled with the --run-llm-analysis flag.
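If you'd rather not add a dependency for reading the .env file, a minimal sketch of loading ANTHROPIC_API_KEY from one using only the standard library (python-dotenv handles quoting, interpolation, and edge cases more robustly):

```python
import os

def load_env(path: str) -> dict[str, str]:
    """Parse simple KEY=value lines from a .env file (comments ignored)."""
    loaded = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            loaded[key.strip()] = value.strip().strip('"')
    return loaded

# Example: write a throwaway .env file and load it into the environment.
with open(".env", "w") as fh:
    fh.write('ANTHROPIC_API_KEY="your_api_key_here"\n')

os.environ.update(load_env(".env"))
print(os.environ["ANTHROPIC_API_KEY"])
```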

Notes

If you get requests.exceptions.SSLError at either the command-line or via the Python API, try using:

instead of:



Download files

Download the file for your platform. If you're not sure which to choose, see the PyPI documentation on installing packages.

Source Distribution

pyseoanalyzer-2025.4.3.tar.gz (23.5 kB)

Uploaded Source

Built Distribution

pyseoanalyzer-2025.4.3-py3-none-any.whl (20.6 kB)

Uploaded Python 3

File details

Details for the file pyseoanalyzer-2025.4.3.tar.gz.

File metadata

  • Download URL: pyseoanalyzer-2025.4.3.tar.gz
  • Upload date:
  • Size: 23.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for pyseoanalyzer-2025.4.3.tar.gz

  • SHA256: 0ac9b9e86a4161d3132978190921bb64b0ae34c644f687179b5ab359f4874a0c
  • MD5: 162029d94c729fbd60c8f4332b6d66ed
  • BLAKE2b-256: e972af13bea9219a5fd9988f4cf556adae339d88b20d850dc6192091c70f5ff8

See more details on using hashes here.
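Checking a downloaded archive against a published digest is a one-liner with the standard library. A self-contained sketch (hashing an in-memory stand-in rather than the real pyseoanalyzer-2025.4.3.tar.gz, so it runs anywhere):

```python
import hashlib

def verify_sha256(data: bytes, expected_hex: str) -> bool:
    """Return True if data hashes to the expected SHA256 digest."""
    return hashlib.sha256(data).hexdigest() == expected_hex

# Demonstration with the well-known digest of the empty byte string;
# in practice, read the downloaded file's bytes and compare against
# the digest published in the table above.
assert verify_sha256(
    b"",
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
)
print("digest matches")
```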

File details

Details for the file pyseoanalyzer-2025.4.3-py3-none-any.whl.

File metadata

File hashes

Hashes for pyseoanalyzer-2025.4.3-py3-none-any.whl

  • SHA256: b889e22b6d6dd69b140fc2b5e8f2be2fdde1b84dc03b28d8865899264f003009
  • MD5: 32b089d6d5552e86c51a0dcb76c5eb80
  • BLAKE2b-256: 2ef8b899cc916e4aa4e291780d2a439315155401b03b3643d9d70ac76d571f7a

See more details on using hashes here.
