Modern SEO analysis and optimization toolkit with advanced reporting
tfq0seo
Enhanced SEO analysis and site crawling toolkit - a comprehensive, professional-grade SEO analysis tool with full site crawling capabilities. Competitive with Screaming Frog SEO Spider, but open source and extensible.
What's New in v2.0.0
Complete Site Crawling - Now includes professional website crawling capabilities:
- Full Site Crawling with configurable depth (1-10 levels)
- Concurrent Processing (1-50 simultaneous requests)
- Comprehensive Link Analysis (internal/external/broken links)
- Advanced Reporting (JSON, CSV, XLSX, HTML exports)
- Duplicate Content Detection across entire sites
- Site Structure Analysis and optimization recommendations
- Image Optimization Analysis (alt text, compression, formats)
- Robots.txt & Sitemap Integration
- Real-time Progress Tracking with rich console output
Enhanced Features
Site Crawling (NEW!)
- Professional Website Crawling with configurable depth and concurrency
- Broken Link Detection with detailed error reporting
- Redirect Chain Analysis (301, 302, etc.)
- Duplicate Content Identification across pages
- Orphaned Page Detection for better site structure
- Sitemap Integration and coverage analysis
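Duplicate-content detection of this kind is commonly built on fingerprinting normalized page text so that near-identical pages collide on the same hash. A minimal illustrative sketch, not tfq0seo's actual implementation (`content_fingerprint` and `find_duplicates` are hypothetical names):

```python
import hashlib
import re

def content_fingerprint(html_text: str) -> str:
    """Normalize page text and hash it so near-identical pages collide."""
    # Strip tags crudely, lowercase, and collapse whitespace before hashing.
    text = re.sub(r"<[^>]+>", " ", html_text)
    text = re.sub(r"\s+", " ", text).lower().strip()
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def find_duplicates(pages: dict[str, str]) -> dict[str, list[str]]:
    """Group URLs by fingerprint; any group with more than one URL is a duplicate set."""
    groups: dict[str, list[str]] = {}
    for url, html in pages.items():
        groups.setdefault(content_fingerprint(html), []).append(url)
    return {h: urls for h, urls in groups.items() if len(urls) > 1}
```

Hash-based grouping scales linearly with page count, which matters when comparing content across an entire site rather than pairwise.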
Advanced SEO Analysis
- Complete URL Analysis with SEO scoring (0-100)
- Content Optimization with keyword density analysis
- Technical SEO Validation (meta tags, headers, structure)
- Image Analysis (alt text, optimization, accessibility)
- Performance Metrics and Core Web Vitals
- Mobile-Friendly Testing and responsive design validation
- Security Analysis (HTTPS, headers, certificates)
- Rich Results and structured data analysis
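Keyword density is conventionally computed as keyword occurrences divided by total words. A small illustrative helper under that assumption (hypothetical, not the tool's own code):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return keyword occurrences as a percentage of total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)
```

A density above roughly 3% (the default threshold documented later in this README) would be flagged as potential keyword stuffing.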
Professional Reporting
- Multiple Export Formats: JSON, CSV, XLSX, HTML
- Interactive HTML Reports with charts and visualizations
- Bulk Operations for large-scale analysis
- Real-time Progress Tracking with rich console output
- Prioritized Recommendations based on impact and effort
- Executive Summaries for stakeholder reporting
Installation
pip install tfq0seo
Quick Start
Crawl an Entire Website (NEW!)
# Basic site crawl
tfq0seo crawl https://example.com
# Advanced crawl with custom settings
tfq0seo crawl https://example.com --depth 5 --max-pages 1000 --concurrent 20
# Export crawl results to different formats
tfq0seo crawl https://example.com --format csv --output site_audit.csv
tfq0seo crawl https://example.com --format xlsx --output comprehensive_report.xlsx
# Crawl with exclusions and custom settings
tfq0seo crawl https://example.com \
--depth 3 \
--max-pages 500 \
--exclude "/admin/" "/private/" \
--delay 1.0 \
--format html
Analyze Individual URLs
# Basic URL analysis
tfq0seo analyze https://example.com
# Analysis with multiple URLs
tfq0seo analyze https://example.com https://another-site.com --format html
# Advanced analysis with custom options
tfq0seo analyze https://example.com --depth 3 --competitors 5 --format json
Export and Insights
# Export previous crawl results
tfq0seo export --format csv --output results.csv
# Get quick insights from crawl
tfq0seo insights --summary --recommendations
# View available features
tfq0seo list
Competitive Advantages
vs. Screaming Frog SEO Spider:
- Open Source - No licensing fees, unlimited crawls
- Modern Architecture - Async/concurrent processing for faster crawls
- Cloud Ready - Deploy anywhere, scale horizontally
- Extensible - Python-based, easy to customize and extend
- Advanced Analysis - ML-ready data structure, modern SEO factors
- Multiple Export Formats - JSON, CSV, XLSX, HTML
- API Integration - Easy integration with other tools
- Real-time Progress - Rich console output with live updates
Powerful SEO Testing Capabilities
Large-Scale Site Crawling Tests
E-commerce Site Analysis:
# Comprehensive e-commerce site audit
tfq0seo crawl https://example-store.com --depth 5 --max-pages 1000 --concurrent 20 --format xlsx
Performance Results:
- Concurrent Processing: Handles 10-50 simultaneous requests efficiently
- Robots.txt Compliance: Respects crawling restrictions automatically
- Progress Tracking: Real-time updates with rich console output
- Speed: ~0.3-0.4 pages/second depending on site response time
Multi-language Content Analysis:
# Test international SEO capabilities
tfq0seo analyze https://multilingual-site.com --format json
Content Analysis Performance
Processing Speed Benchmarks:
- Small content (50 chars): 42,547 chars/second
- Medium content (500 chars): 127,139 chars/second
- Large content (5,000 chars): 146,166 chars/second
- Very large content (25,000 chars): 146,604 chars/second
What this reveals:
- Consistent Performance: Speed remains stable regardless of content size
- Linear Scaling: Efficient processing with good performance characteristics
- Memory Management: Handles large content without crashes
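Throughput figures like those above can be reproduced with a simple timing harness. The sketch below uses a stand-in word-count step as the analysis function, since the real analyzer's internals aren't shown here:

```python
import time

def measure_throughput(analyze, text: str, repeats: int = 5) -> float:
    """Run `analyze` on `text` several times and report characters per second."""
    start = time.perf_counter()
    for _ in range(repeats):
        analyze(text)
    elapsed = time.perf_counter() - start
    return (len(text) * repeats) / elapsed if elapsed > 0 else float("inf")

# A stand-in analysis step (word counting); the real analyzer does far more work,
# so absolute numbers will differ from the benchmarks quoted above.
rate = measure_throughput(lambda t: len(t.split()), "lorem ipsum " * 1000)
```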
Export & Data Analysis
Comprehensive Export Testing:
# Test all export formats with real data
tfq0seo crawl https://test-site.com --format json --output detailed.json
tfq0seo export --format csv --output spreadsheet.csv
tfq0seo export --format xlsx --output professional.xlsx
Export Quality Results:
- JSON Export: Structured data with 2,838+ characters of detailed analysis
- CSV Export: Professional format with 9+ comprehensive columns
- XLSX Export: Excel-compatible with styled worksheets and auto-adjusted columns
- Headers Include: URL, Status Code, Title, Meta Description, H1 Count, H2 Count, Word Count, Internal Links, External Links, Images, Response Time, Content Type, Canonical URL, Robots Meta, Depth, Parent URL
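A CSV layout with these headers can be reproduced with the standard library. This sketch assumes the column names listed above and is an illustration, not the tool's own exporter:

```python
import csv

FIELDS = ["URL", "Status Code", "Title", "Meta Description", "H1 Count",
          "H2 Count", "Word Count", "Internal Links", "External Links",
          "Images", "Response Time", "Content Type", "Canonical URL",
          "Robots Meta", "Depth", "Parent URL"]

def write_crawl_csv(path: str, rows: list[dict]) -> None:
    """Write crawl rows to CSV; keys missing from a row become empty cells."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS, restval="")
        writer.writeheader()
        writer.writerows(rows)
```

`restval=""` keeps the columns aligned even when a page is missing a field such as a canonical URL.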
Real-World Site Analysis Results
Live Site Crawl Example (httpbin.org):
tfq0seo crawl https://httpbin.org --depth 2 --max-pages 20 --format json
Actual Performance:
- Pages Crawled: 2 pages in 4.54 seconds
- Analysis Quality: Detected missing meta descriptions, missing H1 tags, thin content
- Site Structure: Max depth 1, average depth 0.5
- Link Analysis: 1 internal link, 3 external links identified
- Content Issues: 2 pages with thin content, 2 missing meta descriptions
Most Powerful Test Commands
1. Comprehensive Site Audit:
# Full professional site audit
tfq0seo crawl https://your-site.com --depth 5 --max-pages 1000 --concurrent 15 --format xlsx --output comprehensive_audit --include-external
2. Performance Stress Test:
# Test tool limits and performance
tfq0seo crawl https://large-site.com \
--depth 10 \
--max-pages 5000 \
--concurrent 50 \
--delay 0.1
3. Multi-Format Analysis Pipeline:
# Generate multiple report formats for different stakeholders
tfq0seo crawl https://site.com --format json --output data.json
tfq0seo export --format csv --output spreadsheet.csv
tfq0seo insights --summary --recommendations
4. Edge Case Testing:
# Test problematic URLs and content
tfq0seo analyze "https://site.com/very-long-url-that-exceeds-recommended-length"
tfq0seo analyze "https://site.com/page?param1=value1&param2=value2&param3=value3"
Confirmed Strengths & Capabilities
Professional Crawling Architecture:
- Configurable Depth: 1-10 levels with intelligent stopping
- Concurrent Processing: 1-50 simultaneous requests
- Robots.txt Compliance: Automatic respect for crawling restrictions
- Real-time Progress: Rich console output with live updates
- Error Handling: Graceful handling of timeouts, 404s, and network issues
- Export Flexibility: Multiple formats for different use cases
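Robots.txt compliance of this kind can be handled entirely with Python's standard `urllib.robotparser`. In the sketch below the rules are fed in as inline lines for illustration; a crawler would fetch them from the site's `/robots.txt`:

```python
from urllib.robotparser import RobotFileParser

# Feed robots.txt rules directly; set_url()/read() would fetch them live.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
])

# Paths under /admin/ are blocked for any user agent; everything else is allowed.
allowed = rp.can_fetch("tfq0seo", "https://example.com/products/widget")
blocked = rp.can_fetch("tfq0seo", "https://example.com/admin/panel")
```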
URL Structure Analysis:
- Protocol Analysis: HTTP/HTTPS detection and validation
- Domain Processing: Accurate domain and subdomain analysis
- Path Structure: SEO-friendly URL pattern detection
- Parameter Handling: Query parameter analysis and optimization
- Length Validation: URL length and structure recommendations
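This kind of URL decomposition maps directly onto the standard `urllib.parse` module. A small sketch (the 75-character limit used here is a common rule of thumb, not an official standard):

```python
from urllib.parse import urlparse, parse_qs

url = "https://shop.example.com/category/shoes?color=red&size=10"
parts = urlparse(url)

scheme = parts.scheme                                   # protocol: "https"
host = parts.netloc                                     # domain + subdomain
path_segments = [s for s in parts.path.split("/") if s] # ["category", "shoes"]
params = parse_qs(parts.query)                          # query parameters as lists
url_length_ok = len(url) <= 75                          # rule-of-thumb length check
```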
Site-Wide Analysis Framework:
- Duplicate Content Detection: Framework for identifying duplicate pages
- Broken Link Identification: Comprehensive link validation
- Redirect Chain Analysis: 301, 302, and redirect loop detection
- Orphaned Page Detection: Pages without internal links
- Site Structure Mapping: Hierarchical site organization analysis
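Redirect chains and loops can be detected by walking a URL-to-target map while tracking visited URLs. An illustrative sketch over an in-memory map (a real crawler would build the map from HTTP `Location` headers rather than a dict):

```python
def redirect_chain(start: str, redirects: dict[str, str], limit: int = 10):
    """Follow a URL through a redirect map; flag loops and over-long chains."""
    chain = [start]
    seen = {start}
    while chain[-1] in redirects:
        nxt = redirects[chain[-1]]
        if nxt in seen:                  # revisiting a URL means a redirect loop
            return chain + [nxt], "loop"
        chain.append(nxt)
        seen.add(nxt)
        if len(chain) > limit:           # e.g. a chain of stacked 301s
            return chain, "too_long"
    return chain, "ok"
```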
Current Limitations & Known Issues
Content Analysis Limitations:
- No JavaScript Rendering: Cannot analyze SPAs or dynamic content
- Limited NLP Analysis: Basic content processing without advanced semantics
- Static HTML Only: Misses dynamically loaded content and interactions
Crawling Constraints:
- Public Access Only: Requires HTTP/HTTPS access, no authentication support
- Navigation Patterns: Cannot handle complex JavaScript-based navigation
- Rate Limiting: Basic delay controls, no advanced rate limiting strategies
SEO Analysis Gaps:
- Core Web Vitals: Missing performance metrics integration
- Advanced Image Analysis: Limited image optimization detection
- Schema Validation: Basic schema detection without validation
Best Use Cases
Recommended For:
- Static Website Audits: Excellent for traditional HTML websites
- URL Structure Analysis: Comprehensive technical SEO audits
- Bulk Operations: Large-scale crawling and data collection
- Data Export Projects: Integration with other SEO tools and workflows
- Site Mapping: Understanding site structure and organization
- Link Analysis: Internal and external link relationship mapping
Avoid For:
- Single Page Applications (SPAs): Limited JavaScript support
- Dynamic Content Sites: Cannot render client-side content
- Advanced Performance Analysis: Missing Core Web Vitals integration
- Complex Authentication: No support for login-protected content
Performance Benchmarks
Crawling Performance:
- Small Sites (< 100 pages): 2-5 seconds per page
- Medium Sites (100-1000 pages): 1-3 seconds per page
- Large Sites (1000+ pages): 0.5-2 seconds per page
- Concurrent Efficiency: Linear scaling up to 20 concurrent requests
Memory Usage:
- Base Usage: ~50MB for tool initialization
- Per Page: ~1-2MB additional memory per crawled page
- Large Crawls: Efficient memory management for 1000+ page crawls
Export Performance:
- JSON: Instant export for any crawl size
- CSV: < 1 second for 1000+ pages
- XLSX: 2-5 seconds for 1000+ pages with styling
- HTML: 1-3 seconds with interactive features
Command Reference
Crawl Commands
# Basic crawl
tfq0seo crawl <URL>
# Advanced crawl options
tfq0seo crawl <URL> [OPTIONS]
--depth INTEGER Maximum crawl depth (1-10, default: 3)
--max-pages INTEGER Maximum pages to crawl (default: 500)
--concurrent INTEGER Concurrent requests (1-50, default: 10)
--delay FLOAT Delay between requests in seconds (default: 0.5)
--format [json|csv|xlsx|html] Output format (default: html)
--output PATH Output file path
--exclude TEXT Path patterns to exclude (repeatable)
--no-robots Ignore robots.txt restrictions
--include-external Include external links in analysis
Analysis Commands
# Single URL analysis
tfq0seo analyze <URL> [OPTIONS]
--format [html|json|csv] Output format (default: html)
--output PATH Output file or directory
--depth INTEGER Analysis depth (default: 2)
--competitors INTEGER Number of competitors to analyze (default: 3)
--quiet Suppress progress output
# Content analysis
tfq0seo analyze-content --file <FILE> --keyword <KEYWORD>
tfq0seo analyze-content --text <TEXT> --keyword <KEYWORD>
Export & Insights Commands
# Export crawl results
tfq0seo export --format [json|csv|xlsx] --output <PATH>
# Get insights
tfq0seo insights [--summary] [--recommendations]
# List features
tfq0seo list [--format [plain|rich]]
Analyze Content
# From a file
tfq0seo analyze-content --file content.txt --keyword "your keyword"
# Direct text input
tfq0seo analyze-content --text "Your content here" --keyword "your keyword"
Access Educational Resources
# Get all resources
tfq0seo education
# Get specific topic
tfq0seo education --topic meta_tags
Comprehensive Analysis
# Run comprehensive analysis with all features
tfq0seo analyze --url https://example.com --comprehensive
# Run analysis with custom options
tfq0seo analyze --url https://example.com --comprehensive \
--target-keyword "your keyword" \
--competitors "https://competitor1.com,https://competitor2.com" \
--depth complete \
--format json
The comprehensive analysis includes:
Analysis Modules
- Basic SEO
  - Meta tags analysis
  - Content optimization
  - HTML structure
  - Keyword optimization
- Modern SEO Features
  - Schema markup
  - Social media integration
  - Mobile optimization
  - Rich snippets
- Competitive Analysis
  - Content comparison
  - Feature comparison
  - Market positioning
  - Competitive advantages
- Advanced SEO
  - User experience
  - Content clustering
  - Link architecture
  - Progressive features
- Performance
  - Load time metrics
  - Resource optimization
  - Caching implementation
  - Compression analysis
- Security
  - SSL implementation
  - Security headers
  - Content security
  - Vulnerability checks
- Mobile Optimization
  - Responsive design
  - Touch elements
  - Viewport configuration
  - Mobile performance
Analysis Results
The comprehensive analysis provides:
- Detailed Insights
  - Critical issues
  - Major improvements
  - Minor improvements
  - Positive aspects
  - Competitive edges
  - Market opportunities
- Scoring
  - Overall SEO score
  - Category-specific scores
  - Comparative metrics
  - Performance indicators
- Action Plan
  - Critical actions
  - High priority tasks
  - Medium priority tasks
  - Low priority tasks
  - Monitoring tasks
- Impact Analysis
  - Traffic impact estimates
  - Conversion impact
  - Implementation complexity
  - Resource requirements
  - Timeline estimates
Configuration Options
- depth: Analysis depth level
  - basic: Core SEO elements
  - advanced: Including modern features
  - complete: All analysis modules
- format: Output format
  - json: Detailed JSON report
  - html: Interactive HTML report
  - markdown: Formatted markdown
- cache_results: Enable/disable caching
- custom_thresholds: Custom analysis thresholds
Testing & Quality Assurance
Comprehensive Testing Framework
tfq0seo includes a robust testing system to ensure reliability and performance:
# Run comprehensive tests
python test_tool_comprehensive.py
# Quick essential tests
python test_tool_comprehensive.py --quick
# Stress testing with memory and performance analysis
python test_tool_comprehensive.py --stress
Periodic Testing & Monitoring
Windows:
# Set up automated testing
run_periodic_tests.bat schedule
# Run tests manually
run_periodic_tests.bat quick
Linux/macOS:
# Set up cron job for periodic testing
./run_periodic_tests.sh schedule
# Run tests manually
./run_periodic_tests.sh quick
Test Categories Covered
- Basic Functionality: Analyzer initialization, URL analysis, content processing
- CLI Integration: All output formats (JSON, HTML, CSV) and command-line options
- Error Handling: Invalid URLs, timeouts, malformed content, network issues
- Performance Testing: Memory usage, concurrent analysis, response times
- Stress Testing: Large content processing, rapid requests, resource cleanup
- Integration Testing: Export functionality, data validation, report generation
Quality Metrics & Benchmarks
Success Rate Targets:
- Valid URLs: > 90% success rate
- Error Handling: 100% graceful failure handling
- Export Functions: 100% data integrity across formats
- Performance: < 5 seconds per URL analysis
Automated Monitoring:
- Daily Tests: Essential functionality verification
- Weekly Tests: Comprehensive feature testing
- Stress Tests: Monthly performance and limit testing
- Regression Tests: Continuous integration with every update
Output Formats
The tool supports multiple output formats, each optimized for different use cases:
HTML Report
- Interactive and visually appealing
- Clear visualization of metrics
- Color-coded status indicators
- Mobile-responsive design
- Easy to share and view in browsers
JSON Format
- Structured data format
- Perfect for programmatic processing
- Complete analysis details
- Easy to parse and integrate
- Ideal for automation workflows
CSV Format
- Tabular data representation
- Easy to import into spreadsheets
- Simple to analyze in tools like Excel
- Good for data aggregation
- Compatible with data analysis tools
Default Settings
tfq0seo comes with carefully tuned default settings for optimal SEO analysis:
SEO Thresholds
- Title Length: 30-60 characters
- Meta Description: 120-160 characters
- Minimum Content Length: 300 words
- Maximum Sentence Length: 20 words
- Keyword Density: Maximum 3%
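These defaults are straightforward to enforce programmatically. A hypothetical validator (not the tool's own code) using the thresholds listed above:

```python
def check_thresholds(title: str, meta_desc: str, word_count: int) -> list[str]:
    """Flag values outside the documented default SEO thresholds."""
    issues = []
    if not 30 <= len(title) <= 60:
        issues.append(f"title length {len(title)} outside 30-60")
    if not 120 <= len(meta_desc) <= 160:
        issues.append(f"meta description length {len(meta_desc)} outside 120-160")
    if word_count < 300:
        issues.append(f"content has {word_count} words, minimum is 300")
    return issues
```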
Readability Standards
- Flesch Reading Ease: Minimum score of 60
- Gunning Fog Index: Maximum score of 12
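Flesch Reading Ease is defined as 206.835 - 1.015 * (words per sentence) - 84.6 * (syllables per word); higher scores mean easier text. A rough self-contained implementation for illustration (the vowel-group syllable counter is an approximation; production readability libraries use dictionary lookups):

```python
import re

def syllables(word: str) -> int:
    """Rough syllable count: contiguous vowel groups, minimum of one."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Score text on the Flesch Reading Ease scale (higher = easier)."""
    words = re.findall(r"[a-zA-Z']+", text)
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    if not words:
        return 0.0
    syl = sum(syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syl / len(words))
```

A score of 60, the minimum noted above, corresponds roughly to plain English readable by 13- to 15-year-olds.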
System Settings
- Cache Location: ~/.tfq0seo/cache
- Log Files: ~/.tfq0seo/tfq0seo.log
- Cache Expiration: 1 hour
- Log Rotation: 10MB max file size, keeps 5 backups
Troubleshooting & Known Issues
Common Issues & Solutions
1. CLI Module Not Found:
# Error: No module named tfq0seo.__main__
# Solution: Ensure you're in the correct directory and tfq0seo is installed
pip install -e . # For development installation
2. Import Errors:
# Error: Failed to import tfq0seo modules
# Solution: Check Python path and installation
python -c "import tfq0seo; print('tfq0seo installed correctly')"
3. Permission Errors (Linux/macOS):
# Error: Permission denied
# Solution: Make scripts executable
chmod +x run_periodic_tests.sh
4. Network Timeouts:
# Error: Connection timeout
# Solution: Check internet connection or increase timeout values
tfq0seo analyze https://example.com --timeout 30
Performance Optimization Tips
For Large Sites:
# Optimize for large crawls
tfq0seo crawl https://large-site.com \
--concurrent 10 \
--delay 1.0 \
--max-pages 1000 \
--exclude "/images/" "/downloads/"
For Slow Sites:
# Adjust for slow-responding sites
tfq0seo crawl https://slow-site.com \
--concurrent 5 \
--delay 2.0 \
--timeout 30
Memory Management
Monitor Memory Usage:
- Small Crawls (< 100 pages): ~50-100MB
- Medium Crawls (100-1000 pages): ~100-500MB
- Large Crawls (1000+ pages): ~500MB-2GB
Memory Optimization:
# For memory-constrained environments
tfq0seo crawl https://site.com \
--concurrent 5 \
--max-pages 500 \
--format csv # Lighter than XLSX
Known Limitations
JavaScript-Heavy Sites:
- Issue: Cannot render dynamic content
- Workaround: Use for static analysis only
- Future: JavaScript rendering planned for v2.1
Authentication Required:
- Issue: No login support
- Workaround: Analyze public pages only
- Future: Authentication support planned
Core Web Vitals:
- Issue: Missing performance metrics
- Workaround: Use Google PageSpeed Insights API separately
- Future: Integration planned for v2.2
Analysis Areas
Meta Analysis
- Title tag optimization
- Meta description validation
- Open Graph meta tags
- Canonical URL verification
- Language declaration
Content Analysis
- Keyword optimization and placement
- Content structure analysis
- Readability metrics
- Heading hierarchy check
- Image alt text validation
Technical SEO
- Mobile responsiveness
- HTML structure validation
- Security implementation
- Schema markup validation
- Robots.txt and sitemap checks
Competitive Analysis
- Content comparison metrics
- Feature set comparison
- Semantic keyword analysis
- Technical implementation comparison
- Market positioning insights
- Framework and technology detection
- Performance feature analysis
- SEO feature implementation check
Advanced SEO Features
- User Experience Analysis
  - Navigation structure
  - Accessibility implementation
  - Interactive elements
  - Content layout optimization
- Content Clustering
  - Topic hierarchy analysis
  - Related content detection
  - Semantic structure
  - Content relationships
- Link Architecture
  - Internal linking patterns
  - Link depth analysis
  - Anchor text quality
  - Link distribution
- Rich Results Optimization
  - Schema.org implementation
  - Rich snippet potential
  - Meta enhancements
  - Structured data types
- Progressive Enhancement
  - Offline support
  - Performance features
  - Enhancement layers
  - Progressive loading
Development Roadmap
v2.1 (Planned - Q1 2025)
- JavaScript Rendering: Selenium/Playwright integration for SPA support
- Core Web Vitals: Performance metrics integration
- Authentication Support: Login-protected content analysis
- Advanced Image Analysis: Compression, format optimization, accessibility
- Mobile-First Analysis: Enhanced mobile SEO features
v2.2 (Planned - Q2 2025)
- AI-Powered Insights: Machine learning for content recommendations
- Competitor Intelligence: Advanced competitive analysis features
- Historical Tracking: Change detection and trend analysis
- API Integration: Google Search Console, Analytics, PageSpeed Insights
- Custom Reporting: Branded reports and custom templates
v2.3 (Planned - Q3 2025)
- Real-time Monitoring: Continuous site monitoring and alerts
- Advanced Analytics: Predictive SEO insights and recommendations
- International SEO: Enhanced multi-language and geo-targeting features
- Link Building Tools: Opportunity identification and outreach features
- Mobile App: Companion mobile app for on-the-go analysis
Community Contributions Welcome!
We actively encourage community contributions:
- Bug Reports: Help us identify and fix issues
- Feature Requests: Suggest new capabilities and improvements
- Code Contributions: Submit pull requests for enhancements
- Documentation: Improve guides, examples, and tutorials
- Testing: Help expand our test coverage and scenarios
Contributing Guidelines:
# Fork the repository
git clone https://github.com/tfq0/tfq0seo.git
cd tfq0seo
# Create a feature branch
git checkout -b feature/your-feature-name
# Make your changes and test
python test_tool_comprehensive.py
# Submit a pull request
git push origin feature/your-feature-name
Support & Community
Getting Help
- Documentation: Comprehensive guides and examples in this README
- Issue Tracker: GitHub Issues
- Discussions: GitHub Discussions
- Email Support: For enterprise and commercial inquiries
Stay Updated
- Star the Repository: Get notified of new releases
- Watch Releases: Stay informed about updates and new features
- Follow Updates: Social media and blog announcements
- Newsletter: Monthly updates on new features and best practices
Enterprise Support
For organizations requiring:
- Custom Integrations: Tailored API integrations and workflows
- Training & Onboarding: Team training and best practices
- Custom Development: Specialized features and modifications
- Priority Support: Dedicated support channels and SLAs
Contact us for enterprise solutions and partnerships.
tfq0seo - Empowering SEO professionals with open-source, extensible, and powerful analysis tools.
Download files
Source Distribution
Built Distribution
File details
Details for the file tfq0seo-2.0.0.tar.gz.
File metadata
- Download URL: tfq0seo-2.0.0.tar.gz
- Upload date:
- Size: 106.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 25331f24680bd106e31747c21f84c89c9f4e96489c2830a3ec1c9f80b08d299c |
| MD5 | 35a92f7877d2f6d45b3e67df077d86fd |
| BLAKE2b-256 | 3abbc87b3b4245c41778e710d74bf815d35d4cdd46667309d4bad5960d4957af |
Provenance
The following attestation bundles were made for tfq0seo-2.0.0.tar.gz:
Publisher: tfq0seo-publish.yml on TFQ0/tfq0seo
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: tfq0seo-2.0.0.tar.gz
- Subject digest: 25331f24680bd106e31747c21f84c89c9f4e96489c2830a3ec1c9f80b08d299c
- Sigstore transparency entry: 236445224
- Sigstore integration time:
- Permalink: TFQ0/tfq0seo@eabc77241d85a09039849047962ce04377723879
- Branch / Tag: refs/tags/v2.0.0
- Owner: https://github.com/TFQ0
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: tfq0seo-publish.yml@eabc77241d85a09039849047962ce04377723879
- Trigger Event: push
File details
Details for the file tfq0seo-2.0.0-py3-none-any.whl.
File metadata
- Download URL: tfq0seo-2.0.0-py3-none-any.whl
- Upload date:
- Size: 99.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 1728ea40832aaf199d7dbc345861d2f9576f1a37472f2a1dce00e7f4401a3617 |
| MD5 | b68ac75a759703d4b0e0bfaae805c479 |
| BLAKE2b-256 | 0a2ef10eb22083a71dbe121d47727b31d620c273680be1c83a86bd9bef96d9fa |
Provenance
The following attestation bundles were made for tfq0seo-2.0.0-py3-none-any.whl:
Publisher: tfq0seo-publish.yml on TFQ0/tfq0seo
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: tfq0seo-2.0.0-py3-none-any.whl
- Subject digest: 1728ea40832aaf199d7dbc345861d2f9576f1a37472f2a1dce00e7f4401a3617
- Sigstore transparency entry: 236445228
- Sigstore integration time:
- Permalink: TFQ0/tfq0seo@eabc77241d85a09039849047962ce04377723879
- Branch / Tag: refs/tags/v2.0.0
- Owner: https://github.com/TFQ0
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: tfq0seo-publish.yml@eabc77241d85a09039849047962ce04377723879
- Trigger Event: push