A tool to scrape YouTube community posts

Project description

YouTube Community Scraper

A Python tool to scrape posts from YouTube community tabs.

Features

  • Scrape posts from YouTube community tabs
  • Download images from posts
  • Collect post comments
  • Multi-browser support (Chromium, Firefox, WebKit)
  • Automatic browser installation
  • Proxy support (HTTP/HTTPS with auth, SOCKS5 without auth)
  • Progress saving
  • Configurable output directory

Installation

Install using pip:

pip install post-archiver

Or install from source:

git clone https://github.com/sadadYes/post-archiver.git
cd post-archiver
pip install -e .
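
Either way, you can check that the command is available by printing its version (see the --version flag in the usage output below):

post-archiver --version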

Requirements

  • Python 3.7 or higher
  • No manual browser installation needed; the required browser is installed automatically on first use

Usage

usage: post-archiver [OPTIONS] url [amount]

YouTube Community Posts Scraper

positional arguments:
  url                   YouTube channel community URL
  amount                Amount of posts to get (default: max)

options:
  -h, --help            show this help message and exit
  -c, --get-comments    Get comments from posts (WARNING: This is slow) (default: False)
  -i, --get-images      Get images from posts (default: False)
  -d, --download-images
                        Download images (requires --get-images)
  -q IMAGE_QUALITY, --image-quality IMAGE_QUALITY
                        Image quality: sd, hd, or all (default: all)
  --proxy PROXY         Proxy file or single proxy string
  -o OUTPUT, --output OUTPUT
                        Output directory (default: current directory)
  -v, --verbose         Show basic progress information
  -t, --trace           Show detailed debug information
  --browser {chromium,firefox,webkit}
                        Browser to use (default: chromium)
  --version             show program's version number and exit
  --member-only         Only get membership-only posts (requires --cookies)
  --browser-cookies {chrome,firefox,edge,opera}
                        Get cookies from browser (requires browser-cookie3)

Proxy format:
  Single proxy: <scheme>://<username>:<password>@<host>:<port>
  Proxy file: One proxy per line using the same format
  Supported schemes: http, https
  Note: SOCKS5 proxies are supported but without authentication

Amount:
  Specify number of posts to scrape (default: max)
  Use 'max' or any number <= 0 to scrape all posts

Examples:
  post-archiver https://www.youtube.com/@channel/community
  post-archiver https://www.youtube.com/@channel/community 50
  post-archiver -c -i -d -q hd https://www.youtube.com/@channel/community max
  post-archiver --browser firefox https://www.youtube.com/@channel/community
  post-archiver --proxy proxies.txt https://www.youtube.com/@channel/community 100
  post-archiver --proxy http://username:password@host:port https://www.youtube.com/@channel/community
  post-archiver --proxy https://username:password@host:port https://www.youtube.com/@channel/community
  post-archiver --proxy socks5://host:port https://www.youtube.com/@channel/community

Browser Support

The scraper supports three browser engines:

  • Chromium (default)
  • Firefox
  • WebKit

The appropriate browser will be automatically installed when first used. You can specify which browser to use with the --browser option.
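
For example, to run a scrape with the WebKit engine instead of the default Chromium:

post-archiver --browser webkit https://www.youtube.com/@channel/community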

Proxy Support

The scraper supports the following proxy types:

  • HTTP proxies with authentication
  • HTTPS proxies with authentication
  • SOCKS5 proxies (without authentication)

Note: SOCKS5 proxies with authentication are not supported due to limitations in the underlying browser automation.
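
For illustration, a proxy file passed with --proxy holds one proxy per line in the format described above; the hosts, ports, and credentials below are placeholders, not real endpoints:

http://user:pass@proxy1.example.com:8080
https://user:pass@proxy2.example.com:8443
socks5://proxy3.example.com:1080

The file is then passed exactly as in the examples above, e.g. post-archiver --proxy proxies.txt followed by the community URL.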

Logging

Two levels of logging are available:

  • --verbose (-v): Shows basic progress information
  • --trace (-t): Shows detailed debug information including browser console messages
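
For instance, to follow basic progress while collecting comments, or to capture full debug output including browser console messages:

post-archiver -v -c https://www.youtube.com/@channel/community 20
post-archiver -t https://www.youtube.com/@channel/community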

License

MIT License

Download files

Download the file for your platform.

Source Distribution

post_archiver-1.2.0.tar.gz (16.4 kB)

Built Distribution

post_archiver-1.2.0-py3-none-any.whl (17.3 kB)

File details

Details for the file post_archiver-1.2.0.tar.gz.

File metadata

  • Download URL: post_archiver-1.2.0.tar.gz
  • Upload date:
  • Size: 16.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.20

File hashes

Hashes for post_archiver-1.2.0.tar.gz

  • SHA256: de0cf1e34195a0d6643fce499465c25c67a17569b7ebebfd25f4429de4169fd7
  • MD5: 44a0cbc358967689c1cbe6260e501c9e
  • BLAKE2b-256: e3a2139e53fa46b427fb8c2f51d439d39a4aa155025a1f48bc53f7e91e631841

File details

Details for the file post_archiver-1.2.0-py3-none-any.whl.

File hashes

Hashes for post_archiver-1.2.0-py3-none-any.whl

  • SHA256: 629dc4ec3237185b88804662e570a68f767c8e16be6ffcd1c41c0854bcb1575b
  • MD5: 2b92227d118f7781161ba89a8d8e4d45
  • BLAKE2b-256: edaaee9fda3eeef77035fa0d2d31b9bda4829b2cd4bb463f748fae89b01bcb39
