
YouTube Community Scraper

A Python tool to scrape posts from YouTube community tabs.

Features

  • Scrape posts from YouTube community tabs
  • Download images from posts
  • Collect post comments
  • Multi-browser support (Chromium, Firefox, WebKit)
  • Automatic browser installation
  • Proxy support (HTTP/HTTPS with auth, SOCKS5 without auth)
  • Progress saving
  • Configurable output directory

Installation

Install using pip:

pip install post-archiver

Or install from source:

git clone https://github.com/sadadYes/post-archiver.git
cd post-archiver
pip install -e .

Requirements

  • Python 3.7 or higher
  • No manual browser installation required - the selected browser engine is downloaded automatically on first use

Usage

usage: post-archiver [OPTIONS] url [amount]

YouTube Community Posts Scraper

positional arguments:
  url                   YouTube channel community URL
  amount                Amount of posts to get (default: max)

options:
  -h, --help            show this help message and exit
  -c, --get-comments    Get comments from posts (WARNING: This is slow) (default: False)
  -i, --get-images      Get images from posts (default: False)
  -d, --download-images
                        Download images (requires --get-images)
  -q IMAGE_QUALITY, --image-quality IMAGE_QUALITY
                        Image quality: src, sd, or all (default: all)
  --proxy PROXY         Proxy file or single proxy string
  -o OUTPUT, --output OUTPUT
                        Output directory (default: current directory)
  -v, --verbose         Show basic progress information
  -t, --trace           Show detailed debug information
  --browser {chromium,firefox,webkit}
                        Browser to use (default: chromium)
  --version             show program's version number and exit
  --member-only         Only get membership-only posts (requires --cookies)
  --browser-cookies {chrome,firefox,edge,opera}
                        Get cookies from browser (requires browser-cookie3)

Proxy format:
  Single proxy: <scheme>://<username>:<password>@<host>:<port>
  Proxy file: One proxy per line using the same format
  Supported schemes: http, https
  Note: SOCKS5 proxies are supported but without authentication
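The proxy format above can be checked with a small helper. This is a hypothetical sketch (`parse_proxy` is not part of post-archiver) that mirrors the stated rules: http/https with optional credentials, SOCKS5 without:

```python
from urllib.parse import urlparse

SUPPORTED_SCHEMES = {"http", "https", "socks5"}

def parse_proxy(proxy: str) -> dict:
    """Parse <scheme>://[<username>:<password>@]<host>:<port> into parts."""
    parsed = urlparse(proxy)
    if parsed.scheme not in SUPPORTED_SCHEMES:
        raise ValueError(f"unsupported scheme: {parsed.scheme!r}")
    if parsed.scheme == "socks5" and parsed.username:
        # SOCKS5 authentication is not supported by the browser automation layer
        raise ValueError("SOCKS5 proxies cannot use authentication")
    if not parsed.hostname or not parsed.port:
        raise ValueError("proxy must include host and port")
    return {
        "scheme": parsed.scheme,
        "username": parsed.username,
        "password": parsed.password,
        "host": parsed.hostname,
        "port": parsed.port,
    }

print(parse_proxy("http://user:pass@127.0.0.1:8080")["host"])  # 127.0.0.1
```

The same function can validate each line of a proxy file before handing it to the scraper.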

Amount:
  Specify number of posts to scrape (default: max)
  Use 'max' or any number <= 0 to scrape all posts
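The 'max' / non-positive convention can be expressed as a tiny normalizer (a hypothetical sketch, not the tool's actual code; `None` stands for "all posts"):

```python
from typing import Optional

def normalize_amount(amount: str) -> Optional[int]:
    """Map the CLI amount argument to a post count; None means 'all posts'."""
    if amount == "max":
        return None
    n = int(amount)
    return None if n <= 0 else n

print(normalize_amount("max"))  # None
print(normalize_amount("50"))   # 50
print(normalize_amount("-3"))   # None
```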

Examples:
  post-archiver https://www.youtube.com/@channel/community
  post-archiver https://www.youtube.com/@channel/community 50
  post-archiver -c -i -d -q src https://www.youtube.com/@channel/community max
  post-archiver --browser firefox https://www.youtube.com/@channel/community
  post-archiver --proxy proxies.txt https://www.youtube.com/@channel/community 100
  post-archiver --proxy http://username:password@host:port https://www.youtube.com/@channel/community
  post-archiver --proxy https://username:password@host:port https://www.youtube.com/@channel/community
  post-archiver --proxy socks5://host:port https://www.youtube.com/@channel/community

Browser Support

The scraper supports three browser engines:

  • Chromium (default)
  • Firefox
  • WebKit

The appropriate browser will be automatically installed when first used. You can specify which browser to use with the --browser option.

Proxy Support

The scraper supports the following proxy types:

  • HTTP proxies with authentication
  • HTTPS proxies with authentication
  • SOCKS5 proxies (without authentication)

Note: SOCKS5 proxies with authentication are not supported due to limitations in the underlying browser automation.
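Per the format described above, a proxies.txt file passed to --proxy might look like this (hosts and credentials are placeholders):

```text
http://user1:pass1@proxy1.example.com:8080
https://user2:pass2@proxy2.example.com:8443
socks5://proxy3.example.com:1080
```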

Logging

Two levels of logging are available:

  • --verbose (-v): Shows basic progress information
  • --trace (-t): Shows detailed debug information including browser console messages

License

MIT License

Download files


Source Distribution

post_archiver-1.2.2.tar.gz (16.7 kB)


Built Distribution


post_archiver-1.2.2-py3-none-any.whl (17.5 kB)


File details

Details for the file post_archiver-1.2.2.tar.gz.

File metadata

  • Download URL: post_archiver-1.2.2.tar.gz
  • Upload date:
  • Size: 16.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.9.23

File hashes

Hashes for post_archiver-1.2.2.tar.gz:

  • SHA256: 88e76dd7e19d3ee41f00b6a0bed6c1ebfed937aeeec290f22e0557b144fc0b81
  • MD5: eae016d0d237016159a9d05df576e9df
  • BLAKE2b-256: d8999aea14fb828a6e7adfec6d617a5878f46c644598803e1871f96e7caafff4

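A downloaded file can be checked against the SHA256 digest above with the standard hashlib module (a sketch; the path is wherever you saved the archive):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hex SHA256 digest of a file, read in chunks to bound memory use."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

EXPECTED = "88e76dd7e19d3ee41f00b6a0bed6c1ebfed937aeeec290f22e0557b144fc0b81"
# assert sha256_of("post_archiver-1.2.2.tar.gz") == EXPECTED
```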

File details

Details for the file post_archiver-1.2.2-py3-none-any.whl.

File metadata

  • Download URL: post_archiver-1.2.2-py3-none-any.whl
  • Upload date:
  • Size: 17.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.9.23

File hashes

Hashes for post_archiver-1.2.2-py3-none-any.whl:

  • SHA256: 299a8e142fd26ddbc7e9bc56f32b11b4b244ec723c786cae39d6aa563f9edab7
  • MD5: cea655a536bf90b474676145db3198f6
  • BLAKE2b-256: 9a4f1d6e846233019b7b0e49ce8d83af51d77edab8d2b32cfd086ecbc11910be

