Automatically scrape OnlyFans

Reason this release was yanked: code base incorrect

Project description

Intro

A fork of onlyfans-scraper, optimized for closer feature parity with DIGITALCRIMINALS' OnlyFans script. In fact, with the right settings, transitioning between the two scripts should be an easy enough process.

In addition, there are numerous filtering features to control exactly which types of content you want to scrape; see https://github.com/datawhores/OF-Scraper/blob/main/CHANGES.md for the full changelog.
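
As an illustration of that kind of filtering only (a minimal sketch; the function and field names below are made up, not ofscraper's actual API or option names), selecting posts by media type and date in Python might look like:

    from datetime import datetime

    # Hypothetical post records; ofscraper's real data structures differ.
    posts = [
        {"id": 1, "media_type": "videos", "date": "2023-01-15"},
        {"id": 2, "media_type": "images", "date": "2023-06-02"},
        {"id": 3, "media_type": "videos", "date": "2023-07-20"},
    ]

    def filter_posts(posts, media_type=None, after=None):
        """Keep posts matching an optional media type and minimum date."""
        kept = []
        for post in posts:
            if media_type and post["media_type"] != media_type:
                continue
            if after and datetime.fromisoformat(post["date"]) < after:
                continue
            kept.append(post)
        return kept

    # e.g. only videos posted after June 1, 2023 -> keeps post id 3
    print(filter_posts(posts, media_type="videos", after=datetime(2023, 6, 1)))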

Documentation

For information on installing, running, and other things you might want to know, see https://of-scraper.gitbook.io/of-scraper

DISCLAIMERS:

  1. This tool is not affiliated, associated, or partnered with OnlyFans in any way. We are not authorized, endorsed, or sponsored by OnlyFans. All OnlyFans trademarks remain the property of Fenix International Limited.
  2. This is a theoretical program only and is for educational purposes. If you choose to use it, it may or may not work. You solely accept full responsibility and indemnify the creator, hosts, contributors, and all other involved persons from any and all responsibility.
Description

Command-line program to download media and to process other batch operations, such as liking and unliking posts.
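
Since the package installs a console entry point, one way to drive it from another program is a subprocess call. This is just a sketch: run with no arguments the command starts the interactive menu, and the flags for non-interactive batch runs are documented in the gitbook rather than reproduced here.

    import subprocess

    # Launch the ofscraper CLI (requires the package to be installed).
    # Without arguments it opens the interactive menu; see the gitbook
    # docs for the flags that script downloads, likes, and unlikes.
    subprocess.run(["ofscraper"], check=True)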

Issues

Open an issue in this repo, or mention your issue in the Discord: https://discord.gg/wN7uxEVHRK

Feature Requests

https://ofscraper.clearflask.com/feedback

Or ask in the Discord.

Migrating from DC script

You will need to change the settings so that the metadata option is compatible with your current folders. Additionally, you may want to set the save_path, dir_path, and filename options so that they produce similar output.

The metadata path from DIGITALCRIMINALS' script is used for the duplicate check, so as long as you set the right path, files won't be downloaded a second time.
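
As a rough sketch of how such a metadata-based duplicate check works in principle (the table and column names here are invented for illustration and are not the actual schema of either script):

    import sqlite3

    # Hypothetical schema; the real databases written by
    # DIGITALCRIMINALS' script differ in detail.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE medias (media_id INTEGER PRIMARY KEY, filename TEXT)")
    con.execute("INSERT INTO medias VALUES (12345, 'example.mp4')")

    def already_downloaded(con, media_id):
        """Duplicate check: is this media id already in the metadata DB?"""
        return con.execute(
            "SELECT 1 FROM medias WHERE media_id = ?", (media_id,)
        ).fetchone() is not None

    print(already_downloaded(con, 12345))  # True  -> skip the download
    print(already_downloaded(con, 99999))  # False -> download it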

For details, see:

  • https://of-scraper.gitbook.io/of-scraper/migrating-from-digitalcriminals-script
  • https://of-scraper.gitbook.io/of-scraper/config-options
  • https://of-scraper.gitbook.io/of-scraper/config-options/customizing-save-path

Ask in the Discord or open an issue if you need help with what to change to accomplish this.

Discord

https://discord.gg/wN7uxEVHRK

Support

buymeacoffee.com/datawhores

BTC: bc1qcnnetrgmtr86rmqvukl2e24f5upghpcsqqsy87

ETH: 0x1d427a6E336edF6D95e589C16cb740A1372F6609



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ofscraper-3.3.3.1.tar.gz (99.9 kB)

Uploaded Source

Built Distribution

ofscraper-3.3.3.1-py3-none-any.whl (128.8 kB)

Uploaded Python 3

File details

Details for the file ofscraper-3.3.3.1.tar.gz.

File metadata

  • Download URL: ofscraper-3.3.3.1.tar.gz
  • Upload date:
  • Size: 99.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.10.6 Linux/6.3.13-1-liquorix-amd64

File hashes

Hashes for ofscraper-3.3.3.1.tar.gz
Algorithm Hash digest
SHA256 3897ea66778eeeadf8ef46903fc00b49678cad3d7a371f32e3974db8f0d0d1a2
MD5 ba446649eef506420a8600d33e4b8b27
BLAKE2b-256 7dd3eb76e579d3a3a10ac549afb334e95060f145c1d5d07a8aea7237e4351352

See more details on using hashes here.
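
For example, the sdist can be checked against the SHA256 digest above with Python's standard library:

    import hashlib

    EXPECTED = "3897ea66778eeeadf8ef46903fc00b49678cad3d7a371f32e3974db8f0d0d1a2"

    # Hash the downloaded archive in chunks and compare to the published digest.
    digest = hashlib.sha256()
    with open("ofscraper-3.3.3.1.tar.gz", "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)

    assert digest.hexdigest() == EXPECTED, "hash mismatch: do not install"
    print("sha256 OK")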

File details

Details for the file ofscraper-3.3.3.1-py3-none-any.whl.

File metadata

  • Download URL: ofscraper-3.3.3.1-py3-none-any.whl
  • Upload date:
  • Size: 128.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.10.6 Linux/6.3.13-1-liquorix-amd64

File hashes

Hashes for ofscraper-3.3.3.1-py3-none-any.whl
Algorithm Hash digest
SHA256 9951279408d10cc025db1ad49af91863bb5026fa1a14eb6a0c4776b76d1e86e1
MD5 dd6c405e26c3070bafe248796693c26d
BLAKE2b-256 6adf8b673703e96346badd794002fbd65a2896020ef046bd333e99bec340d73b

See more details on using hashes here.
