automatically scrape onlyfans

Project description

Intro

A fork of onlyfans-scraper. It has been optimized to be more feature-complete with DIGITALCRIMINALS' OnlyFans script. In fact, with the right settings, transitioning between the two scripts should be an easy enough process.

In addition, there are numerous filtering features to control exactly which type of content you want to scrape: https://github.com/datawhores/OF-Scraper/blob/main/CHANGES.md

Documentation

For information on how to:

  • install
  • run
  • and other things you might want to know

https://of-scraper.gitbook.io/of-scraper

DISCLAIMERS:

  1. This tool is not affiliated, associated, or partnered with OnlyFans in any way. We are not authorized, endorsed, or sponsored by OnlyFans. All OnlyFans trademarks remain the property of Fenix International Limited.
  2. This is a theoretical program only and is for educational purposes. If you choose to use it, then it may or may not work. You solely accept full responsibility and indemnify the creator, hosts, contributors, and all other involved persons from any and all responsibility.
Description

A command-line program to download media and to process other batch operations, such as liking and unliking posts.

Issues

Open an issue in this repo, or you can mention your issue in the Discord: https://discord.gg/DV3aBeMu

Feature Requests

https://ofscraper.clearflask.com/feedback

Or ask in the Discord.

Migrating from DC script

You will need to change the settings so that the metadata option is compatible with your current folders. Additionally, you might want to set the save_path, dir_path, and filename options so that they produce similar output.

The metadata path from DIGITALCRIMINALS' script is used for the duplicate check, so as long as you set the right path, files won't be downloaded a second time.
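Purely as an illustration of the kind of edit involved (not the script's actual API), here is a minimal Python sketch, assuming the config is a JSON file at ~/.config/ofscraper/config.json and that the keys are named after the options above; the real path and key names are documented at the links below:

    import json
    from pathlib import Path

    # Assumed config location -- check the config-options docs for the real path.
    config_path = Path.home() / ".config" / "ofscraper" / "config.json"
    config = json.loads(config_path.read_text())

    # Hypothetical key names, taken from the options mentioned above;
    # the config-options documentation lists the actual keys and placeholders.
    config["metadata"] = "/path/to/dc-script/Metadata/{username}"
    config["save_path"] = "/path/to/your/existing/download/root"
    config["dir_path"] = "{username}/{mediatype}"
    config["filename"] = "{filename}.{ext}"

    config_path.write_text(json.dumps(config, indent=4))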

https://of-scraper.gitbook.io/of-scraper/migrating-from-digitalcriminals-script

https://of-scraper.gitbook.io/of-scraper/config-options

https://of-scraper.gitbook.io/of-scraper/config-options/customizing-save-path

Ask in the Discord or open an issue if you need help with what to change to accomplish this.

Discord

https://discord.gg/wN7uxEVHRK

Support

buymeacoffee.com/datawhores

BTC: bc1qcnnetrgmtr86rmqvukl2e24f5upghpcsqqsy87

ETH: 0x1d427a6E336edF6D95e589C16cb740A1372F6609


Release history

This version: 2.7.8

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ofscraper-2.7.8.tar.gz (73.1 kB)

Uploaded Source

Built Distribution

ofscraper-2.7.8-py3-none-any.whl (98.2 kB)

Uploaded Python 3

File details

Details for the file ofscraper-2.7.8.tar.gz.

File metadata

  • Download URL: ofscraper-2.7.8.tar.gz
  • Upload date:
  • Size: 73.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.10.6 Linux/5.15.0-1041-azure

File hashes

Hashes for ofscraper-2.7.8.tar.gz

  • SHA256: 543bcbd5b3c41c07b0dbeee81ae77f6a505310f32dc52426d0f4c62bc943c66a
  • MD5: 360658daca790fcf2b45e1291322e3d9
  • BLAKE2b-256: 7dc02e785902feb78f53a0c55cdc8d8b58ad41fb004d22510e43914cb17d9b2d

See more details on using hashes here.
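For example, to check a downloaded archive against the SHA256 digest listed above before installing, a short Python snippet along these lines works (hashlib is in the standard library; the file name and digest are the ones shown above):

    import hashlib

    expected = "543bcbd5b3c41c07b0dbeee81ae77f6a505310f32dc52426d0f4c62bc943c66a"

    # Hash the downloaded sdist in chunks to keep memory use low.
    digest = hashlib.sha256()
    with open("ofscraper-2.7.8.tar.gz", "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)

    print("OK" if digest.hexdigest() == expected else "MISMATCH")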

File details

Details for the file ofscraper-2.7.8-py3-none-any.whl.

File metadata

  • Download URL: ofscraper-2.7.8-py3-none-any.whl
  • Upload date:
  • Size: 98.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.10.6 Linux/5.15.0-1041-azure

File hashes

Hashes for ofscraper-2.7.8-py3-none-any.whl

  • SHA256: 0a96d6c2d264051489b817173da45dd7dd34b531b3cc29f1aa5bb9b9936489db
  • MD5: 3cf0c7667a5f3be7d8ad251e0e2cd077
  • BLAKE2b-256: b4a21bbfe77cb447a9becf46356849ff5c65c277cb5d0d48c44e7431fc350ffc

See more details on using hashes here.
