
Automatically scrape OnlyFans

Project description


Intro

A fork of onlyfans-scraper, optimized to be more feature complete with DIGITALCRIMINALS' OnlyFans script. In fact, with the right settings, transitioning between the two scripts should be a straightforward process.

In addition, there are numerous filtering features to control exactly which types of content you want to scrape. See the changelog: https://github.com/datawhores/OF-Scraper/blob/main/CHANGES.md

Documentation

For information on how to:

  • install
  • run
  • and anything else you might want to know,

see the documentation:

https://of-scraper.gitbook.io/of-scraper
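For reference, the package is published on PyPI, so a conventional install into a virtual environment looks like the following sketch (the documentation above is authoritative; the `==2.7.7` pin is only to match the version shown on this page):

```shell
# Create and activate an isolated environment, then install from PyPI.
python3 -m venv .venv
. .venv/bin/activate
pip install ofscraper==2.7.7
```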

DISCLAIMERS:

  1. This tool is not affiliated, associated, or partnered with OnlyFans in any way. We are not authorized, endorsed, or sponsored by OnlyFans. All OnlyFans trademarks remain the property of Fenix International Limited.
  2. This is a theoretical program only and is for educational purposes. If you choose to use it, it may or may not work. You solely accept full responsibility and indemnify the creator, hosts, contributors, and all other involved persons from any and all responsibility.
  3. Description:

    Command-line program to download media and to run other batch operations, such as liking and unliking posts.
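As a sketch of how the tool might be invoked (running the bare command opens an interactive menu, per the description above; anything beyond that should be checked against `--help`, since flags vary by version):

```shell
# Guarded so the sketch is safe to run even when ofscraper is absent.
if command -v ofscraper >/dev/null 2>&1; then
    ofscraper --help
else
    echo "ofscraper is not installed yet (see the documentation link above)"
fi
```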


    Issues

    Open an issue in this repo, or mention your issue in the Discord: https://discord.gg/DV3aBeMu

    Feature Requests

    https://ofscraper.clearflask.com/feedback

    Or ask in the Discord.

    Migrating from DC script

    You will need to change the settings so that the metadata option is compatible with your current folders. Additionally, you might want to set the save_path, dir_path, and filename options so they produce similar output.

    The metadata path from DIGITALCRIMINALS' script is used for the duplicate check, so as long as you set the right path, files won't download a second time.

    https://of-scraper.gitbook.io/of-scraper/migrating-from-digitalcriminals-script

    https://of-scraper.gitbook.io/of-scraper/config-options

    https://of-scraper.gitbook.io/of-scraper/config-options/customizing-save-path

    Ask in the Discord or open an issue if you need help with what to change to accomplish this.
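The gitbook pages above describe the actual option names; as a purely hypothetical sketch (the key names here follow the option names mentioned in the text above and may not match your installed version's config schema — check the config-options page before copying anything), the relevant part of a config might look like:

```json
{
  "metadata": "/path/to/dc-script/Metadata",
  "save_path": "/path/to/dc-script/downloads",
  "dir_path": "{model_username}/{responsetype}",
  "filename": "{filename}.{ext}"
}
```

The important point is only that "metadata" points at the folder the DC script already wrote, so the duplicate check can see previously downloaded files.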

    Discord

    https://discord.gg/wN7uxEVHRK

    Support

    buymeacoffee.com/datawhores

    BTC: bc1qcnnetrgmtr86rmqvukl2e24f5upghpcsqqsy87

    ETH: 0x1d427a6E336edF6D95e589C16cb740A1372F6609


Project details


Release history

This version

2.7.7

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ofscraper-2.7.7.tar.gz (73.1 kB)

Uploaded Source

Built Distribution

ofscraper-2.7.7-py3-none-any.whl (98.2 kB)

Uploaded Python 3

File details

Details for the file ofscraper-2.7.7.tar.gz.

File metadata

  • Download URL: ofscraper-2.7.7.tar.gz
  • Upload date:
  • Size: 73.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.10.6 Linux/5.15.0-1042-azure

File hashes

Hashes for ofscraper-2.7.7.tar.gz
Algorithm Hash digest
SHA256 50cbc8dbff1df52472b1b6e3ac173bbc1348f58aa798364c099f03f0918c9040
MD5 c8b2c5457a6e87422f159504e8052105
BLAKE2b-256 d5e10cd212a182527635dbaf4254686141789b86b59687ed4255b49753ed54b2

See more details on using hashes here.
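A downloaded sdist can be checked against the SHA256 digest above with standard tooling; for example (assuming GNU coreutils' sha256sum and that the archive sits in the current directory):

```shell
# Compare the local archive against the digest published above.
# Guarded so the snippet is safe to run when the file is absent.
if [ -f ofscraper-2.7.7.tar.gz ]; then
    echo "50cbc8dbff1df52472b1b6e3ac173bbc1348f58aa798364c099f03f0918c9040  ofscraper-2.7.7.tar.gz" \
        | sha256sum --check -
else
    echo "ofscraper-2.7.7.tar.gz not found; download it first"
fi
```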

File details

Details for the file ofscraper-2.7.7-py3-none-any.whl.

File metadata

  • Download URL: ofscraper-2.7.7-py3-none-any.whl
  • Upload date:
  • Size: 98.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.10.6 Linux/5.15.0-1042-azure

File hashes

Hashes for ofscraper-2.7.7-py3-none-any.whl
Algorithm Hash digest
SHA256 31fb528ff1a70a14433c1ac8d3eb2880016c9346cab1f0b760b52e47d8d9c137
MD5 d64c918ae86893ad0cce92703ef7b6a0
BLAKE2b-256 c80faf2bb45e2ecca8f3405187ad68fea7bdcc1fd8d99da3fa2eab7bd7805ab7

See more details on using hashes here.
