
Automatically scrape OnlyFans

Project description

Intro

A fork of onlyfans-scraper, optimized to be more feature-complete with digitalcriminal's OnlyFans script. In fact, with the right settings, transitioning between the two scripts should be an easy process.

In addition, there are numerous filtering features to control exactly which type of content you want to scrape. See https://github.com/datawhores/OF-Scraper/blob/main/CHANGES.md

DISCLAIMERS:

  1. This tool is not affiliated, associated, or partnered with OnlyFans in any way. We are not authorized, endorsed, or sponsored by OnlyFans. All OnlyFans trademarks remain the property of Fenix International Limited.
  2. This is a theoretical program only and is for educational purposes. If you choose to use it, it may or may not work. You solely accept full responsibility and indemnify the creator, hosts, contributors, and all other involved persons from any and all responsibility.
    Description

    A command-line program to download media and to process other batch operations, such as liking and unliking posts.


    Installation

    Python 3.9 or 3.10 is recommended.

    Windows:

    stable

    pip install ofscraper
    

    or

    development

    pip install git+https://github.com/datawhores/OF-Scraper.git 
    

    or

    specific version

    pip install ofscraper==x
    

    where x is the version you want to install

    macOS/Linux

    pip3 install ofscraper
    

    or

    pip3 install git+https://github.com/datawhores/OF-Scraper.git 
    

    or

    specific version

    pip3 install ofscraper==x
    

    where x is the version you want to install

    Authentication

    You'll need to retrieve your auth information:

    https://github.com/datawhores/OF-Scraper/wiki/Auth
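For orientation, the auth file generally pairs your session cookies with your browser's user-agent string. The field names below are assumptions for illustration only; the Auth wiki page linked above documents the exact format:

```json
{
  "auth": {
    "sess": "<session cookie from your browser>",
    "auth_id": "<your numeric user id>",
    "x-bc": "<x-bc header value from your browser's requests>",
    "user_agent": "<your browser's user-agent string>"
  }
}
```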

    Usage

    Whenever you want to run the program, all you need to do is type ofscraper in your terminal:

    Basic usage is just to run the command below

    ofscraper
    


    Selecting "Download content from a user" is all you need to get started with downloading content.

    Liking/Unliking

    It is also possible to batch like or unlike posts by choosing the appropriate option in the menu. Note that there are limits (currently 1,000) on how many posts you can like per day.


    Selecting specific users

    The fuzzy search system can be a little confusing; see:

    https://github.com/datawhores/OF-Scraper/wiki/Fuzzy-Search

    Other menu options

    See: https://github.com/datawhores/OF-Scraper/wiki/Menu-Options

    Command line

    While the menu is easy to use and convenient, you may want more automation.

    The best way to get it is through the command-line system, which lets you skip the menu and, for example, scrape a provided list of accounts:

    https://github.com/datawhores/OF-Scraper/wiki/command-line-args
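As a purely hypothetical illustration, a non-interactive run might look like the sketch below. The flag names here are assumptions, not verified options; the command-line-args wiki page linked above is the authoritative reference.

```shell
# Hypothetical example -- flag names are illustrative, not verified.
# Check the command-line-args wiki page for the real options.
ofscraper --usernames model1 model2 --action download
```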

    Docker Support

    https://github.com/datawhores/OF-Scraper/pkgs/container/ofscraper

    Customization

    https://github.com/datawhores/OF-Scraper/wiki/Customizing-save-path https://github.com/datawhores/OF-Scraper/wiki/Config-Options

    Issues

    Open an issue in this repo, or mention your issue in the Discord:

    https://discord.gg/wN7uxEVHRK

    Feature Requests

    https://ofscraper.clearflask.com/feedback

    Or the Discord

    Common

    Status Down

    This typically means that your auth information is not correct, or that OnlyFans signed you out.

    404 Issue

    This could mean that the content you are trying to scrape is no longer present. It can also indicate that the model has deleted their account, and it is no longer accessible on the platform.

    Request taking a long time

    If a request fails, ofscraper will pause and retry a few times. This can cause certain runs to take longer at points.

    Known Limitations

    • 1,000 likes is the maximum per day

    Migrating from DC script

    You will need to change the settings so that the metadata option is compatible with your current folders. Additionally, you might want to set the save_path, dir_path, and filename options so they produce similar output paths.

    The metadata path from digitalcriminal's script is used for duplicate checks, so as long as you set the right path, files won't be downloaded a second time.

    https://github.com/datawhores/OF-Scraper/wiki/Migrating-from-DC-script https://github.com/datawhores/OF-Scraper/wiki/Config-Options https://github.com/datawhores/OF-Scraper/wiki/Customizing-save-path

    Ask in the Discord or open an issue if you need help deciding what to change to accomplish this.
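As a sketch of what such a configuration fragment might look like: the key names save_path, dir_path, and filename come from the text above, but the values, placeholder tokens, and file layout below are assumptions; the Config-Options wiki page is authoritative.

```json
{
  "save_path": "/media/onlyfans",
  "dir_path": "{model_username}",
  "filename": "{filename}.{ext}"
}
```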

    Discord

    https://discord.gg/wN7uxEVHRK

    Support

    buymeacoffee.com/datawhores

    BTC: bc1qcnnetrgmtr86rmqvukl2e24f5upghpcsqqsy87

    ETH: 0x1d427a6E336edF6D95e589C16cb740A1372F6609


Project details



Download files

Download the file for your platform.

Source Distribution

ofscraper-2.4.1.tar.gz (58.0 kB)

Uploaded Source

Built Distribution

ofscraper-2.4.1-py3-none-any.whl (76.7 kB)

Uploaded Python 3

File details

Details for the file ofscraper-2.4.1.tar.gz.

File metadata

  • Download URL: ofscraper-2.4.1.tar.gz
  • Upload date:
  • Size: 58.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.10.6 Linux/5.15.0-1039-azure

File hashes

Hashes for ofscraper-2.4.1.tar.gz

  • SHA256: adb72b43899246374817d6499d9fa4ae06b95bdb62713110f827147530cba7c1
  • MD5: bdf41caec825a6f086505b2f269c1376
  • BLAKE2b-256: 74d613fc19f385aa5ed8661a7426365a6aedbefa8bc477366e2af5ffe4048233

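These digests can be used to verify a downloaded archive before installing it. A minimal sketch in Python, assuming the sdist was downloaded into the current directory; the expected digest is the SHA256 value copied from the table above:

```python
import hashlib

# SHA-256 digest for ofscraper-2.4.1.tar.gz, copied from the table above.
EXPECTED_SHA256 = "adb72b43899246374817d6499d9fa4ae06b95bdb62713110f827147530cba7c1"

def sha256_of(path: str, chunk_size: int = 1 << 16) -> str:
    """Stream the file through SHA-256 so large archives never sit fully in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Usage (after downloading the sdist into the current directory):
#   sha256_of("ofscraper-2.4.1.tar.gz") == EXPECTED_SHA256
```

Alternatively, pip's hash-checking mode (`--require-hashes` with pinned hashes in a requirements file) performs the same verification automatically.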

File details

Details for the file ofscraper-2.4.1-py3-none-any.whl.

File metadata

  • Download URL: ofscraper-2.4.1-py3-none-any.whl
  • Upload date:
  • Size: 76.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.10.6 Linux/5.15.0-1039-azure

File hashes

Hashes for ofscraper-2.4.1-py3-none-any.whl

  • SHA256: ccedaa7adabbfdfda498d9248604bba0ccc276a018b416c549d1b4654f3574c0
  • MD5: a6b0cda449aae60602936fef427b22c0
  • BLAKE2b-256: a2de68ba4736c1f818cf0f803013bcf91ba8a55b355d556153f35cad3868f92e

