
Automatically scrape OnlyFans

Project description

Intro

A fork of onlyfans-scraper. It has been optimized to be more feature-complete with digitalcriminal's OnlyFans script. In fact, with the right settings, transitioning between the two scripts should be an easy enough process.

In addition, there are numerous filtering features to control exactly which type of content you want to scrape. See https://github.com/datawhores/OF-Scraper/blob/main/CHANGES.md

DISCLAIMERS:

  1. This tool is not affiliated, associated, or partnered with OnlyFans in any way. We are not authorized, endorsed, or sponsored by OnlyFans. All OnlyFans trademarks remain the property of Fenix International Limited.
  2. This is a theoretical program only and is for educational purposes. If you choose to use it, then it may or may not work. You solely accept full responsibility and indemnify the creator, hosts, contributors, and all other involved persons from any and all responsibility.
    Description

    command-line program to download media, and to process other batch operations such as liking and unliking posts.


    Installation

    Recommended python3.9 or python3.10
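If you want to keep the tool isolated from your system packages, a virtual environment works well on any platform; a minimal sketch (the environment name `ofscraper-env` is arbitrary):

```shell
# Create and activate an isolated environment for the tool
# ("ofscraper-env" is just an example name).
python3 -m venv ofscraper-env
. ofscraper-env/bin/activate   # Windows: ofscraper-env\Scripts\activate
# then install with pip as shown below, e.g.:
# pip install ofscraper
```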

    Windows:

    stable

    pip install ofscraper
    

    or

    development

    pip install git+https://github.com/datawhores/OF-Scraper.git 
    

    or

    specific version

    pip install ofscraper==x
    

    where x is the version you want to install

    macOS/Linux

    pip3 install ofscraper
    

    or

    pip3 install git+https://github.com/datawhores/OF-Scraper.git 
    

    or

    specific version

    pip3 install ofscraper==x
    

    where x is the version you want to install

    Authentication

    You'll need to retrieve your auth information:

    https://github.com/datawhores/OF-Scraper/wiki/Auth
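As a rough illustration, the auth information ends up in a JSON file read by the program. The field names below are assumptions for the sketch; follow the wiki page above for the real format and where to find each value in your browser:

```json
{
  "auth": {
    "sess": "<session cookie from your logged-in browser>",
    "auth_id": "<auth_id cookie>",
    "user_agent": "<your browser's user-agent string>"
  }
}
```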

    Usage

    Whenever you want to run the program, all you need to do is type ofscraper in your terminal:

    ofscraper
    


    Selecting "Download content from a user" is all you need to get started with downloading content.

    Liking/Unliking

    It is also possible to batch like or unlike posts by choosing the appropriate option in the menu. Note that there are limitations (currently 1000) on how many posts you can like per day.


    Selecting specific users

    The fuzzy search system can be a little confusing; see

    https://github.com/datawhores/OF-Scraper/wiki/Fuzzy-Search

    Other menu options

    See: https://github.com/datawhores/OF-Scraper/wiki/Menu-Options

    Command line

    While the menu is easy to use and convenient, you may want more automation.

    The best way to do this is through the command-line system. This will allow you to skip the menu and, for example, scrape a provided list of accounts:

    https://github.com/datawhores/OF-Scraper/wiki/command-line-args
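For instance, a non-interactive run over a list of accounts might look like the sketch below. The `--usernames` and `--action` flag names here are assumptions for illustration; check the wiki page above for the real options:

```shell
# Build a non-interactive invocation for a provided list of accounts.
# NOTE: --usernames/--action are illustrative flag names; see the wiki.
USERS="modelA,modelB"
CMD="ofscraper --usernames $USERS --action download"
echo "$CMD"
```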

    Docker Support

    https://github.com/datawhores/OF-Scraper/pkgs/container/ofscraper
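A container invocation might look like the sketch below. The image name is taken from the package page above, but the volume mount and the config location inside the container are assumptions; check the package page for the documented run command:

```shell
# Hypothetical docker run: mounts a host config directory into the container.
# Image name from the package page; mount paths are assumptions.
IMAGE="ghcr.io/datawhores/ofscraper"
DOCKER_CMD="docker run -it -v $HOME/.config/ofscraper:/root/.config/ofscraper $IMAGE"
echo "$DOCKER_CMD"
```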

    Customization

    https://github.com/datawhores/OF-Scraper/wiki/Customizing-save-path
    https://github.com/datawhores/OF-Scraper/wiki/Config-Options

    Issues

    Open an issue in this repo, or mention your issue in the Discord:

    https://discord.gg/wN7uxEVHRK

    Feature Requests

    https://ofscraper.clearflask.com/feedback

    Or the discord

    Common Issues

    Status Down

    This typically means that your auth information is not correct, or OnlyFans signed you out.

    404 Issue

    This could mean that the content you are trying to scrape is no longer present. It can also indicate that the model has deleted their account, and it is no longer accessible on the platform.

    Request taking a long time

    If a request fails, ofscraper will pause and try again a few times. This can lead to certain runs taking longer at points.

    Known Limitations

    • 1000 likes is the max per day

    Migrating from DC script

    You will need to change the settings so that the metadata option is compatible with your current folders. Additionally, you might want to set the save_path, dir_path, and filename options so they produce similar output.

    The metadata path from digitalcriminal's script is used for the duplicate check, so as long as you set the right path, files won't download a second time.
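As a sketch, the relevant config entries might look like the fragment below. The key names mirror the options mentioned above, but the exact spelling, placeholders, and paths are assumptions; check the Config-Options wiki page for the real schema:

```json
{
  "save_path": "/path/to/your/existing/library",
  "dir_path": "{model_username}",
  "filename": "{filename}.{ext}",
  "metadata": "/path/to/dc-script/metadata"
}
```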

    https://github.com/datawhores/OF-Scraper/wiki/Migrating-from-DC-script
    https://github.com/datawhores/OF-Scraper/wiki/Config-Options
    https://github.com/datawhores/OF-Scraper/wiki/Customizing-save-path

    Ask in the Discord or open an issue if you need help with what to change to accomplish this.

    Discord

    https://discord.gg/wN7uxEVHRK

    Support

    buymeacoffee.com/datawhores

    BTC: bc1qcnnetrgmtr86rmqvukl2e24f5upghpcsqqsy87

    ETH: 0x1d427a6E336edF6D95e589C16cb740A1372F6609


Project details


Release history

This version

2.4

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ofscraper-2.4.tar.gz (57.9 kB)

Uploaded Source

Built Distribution

ofscraper-2.4-py3-none-any.whl (76.6 kB)

Uploaded Python 3

File details

Details for the file ofscraper-2.4.tar.gz.

File metadata

  • Download URL: ofscraper-2.4.tar.gz
  • Upload date:
  • Size: 57.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.10.6 Linux/5.15.0-1038-azure

File hashes

Hashes for ofscraper-2.4.tar.gz:

  • SHA256: ab4808aa6f4a3fe7d8ec9780e8db1cbc00c44e9288c633f74c79566d90e5433e
  • MD5: 37a75a34e92b765f873feaa911a0fca8
  • BLAKE2b-256: b344e7c9c3a567cf0b9a75589f79e83d63aaa0a57a7a55f9628d7a8de98a6b5d
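If you download the sdist manually, you can check it against the published SHA256 digest before installing; a small helper sketch:

```shell
# verify_sha256 FILE EXPECTED_DIGEST -> succeeds only if the digests match
verify_sha256() {
  actual=$(sha256sum "$1" | cut -d' ' -f1)
  test "$actual" = "$2"
}
# e.g., for the sdist above:
# verify_sha256 ofscraper-2.4.tar.gz ab4808aa6f4a3fe7d8ec9780e8db1cbc00c44e9288c633f74c79566d90e5433e
```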

See the PyPI documentation for more details on using hashes.

File details

Details for the file ofscraper-2.4-py3-none-any.whl.

File metadata

  • Download URL: ofscraper-2.4-py3-none-any.whl
  • Upload date:
  • Size: 76.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.10.6 Linux/5.15.0-1038-azure

File hashes

Hashes for ofscraper-2.4-py3-none-any.whl:

  • SHA256: c8d867829bca98bb6aeb34b011e81d007d7c966787a9653f50633ad8358b10a4
  • MD5: 50a445641789d162949dddf91dba8b49
  • BLAKE2b-256: 47af175d8a19ee90b2e300fa8de8c566307f77c5cac55fd0b30dfe01b1b089cc

