automatically scrape onlyfans
Project description
Releases
Docker and binary releases are also available.
Description
A command-line tool that lets you download media from OnlyFans and perform bulk actions, including liking or unliking posts.
At its inception, this project emerged as a fork of the original onlyfans-scraper. With invaluable support from the community and consistent script updates, we've undergone substantial architectural changes, resulting in a significantly revamped codebase compared to the original master. While some of these modifications are detailed HERE, most are only documented in the commit history.
This script has been thoughtfully crafted to facilitate seamless transitions from DIGITALCRIMINALS' script, ensuring robust compatibility and smooth feature migration. It also offers an extensive array of filtering features, giving users precise control over the specific content types they aim to scrape.
Documentation
Refer to the project documentation for detailed instructions on:
- Installation
- Running the tool
- Other pertinent information you might need
DISCLAIMERS:
- This tool is not affiliated, associated, or partnered with OnlyFans in any way. We are not authorized, endorsed, or sponsored by OnlyFans. All OnlyFans trademarks remain the property of Fenix International Limited.
- This is a theoretical program only and is for educational purposes. If you choose to use it, it may or may not work. You solely accept full responsibility for its use and indemnify the creator, hosts, contributors, and all other involved persons from any and all responsibility.
Issues
Open an issue in this repo, or mention your issue in the Discord.
Feature-request issues are fine; bug-report issues without the required material will be closed.
Private Reports
A ticket can be created in the ticket channel; only you and the admins have access to ticket discussions.
Feature Requests
ClearFlask Feedback or Discord
Migrating from DC script
To maintain compatibility with your current folders, make sure to modify the metadata option within the config file. Additionally, configure the save_path, dir_path, and filename settings to generate outputs that align with your existing setup.
The inherited metadata files from DIGITALCRIMINALS' script play a crucial role in preventing redundant downloads by acting as a check for duplicates.
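As a rough illustration of the adjustments described above, the relevant config-file keys might look something like this. The option names (`metadata`, `save_path`, `dir_path`, `filename`) are taken from the text above, but the values shown are purely illustrative placeholders, not documented defaults — consult the documentation or the Discord for the exact format your setup needs:

```json
{
  "metadata": "/path/to/old/dc-script/metadata",
  "save_path": "/path/to/existing/downloads",
  "dir_path": "ModelName/Posts",
  "filename": "original-filename.ext"
}
```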
For comprehensive guidance on making these adjustments, refer to the provided resources, ask in the Discord, or open an issue if you need help deciding what to change.
Discord
Support
buymeacoffee.com/datawhores
BTC: bc1qcnnetrgmtr86rmqvukl2e24f5upghpcsqqsy87
ETH: 0x1d427a6E336edF6D95e589C16cb740A1372F6609
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file ofscraper-3.10.1.tar.gz
File metadata
- Download URL: ofscraper-3.10.1.tar.gz
- Upload date:
- Size: 191.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.11.5 Linux/6.5.0-1021-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | 537b072d7c95914423000c216bc86dd9b2806a56ddf80dd09d85ff00c2f36253
MD5 | 4795e38c4a5efddfb4c12b07dde60742
BLAKE2b-256 | b29cd3355b8fc9b8d8eb16c6b587513831ad77d903e00df4b8cbd09e3088db51
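To confirm that a downloaded archive matches the published SHA256 digest in the table above, you can compute the digest locally. A minimal sketch (the helper function name is illustrative, not part of this project):

```python
import hashlib

# Published SHA256 digest for ofscraper-3.10.1.tar.gz, from the table above.
EXPECTED_SHA256 = "537b072d7c95914423000c216bc86dd9b2806a56ddf80dd09d85ff00c2f36253"

def sha256_of_file(path, chunk_size=65536):
    """Stream a file through SHA256 in chunks and return the hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Usage, after downloading the archive into the current directory:
#   sha256_of_file("ofscraper-3.10.1.tar.gz") == EXPECTED_SHA256
```

Streaming in chunks keeps memory use constant regardless of archive size.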
File details
Details for the file ofscraper-3.10.1-py3-none-any.whl
File metadata
- Download URL: ofscraper-3.10.1-py3-none-any.whl
- Upload date:
- Size: 300.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.11.5 Linux/6.5.0-1021-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | 4c46d387ce1c39ce5ee83baa3d876c8331d36332278bf785b48af01f213ef09d
MD5 | fd4fa3503d76250bb9e76c5e3d0d741b
BLAKE2b-256 | 1aa34e74105efeed4c913616c2088b97deeb283a78ead64b8571d0622686533f