

FALocalRepo


Pure Python program to download any user's gallery/scraps/favorites from the Fur Affinity forum in an easily handled database.

Introduction

This program was born with the desire to provide a relatively easy-to-use method for FA users to download submissions that they care about from the forum.

The data is stored in a SQLite database, and the submission files are saved in a tiered tree structure based on their IDs. Using SQLite instead of a client-server database makes the program extremely portable, requiring only a working Python 3.8+ installation, and allows the downloaded data to be moved and backed up by simply moving/copying the database file and submission files folder.
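
Because the whole repository lives in a single SQLite file, it can be inspected with any SQLite client. A minimal Python sketch, assuming the default FA.db file name and the table names listed in the #Database section:

```python
import sqlite3

# Open the repository database directly; "FA.db" is the default name.
con = sqlite3.connect("FA.db")

# List the tables the program creates (SETTINGS, USERS, SUBMISSIONS, JOURNALS).
tables = [row[0] for row in con.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
print(tables)
con.close()
```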

All download operations are performed through the custom Fur Affinity scraping library faapi. To ensure proper crawling behavior, the library strictly follows Fur Affinity's robots.txt in regard to allowed paths and crawl delay. Furthermore, submission file downloads are throttled to 100 KB/s to ensure the program won't use too much bandwidth.
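
The 100 KB/s cap amounts to pausing after each downloaded chunk for long enough that the average transfer rate stays under the limit. This is an illustrative sketch of that idea, not the library's actual implementation:

```python
import time

RATE_LIMIT = 100 * 1024  # bytes per second (100 KB/s)

def throttle_delay(chunk_size: int, rate: int = RATE_LIMIT) -> float:
    """Seconds to pause after writing a chunk to stay under `rate` B/s."""
    return chunk_size / rate

def save_throttled(chunks, out):
    # Write each chunk, then sleep so the average speed never
    # exceeds RATE_LIMIT.
    for chunk in chunks:
        out.write(chunk)
        time.sleep(throttle_delay(len(chunk)))
```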

The database and file-storage functions are handled independently by the falocalrepo-database package which performs all transactions, queries, and file operations.

The falocalrepo-server package is used to provide the server functionalities of the program.

Contents

  1. Installation and Update
  2. Cookies
  3. Usage
    1. Environmental Variables
    2. Help
    3. Init
    4. Configuration
    5. Download
    6. Database
  4. Database
    1. Settings
    2. Users
    3. Submissions
    4. Journals
  5. Submission Files
  6. Upgrading Database
  7. Contributing
  8. Issues
  9. Appendix

Installation and Update

To install the program, simply use Python pip to get the package falocalrepo.

python3 -m pip install falocalrepo

Python 3.8 or above is needed to run this program; all other dependencies are handled by pip during installation. For information on how to install Python on your computer, refer to the official website, Python.org.

To upgrade falocalrepo and its dependencies, use pip to upgrade all of its components.

python3 -m pip install --upgrade falocalrepo faapi falocalrepo-database falocalrepo-server

A message will be displayed when running the program if there is an update available for any component.

The program needs cookies from a logged-in Fur Affinity session to download protected pages. Without the cookies the program can still download publicly available pages, but others will return empty. See #Cookies for more details on which cookies to use.

Warning: the Fur Affinity theme template must be set to "modern". It can be changed at furaffinity.net/controls/settings/.

Cookies

The scraping library used by this program needs two specific cookies from a logged-in Fur Affinity session. These are cookie a and cookie b.

As of 2020-08-09 these take the form of hexadecimal strings like 356f5962-5a60-0922-1c11-65003b703038.

The easiest way to obtain these cookies is by using a browser extension to extract your cookies and then search for a and b.
Alternatively, the storage inspection tool of a desktop browser can also be used. For example on Mozilla's Firefox this can be opened with the ⇧F9 shortcut.

To set the cookies use the config cookies command. See #Configuration for more details.

Usage

How to Read Usage Instructions

  • command a static command keyword
  • <arg> <param> <value> an argument, parameter, value, etc. that must be provided to a command
  • [<arg>] an optional argument that can be omitted
  • <arg1> | <arg2> mutually exclusive arguments, only use one

To run the program, simply call falocalrepo in your shell after installation.

Running without arguments will prompt a help message with all the available options and commands.

The usage pattern for the program is as follows:

falocalrepo [-h | -v | -d | -s] [<command> [<operation>] [<arg1> ... <argN>]]

Available options are:

  • -h, --help show help message
  • -v, --version show program version
  • -d, --database show database version
  • -s, --server show server version

Available commands are:

  • help display the manual of a command
  • init create the database and exit
  • config manage settings
  • download perform downloads
  • database operate on the database

Note: all the commands except help will create and initialise the database if it is not present in the folder.

Note: only one instance of the program is allowed at any given time

When the database is first initialised, the submission files folder defaults to FA.files. This value can be changed using the config command.

Cookies need to be set manually with the config command before the program will be able to access protected pages.

Environmental Variables

falocalrepo supports the following environmental variables:

  • FALOCALREPO_DATABASE sets a path for the database rather than using the current folder. If the path basename ends with .db (e.g. ~/Documents/FA/MyFA.db), then a database file will be created/opened with that name. Otherwise, the path will be considered a folder, and a database named "FA.db" will be created therein.
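
The path interpretation described above can be sketched as follows; resolve_database_path is an illustrative helper, not part of the program:

```python
from pathlib import Path

def resolve_database_path(value: str) -> Path:
    """Interpret a FALOCALREPO_DATABASE value: a *.db basename is used
    as the database file itself, any other path is treated as a folder
    that will hold a database named FA.db."""
    path = Path(value).expanduser()
    return path if path.name.endswith(".db") else path / "FA.db"

print(resolve_database_path("~/Documents/FA/MyFA.db").name)  # → MyFA.db
print(resolve_database_path("~/Documents/FA").name)          # → FA.db
```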

Help

help [<command> [<operation>]]

The help command gives information on the usage of the program and its commands and operations.

falocalrepo help
falocalrepo help download
falocalrepo help database search-users

Init

The init command initialises the database or, if one is already present, updates it to a new version - if available - and then exits.

It can be used to create the database and then manually edit it, or to update it to a new version without calling other commands.

Configuration

config [<setting> [<value1>] ... [<valueN>]]

The config command allows changing the settings used by the program.

Running the command alone will list the current values of the settings stored in the database. Running config <setting> without value arguments will show the current value of that specific setting.

Available settings are:

  • list list stored settings.
  • cookies [<cookie a> <cookie b>] the cookies stored in the database.
falocalrepo config cookies 38565475-3421-3f21-7f63-3d341339737 356f5962-5a60-0922-1c11-65003b703038
  • files-folder [<new folder>] the folder used to store submission files. This can be any path relative to the folder of the database. If a new value is given, the program will move any files to the new location.
falocalrepo config files-folder SubmissionFiles

Download

download <operation> [<option>=<value>] [<arg1>] ... [<argN>]

The download command performs all download and repository update operations.

Available operations are:

  • users <user1>[,...,<userN>] <folder1>[,...,<folderN>] download specific user folders. Requires two arguments with comma-separated users and folders. Prepending list- to a folder lists all remote items in that user folder without downloading them. Supported folders are:
    • gallery
    • scraps
    • favorites
    • journals
falocalrepo download users tom,jerry gallery,scraps,journals
falocalrepo download users tom,jerry list-favorites
  • update [stop=<n>] [<user1>,...,<userN>] [<folder1>,...,<folderN>] update the repository by checking the previously downloaded folders (gallery, scraps, favorites, or journals) of each user, stopping when a submission that is already present in the repository is found. A list of users and/or folders can be passed to restrict the update to those already in the database. To skip users, use @ as the argument. The stop=<n> option stops the update after finding n already-present submissions in a user's database entry; it defaults to 1. If a user is deactivated, the folders in the database will be prepended with a '!', and the user will be skipped when update is called again.
falocalrepo download update stop=5
falocalrepo download update @ gallery,scraps
falocalrepo download update tom,jerry
  • submissions <id1> ... [<idN>] download specific submissions. Requires submission IDs provided as separate arguments.
falocalrepo download submissions 12345678 13572468 87651234
  • journals <id1> ... [<idN>] download specific journals. Requires journal IDs provided as separate arguments.
falocalrepo download journals 123456 135724 876512

Database

database [<operation> [<param1>=<value1> ... <paramN>=<valueN>]]

The database command allows operating on the database. Used without an operation, it shows the database information, statistics (number of users and submissions and time of last update), and version.

All search operations are conducted case-insensitively using the SQLite LIKE expression, which allows limited pattern matching. For example, the expression %cat%mouse% can be used to search for two words separated by an unknown number of characters. Fields without wildcards will only match an exact result, i.e. cat will only match a field equal to cat, whereas %cat% will match any field that contains cat.

All search operations support the extra order, limit, and offset parameters, with values in SQLite ORDER BY, LIMIT, and OFFSET clause format. The order parameter supports all fields of the specific search command.
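
The matching behaviour described above can be reproduced with Python's built-in sqlite3 module on a throwaway in-memory table (the column and sample values below are made up for illustration; the real columns are listed in #Submissions):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE submissions (tags TEXT)")
con.executemany("INSERT INTO submissions VALUES (?)",
                [("cat,cartoon,mouse",), ("dog,bone",), ("cat",)])

# '%cat%mouse%' matches the two words with anything in between...
rows = con.execute(
    "SELECT tags FROM submissions WHERE tags LIKE ? ORDER BY tags LIMIT 10",
    ("%cat%mouse%",)).fetchall()
print(rows)

# ...while a pattern without wildcards only matches an exact value.
exact = con.execute(
    "SELECT tags FROM submissions WHERE tags LIKE ?", ("cat",)).fetchall()
print(exact)
```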

Available operations are:

  • info show database information, statistics and version.
  • history show commands history
  • search-users [<param1>=<value1>] ... [<paramN>=<valueN>] search the users entries using metadata fields. Search parameters can be passed multiple times to act as OR values. All columns of the users table are supported: #Users. Parameters can be lowercase. If no parameters are supplied, a list of all users will be returned instead.
falocalrepo database search-users folders=%gallery% gallery=%0012345678%
  • search-submissions [<param1>=<value1>] ... [<paramN>=<valueN>] search the submissions entries using metadata fields. Search parameters can be passed multiple times to act as OR values. All columns of the submissions table are supported: #Submissions. Parameters can be lowercase. If no parameters are supplied, a list of all submissions will be returned instead.
falocalrepo database search-submissions tags=%cat,%mouse% date=2020-% category=%artwork% order="AUTHOR" order="ID"
falocalrepo database search-submissions tags=%cat% tags=%mouse% date=2020-% category=%artwork%
  • search-journals [<param1>=<value1>] ... [<paramN>=<valueN>] search the journals entries using metadata fields. Search parameters can be passed multiple times to act as OR values. All columns of the journals table are supported: #Journals. Parameters can be lowercase. If no parameters are supplied, a list of all journals will be returned instead.
falocalrepo database search-journals date=2020-% author=CatArtist order="ID DESC"
falocalrepo database search-journals date=2020-% date=2019-% content=%commission%
  • add-submission <param1>=<value1> ... <paramN>=<valueN> add a submission to the database manually. The submission file is not downloaded and can instead be provided with the extra parameter file_local_url. The following parameters are necessary for a submission entry to be accepted:
    • id submission id
    • title
    • author
    • date date in the format YYYY-MM-DD
    • category
    • species
    • gender
    • rating
      The following parameters are optional:
    • tags comma-separated tags
    • description
    • file_url the url of the submission file, not used to download the file
    • file_local_url if provided, take the submission file from this path and put it into the database
falocalrepo database add-submission id=12345678 'title=cat & mouse' author=CartoonArtist \
    date=2020-08-09 category=Artwork 'species=Unspecified / Any' gender=Any rating=General \
    tags=cat,mouse,cartoon 'description=There once were a cat named Tom and a mouse named Jerry.' \
    'file_url=http://remote.url/to/submission.file' file_local_url=path/to/submission.file
  • add-journal <param1>=<value1> ... <paramN>=<valueN> add a journal to the database manually. The following parameters are necessary for a journal entry to be accepted:
    • id journal id
    • title
    • author
    • date date in the format YYYY-MM-DD
      The following parameters are optional:
    • content the body of the journal
falocalrepo database add-journal id=12345678 title="An Update" author=CartoonArtist \
    date=2020-08-09 content="$(cat journal.html)"
  • remove-users <user1> ... [<userN>] remove specific users from the database.
falocalrepo database remove-users jerry
  • remove-submissions <id1> ... [<idN>] remove specific submissions from the database.
falocalrepo database remove-submissions 12345678 13572468 87651234
  • remove-journals <id1> ... [<idN>] remove specific journals from the database.
falocalrepo database remove-journals 123456 135724 876512
  • server [host=<host>] [port=<port>] starts a server at <host>:<port> to navigate the database using falocalrepo-server. Defaults to 0.0.0.0:8080. See falocalrepo-server for more details on usage.
falocalrepo database server host=127.0.0.1 port=5000
  • merge <path> merge (or create) the database in the current folder with a second database located at path. path must point to the database file itself.
falocalrepo database merge ~/Documents/FA/FA.db
  • clean clean the database using the SQLite VACUUM function. Requires no arguments.

Database

To store the metadata of the downloaded submissions, journals, users, cookies and statistics, the program uses a SQLite3 database. This database is built to be as light as possible while also containing all the metadata that can be extracted from a submission page.

To store all this information, the database uses four tables: SETTINGS, USERS, SUBMISSIONS and JOURNALS.

Settings

The settings table contains settings for the program and statistics of the database.

  • USRN number of users in the USERS table
  • SUBN number of submissions in the SUBMISSIONS table
  • HISTORY list of executed commands in the format [[<time1>, "<command1>"], ..., [<timeN>, "<commandN>"]] (UNIX time in seconds)
  • COOKIES cookies for the scraper, stored in JSON format
  • FILESFOLDER location of downloaded submission files
  • VERSION database version, this can differ from the program version
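
The HISTORY value can be parsed directly, assuming the stored string is JSON-compatible; the sample value below is made up for illustration:

```python
import json
from datetime import datetime, timezone

# A HISTORY-style value: a list of [UNIX time, command] pairs.
history_raw = '[[1596978000, "download update"], [1597064400, "database clean"]]'

entries = json.loads(history_raw)
for timestamp, command in entries:
    when = datetime.fromtimestamp(timestamp, tz=timezone.utc)
    print(f"{when:%Y-%m-%d %H:%M} {command}")
```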

Users

The users table contains a list of all the users that have been downloaded with the program, the folders that have been downloaded, and the submissions found in each of those.

Each entry contains the following fields:

  • USERNAME The URL username of the user (no underscores or spaces)
  • FOLDERS the folders downloaded for that specific user.
  • GALLERY
  • SCRAPS
  • FAVORITES
  • MENTIONS this is a legacy entry used by the program up to version 2.11.2 (was named EXTRAS)
  • JOURNALS

Submissions

The submissions table contains the metadata of the submissions downloaded by the program and information on their files.

  • ID the id of the submission
  • AUTHOR the username of the author (uploader) in full format
  • TITLE
  • DATE upload date in the format YYYY-MM-DD
  • DESCRIPTION description in html format
  • TAGS keywords sorted alphanumerically and comma-separated
  • CATEGORY
  • SPECIES
  • GENDER
  • RATING
  • FILELINK the remote URL of the submission file
  • FILEEXT the extension of the downloaded file. Can be empty if the file contained errors and could not be recognised upon download
  • FILESAVED 1 if the file was successfully downloaded and saved, 0 if there was an error during download

Journals

The journals table contains the metadata of the journals downloaded by the program.

  • ID the id of the journal
  • AUTHOR the username of the author (uploader) in full format
  • TITLE
  • DATE upload date in the format YYYY-MM-DD
  • CONTENT content in html format

Submission Files

Submission files are saved in a tiered tree structure based on their submission ID. IDs are zero-padded to 10 digits and then broken up into 5 segments of 2 digits; each of these segments represents a folder that will be created in the tree.

For example, submission 1457893 will be padded to 0001457893 and divided into 00, 01, 45, 78, 93. The submission file will then be saved as 00/01/45/78/93/submission.file with the correct extension extracted from the file itself - Fur Affinity links do not always contain the right extension.
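
The padding and splitting steps above can be sketched as follows; tiered_path is an illustrative helper, not the program's own function:

```python
def tiered_path(submission_id: int) -> str:
    """Zero-pad the submission ID to 10 digits and split it into
    five 2-digit folder segments, joined as a relative path."""
    padded = f"{submission_id:010}"
    return "/".join(padded[i:i + 2] for i in range(0, 10, 2))

print(tiered_path(1457893))  # → 00/01/45/78/93
```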

Upgrading Database

When the program starts, it checks the version of the database against the one used by the program; if the latter is more advanced, it upgrades the database.

Note: versions before 2.7.0 are not supported by falocalrepo version 3.0.0 and above. To update from those to the new version, use version 2.11.2 to update the database to version 2.7.0.

For details on upgrades and changes between database versions, see falocalrepo-database.

Contributing

All contributions and suggestions are welcome!

The only requirement is that any merge request must be sent to the GitLab project, as the one on GitHub is only a mirror: GitLab/FALocalRepo

If you have suggestions for fixes or improvements, you can open an issue with your idea, see #Issues for details.

Issues

If any problem is encountered during usage of the program, an issue can be opened on the project's pages on GitLab (preferred) or GitHub (mirror repository).

Issues can also be used to suggest improvements and features.

When opening an issue for a problem, please copy the error message and describe the operation in progress when the error occurred.

Appendix

Earlier Releases

Release 3.0.0 was deleted from PyPI because of an error in the package information. However, it can still be found in the code repository under tag v3.0.0.

Release binaries for versions 2.11.x can be found on GitLab under tags -> FALocalRepo/tags 2.11

Release binaries before and including 2.10.2 can be found on GitHub -> Releases.
