File scraper for clients to sync with a mendia Rust application running on a server.

Mendia File Scraper

About

This is a client for mendia. It indexes local media (currently limited to movies), stores the findings in a local database and publishes new additions to the server running mendia.

Installation:

sudo apt update
sudo apt install libmediainfo0v5
pip install mendiafilescraper

Note: This package needs the mediainfo library.

Ubuntu/Debian: 'libmediainfo0v5'

Arch: 'libmediainfo'
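
The scraper reads media metadata through libmediainfo, typically accessed from Python via the pymediainfo bindings (an assumption about this package's internals, not documented here). A minimal sketch of such a lookup, with a placeholder file path:

from pymediainfo import MediaInfo

# Parse a media file and print basic video properties.
media_info = MediaInfo.parse("/path/to/movie.mkv")
for track in media_info.tracks:
    if track.track_type == "Video":
        print(track.format, track.width, track.height)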

Usage:

--setup:

Asks for

  • Username
  • Password
  • Media folders
  • Server address (e.g. wss://hostname/ws/, depending on the mendia server)

and stores everything in a config file in the home directory

~/.mendiafilescraper/config.txt
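
The exact layout of config.txt is not documented here; purely as an illustration, the stored values might look like this (the format and all field names are assumptions):

{
  "username": "alice",
  "password": "secret",
  "server": "wss://hostname/ws/",
  "media_folders": ["/mnt/media/movies"]
}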

--scan:

Searches all given media folders for new additions and adds them to the database.
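
Conceptually, a scan walks each media folder, computes a hash for every file, and inserts entries that are not yet in the database. The following Python sketch illustrates the idea; the table and column names are borrowed from the troubleshooting section below, while the file extension and hashing scheme are assumptions, not the scraper's actual code:

import hashlib
import sqlite3
from pathlib import Path

def file_hash(path):
    # Assumed scheme: hash only the first 64 MiB to keep scans fast.
    h = hashlib.md5()
    with path.open("rb") as f:
        h.update(f.read(64 * 1024 * 1024))
    return h.hexdigest()

def scan(folders, db_path):
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS movies (title TEXT, hash TEXT PRIMARY KEY)")
    for folder in folders:
        for path in Path(folder).rglob("*.mkv"):
            # INSERT OR IGNORE skips files whose hash is already known.
            con.execute("INSERT OR IGNORE INTO movies (title, hash) VALUES (?, ?)",
                        (path.stem, file_hash(path)))
    con.commit()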

--publish:

Sends new additions to the configured mendia server. Use only together with --scan.
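
Under the hood, publishing presumably pushes each new entry to the server over the configured WebSocket endpoint. A rough sketch using the third-party websockets package; authentication and the message schema are assumptions, not the real mendia protocol:

import asyncio
import json
import websockets

async def publish(server_url, new_movies):
    # Hypothetical message format: one JSON object per new movie.
    async with websockets.connect(server_url) as ws:
        for movie in new_movies:
            await ws.send(json.dumps(movie))

asyncio.run(publish("wss://hostname/ws/",
                    [{"title": "Captive State", "hash": "a67edf9ee879a7562c17092b97dfe672"}]))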

Example:

Settings

mendia-scraper --setup

Add media paths, specify the server address, and enter your username and password.

Note: Ask the admin of your target mendia server to create a username/password for you.

First scan

The initial scan populates the local database. Do not use --publish for the first scan; otherwise every movie found would be announced at once, which might flood the server.

mendia-scraper --scan

Warning: Make sure that the initial scan worked before proceeding.

Real scan

After the first scan we can add --publish; from now on, new movies will be sent to mendia.

mendia-scraper --scan --publish

Cronjob

It makes sense to add the scan command to your crontab for automatic scans.

crontab -e

For a daily scan, add:

@daily mendia-scraper --scan --publish

Problems:

I have a new movie, but mendia did not announce it

You can delete the affected movie from the local database and then rescan it.

Note: A GUI application with SQLite support is easier to use, but typical NAS systems have no GUI.

sudo apt install sqlite3
cd ~/.mendiafilescraper
sqlite3 database.db

Let's say we want to remove the movie "Captive State".

In the sqlite3 shell:

SELECT title, hash FROM movies WHERE instr(title, 'Captive') > 0;

If you do not see any results, adjust the search term until you find the movie.

Let's say our result is:

Captive State|a67edf9ee879a7562c17092b97dfe672

The second value is the hash that was computed from the movie file. To delete the entry with the hash "a67edf9ee879a7562c17092b97dfe672", we run:

DELETE FROM movies WHERE hash="a67edf9ee879a7562c17092b97dfe672";

Press CTRL+D to exit the sqlite3 shell.

Voilà, the movie is removed, and you can retry scanning with

mendia-scraper --scan --publish
