File scraper that lets clients sync with a mendia Rust application running on a server.

Project description

Mendia File Scraper

About

This is a client for mendia. It indexes local media (currently limited to movies), stores the findings in a local database and publishes new additions to the server running mendia.
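Conceptually, the scan step boils down to hashing media files and remembering which hashes have already been seen. The sketch below illustrates that idea in plain Python; the folder walk, the MD5 hashing, and the two-column schema are illustrative assumptions, not the package's actual implementation:

```python
import hashlib
import sqlite3
from pathlib import Path

MOVIE_EXTENSIONS = {".mkv", ".mp4", ".avi"}  # assumption: typical movie containers

def file_hash(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash the file contents so renamed copies are still recognised."""
    digest = hashlib.md5()
    with path.open("rb") as fh:
        while chunk := fh.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def scan(folders, db_path="database.db"):
    """Walk the media folders and return titles not yet in the local database."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS movies (title TEXT, hash TEXT PRIMARY KEY)")
    new_titles = []
    for folder in folders:
        for path in Path(folder).rglob("*"):
            if path.suffix.lower() not in MOVIE_EXTENSIONS:
                continue
            cur = con.execute("INSERT OR IGNORE INTO movies VALUES (?, ?)",
                              (path.stem, file_hash(path)))
            if cur.rowcount:  # only freshly inserted rows count as new
                new_titles.append(path.stem)
    con.commit()
    con.close()
    return new_titles
```

Only the titles returned as "new" would then be published to the server.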

Installation:

sudo apt update
sudo apt install libmediainfo0v5
pip install mendiafilescraper

Note: This package needs the mediainfo library.

  • Ubuntu/Debian: 'libmediainfo0v5'
  • Arch: 'libmediainfo'

Usage:

--setup:

Asks for

  • Username
  • Password
  • Media folders
  • Server address (e.g. wss://hostname/ws/, depending on the mendia server)

and stores everything in a config file in the home directory

~/.mendiafilescraper/config.txt

--scan:

Searches all given media folders for new additions and adds them to the database.

--publish:

Sends new additions to the configured mendia server. Use only together with --scan.

Example:

Setup

mendia-scraper --setup

Add your media paths, specify the server address, and enter your username and password.

Note: Ask the admin of your target mendia server to create a username/password for you.

First scan

The initial scan populates the local database. Do not use --publish for the first scan; otherwise the server might be flooded with old entries.

mendia-scraper --scan

Warning: Make sure that the initial scan worked before proceeding.

Real scan

After the first scan we can add --publish; from now on, new movies will be sent to mendia.

mendia-scraper --scan --publish

Cronjob

It makes sense to add the scan command to your crontab so that scans run automatically.

crontab -e

For a daily scan, add:

@daily mendia-scraper --scan --publish
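Cron discards output by default, so redirecting it to a log file makes failed scans easier to debug. The log path below is just a suggestion:

```shell
# Run daily; append both stdout and stderr to a log file for later inspection.
@daily mendia-scraper --scan --publish >> "$HOME/mendia-scraper.log" 2>&1
```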

Problems:

I have a new movie, but mendia did not announce it

You can delete the movie from the local database and then retry the scan.

Note: It is easier to use a GUI application with SQLite support, but typical NAS systems do not have a GUI.

sudo apt install sqlite3
cd ~/.MendiaFileScraper
sqlite3 database.db

Let's say we want to remove the movie "Captive State".

In the sqlite3 shell:

SELECT title, hash FROM movies WHERE instr(title, 'Captive') > 0;

If you do not see any result, play with the search parameters until you find the movie.

Let's say our result is:

Captive State|a67edf9ee879a7562c17092b97dfe672

The second value is the hash that was computed from the movie file. To delete the entry with the hash "a67edf9ee879a7562c17092b97dfe672", run:

DELETE FROM movies WHERE hash="a67edf9ee879a7562c17092b97dfe672";

Press CTRL+D to exit the sqlite3 shell.

Voilà, the movie is gone, and you can retry scanning with

mendia-scraper --scan --publish
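As an alternative, if the sqlite3 command-line tool is not available, the same deletion can be done with Python's standard library. This helper is a sketch based on the schema shown in the queries above:

```python
import sqlite3
from pathlib import Path

def delete_movie(db_path, movie_hash):
    """Remove one entry from the scraper's movies table by its hash."""
    con = sqlite3.connect(db_path)
    deleted = con.execute("DELETE FROM movies WHERE hash = ?", (movie_hash,)).rowcount
    con.commit()
    con.close()
    return deleted  # number of rows removed

# Example values from the walkthrough above:
# delete_movie(Path.home() / ".MendiaFileScraper" / "database.db",
#              "a67edf9ee879a7562c17092b97dfe672")
```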

Download files

Download the file for your platform.

Source Distribution

mendiafilescraper-3.0.2.tar.gz (12.8 kB)

Uploaded Source

Built Distribution


mendiafilescraper-3.0.2-py3-none-any.whl (13.8 kB)

Uploaded Python 3

File details

Details for the file mendiafilescraper-3.0.2.tar.gz.

File metadata

  • Download URL: mendiafilescraper-3.0.2.tar.gz
  • Upload date:
  • Size: 12.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.3.1 CPython/3.11.1 Linux/5.4.109+

File hashes

Hashes for mendiafilescraper-3.0.2.tar.gz
  • SHA256: 6932f1653008b78d88fecdfff19c728b11b568690a94aedc5e13b66b368084e6
  • MD5: def5f0e0984b951ef842c3b3659ffb19
  • BLAKE2b-256: 268abc8f930b223a5a54c321ffbad8e6946203ec695803db5b502e75b6ec29d9
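To check a downloaded file against the digests above, the standard library is enough. This helper is a generic sketch, not part of the package:

```python
import hashlib

def sha256_of(path):
    """Return the hex SHA256 digest of a file, read in 1 MiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        while chunk := fh.read(1 << 20):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result for mendiafilescraper-3.0.2.tar.gz
# with the SHA256 value listed above.
```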


File details

Details for the file mendiafilescraper-3.0.2-py3-none-any.whl.

File metadata

  • Download URL: mendiafilescraper-3.0.2-py3-none-any.whl
  • Upload date:
  • Size: 13.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.3.1 CPython/3.11.1 Linux/5.4.109+

File hashes

Hashes for mendiafilescraper-3.0.2-py3-none-any.whl
  • SHA256: 4dd0e85db8b029dcdbd13d049d0438980c22e632b2007b520b20b8c8317a0b17
  • MD5: a50e2c02308bd29d703ebef097b41f12
  • BLAKE2b-256: 37c42952181a6a05b1bc327beed7ccccdc24c3d1a988973c4c042a464e5385e6

