
Dakara Feeder


Dakara Feeder allows you to feed the database of the Dakara server remotely.

Installation

This repository is tied to the Dakara server, so you should set up the server first. Other components of the Dakara Project, such as the player and the web client, complement the feeder.

System requirements

  • Python 3, to run the feeder (supported versions: 3.7, 3.8, 3.9, 3.10 and 3.11);
  • FFmpeg, to extract lyrics and metadata from files (preferred way);
  • MediaInfo, to extract metadata from files (slower alternative, may not work on Windows).

Linux and Windows are supported.

Virtual environment

It is strongly recommended to use the Dakara feeder within a virtual environment.
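
For instance, with the standard venv module (a minimal sketch; adapt the paths and the activation command to your shell):

python -m venv .venv
source .venv/bin/activate
# on Windows: .venv\Scripts\activate
pip install --upgrade pip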

Install

Please ensure you have a recent enough version of setuptools:

pip install --upgrade "setuptools>=46.4.0"

Install the package with:

pip install dakarafeeder

If you have downloaded the repo, you can install the package directly with:

pip install .

Usage

Commands

The package provides the dakara-feeder feed command to create data on a running instance of the Dakara server. Several sub-commands are available. To begin with, dakara-feeder feed songs finds songs in the configured directory, parses them and sends their data:

dakara-feeder feed songs
# or
python -m dakara_feeder feed songs

An instance of the Dakara server must be running when these commands are called.

By default, the data extracted from songs are very limited, as data can be stored in many different ways. You are encouraged to write your own parser (see the "Making a custom parser" section below).

Then, dakara-feeder feed tags and dakara-feeder feed work-types will find tags and work types in a YAML file (see the "Tags and work types file" section below):

dakara-feeder feed tags path/to/tags.yaml
# or
python -m dakara_feeder feed tags path/to/tags.yaml

and:

dakara-feeder feed work-types path/to/work_types.yaml
# or
python -m dakara_feeder feed work-types path/to/work_types.yaml

Also, dakara-feeder feed works will find works in a JSON file (see the "Works file" section below):

dakara-feeder feed works path/to/works.json
# or
python -m dakara_feeder feed works path/to/works.json

For more help:

dakara-feeder -h
# or
python -m dakara_feeder -h

Before calling any command, you should create a config file with:

dakara-feeder create-config
# or
python -m dakara_feeder create-config

and complete it with your values. The file is stored in your user space: ~/.config/dakara on Linux, or $APPDATA\DakaraProject\dakara on Windows.

Configuration

The configuration file is created with the command above. Several aspects of the feeder can be configured with this file; please check the documentation included in the file itself.

Authentication to the server can be done with username and password, or with a token that can be copied from the web client. Please note that only a library manager can use the feeder.
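
As an indication, a completed configuration could look like the sketch below. The key names shown here (server, address, login, password, kara_folder) are assumptions used for illustration only; the generated file remains the authoritative reference:

# Illustrative sketch only: check the generated configuration file for the actual keys.
server:
  address: https://dakara.example.com  # placeholder server address
  login: librarian                     # a library manager account
  password: "********"                 # or provide a token copied from the web client
kara_folder: /path/to/song/library     # directory scanned by "dakara-feeder feed songs"
custom_song_class: my_song.Song        # optional, see "Making a custom parser" below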

Making a custom parser

To override the extraction of data from song files, you should create a class derived from dakara_feeder.song.BaseSong. Please refer to the documentation of this class to learn which methods to override, and what attributes and helpers are at your disposal.

Here is a basic example. It assumes that the song video file is named in the form "title - main artist.ext":

# my_song.py
from dakara_feeder.song import BaseSong

class Song(BaseSong):
    def get_title(self):
        return self.video_path.stem.split(" - ")[0]

    def get_artists(self):
        return [{"name": self.video_path.stem.split(" - ")[1]}]
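
If some files do not follow this naming convention, the split yields a single element and get_artists raises an IndexError. A slightly more defensive sketch, under the same naming assumption (and assuming an empty artist list is acceptable for such files):

# my_song.py - defensive variant of the example above
from dakara_feeder.song import BaseSong

SEPARATOR = " - "

class Song(BaseSong):
    def get_title(self):
        # part before the separator, or the full stem if there is no separator
        return self.video_path.stem.split(SEPARATOR)[0]

    def get_artists(self):
        # part after the separator, or no artist at all
        parts = self.video_path.stem.split(SEPARATOR)
        return [{"name": parts[1]}] if len(parts) > 1 else []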

To register your custom Song class, simply indicate it in the configuration file, either as a file path or as an importable module:

custom_song_class: path/to/my_song.py::Song
# or
custom_song_class: my_song.Song

Now, dakara-feeder will use your customized Song class instead of the default one.

Tags and work types file

While data from songs are extracted directly from song files, data for tags and work types are read from a YAML file. Both can coexist in the same file.

Tags

Tags are searched for in the tags key. A tag is identified by its name, which should be a single word and is displayed in upper case. You can also provide a color hue (an integer from 0 to 360):

tags:
  - name: PV
    color_hue: 162
  - name: AMV
    color_hue: 140

Work types

Work types are searched for in the worktypes key. A work type is identified by its query name (a hyphenated name with no special characters, used as a keyword for queries). You can also provide a display name (singular and plural) and an icon name (chosen among the FontAwesome glyphs):

worktypes:
  - query_name: anime
    name: Anime
    name_plural: Animes
    icon_name: television
  - query_name: live-action
    name: Live action
    name_plural: Live actions
    icon_name: film

Works file

You can provide more information about works (especially alternative titles) through a JSON file. The file should contain a dictionary where keys are work type query names and values are lists of work representations:

{
  "work_type_1":
    [
      {
        "title": "Work 1",
        "subtitle": "Subtitle 1",
        "alternative_titles": [
          {
            "title": "AltTitle 1"
          },
          {
            "title": "AltTitle 2"
          }
        ]
      },
      {
        "title": "Work 2",
        "subtitle": "Subtitle 2"
      }
    ],
  "work_type_2": []
}

Works are matched with existing works on the server by work type, title and subtitle, case-insensitively.
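
Before feeding a large file, you may want to sanity-check its structure. The standalone sketch below is not part of dakara-feeder; it only uses the standard library and the layout described above:

# check_works.py - standalone sanity check for a works JSON file (not part of dakara-feeder)
import json
import sys

def check(path):
    with open(path, encoding="utf-8") as file:
        data = json.load(file)

    assert isinstance(data, dict), "top level must be a dictionary keyed by work type query name"
    for work_type, works in data.items():
        assert isinstance(works, list), f"'{work_type}' must map to a list of works"
        for work in works:
            assert "title" in work, f"a work under '{work_type}' is missing its title"

    print(f"{path}: structure looks OK")

if __name__ == "__main__":
    check(sys.argv[1])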

Development

Please read the developers documentation.

MIT License

Copyright (c) 2022 Dakara Project

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

