
DeepBach implementation for harmonization

Project description

deepbach-pytorch

pip install deepbach-pytorch==0.4.3

For detailed usage instructions for the package API, please refer to test.py.

DeepBach

This repository contains implementations of the DeepBach model described in

DeepBach: a Steerable Model for Bach chorales generation
Gaëtan Hadjeres, François Pachet, Frank Nielsen
ICML 2017 arXiv:1612.01010

The code uses Python 3.9 together with the PyTorch 2.0 and music21 libraries.

For the original Keras version, please check out the original_keras branch.

Examples of music generated by DeepBach are available on this website.

Models, dataset caches, and the deployment script are available on Google Drive.

Installation

You can clone this repository, install the dependencies using Anaconda, and download a pretrained model together with a dataset
with the following commands:

git clone git@github.com:Ghadjeres/DeepBach.git
cd DeepBach
conda env create --name deepbach_pytorch -f environment.yml
bash dl_dataset_and_models.sh

This will create a conda env named deepbach_pytorch.

music21 editor

You might need to configure the music editor called by music21. On Ubuntu you can, e.g., use MuseScore:

sudo apt install musescore
python -c 'import music21; music21.environment.set("musicxmlPath", "/usr/bin/musescore")'

For usage on a headless server (no X server), just set it to a dummy command:

python -c 'import music21; music21.environment.set("musicxmlPath", "/bin/true")'

Usage

Usage: deepBach.py [OPTIONS]

Options:
  --note_embedding_dim INTEGER    size of the note embeddings
  --meta_embedding_dim INTEGER    size of the metadata embeddings
  --num_layers INTEGER            number of layers of the LSTMs
  --lstm_hidden_size INTEGER      hidden size of the LSTMs
  --dropout_lstm FLOAT            amount of dropout between LSTM layers
  --linear_hidden_size INTEGER    hidden size of the Linear layers
  --batch_size INTEGER            training batch size
  --num_epochs INTEGER            number of training epochs
  --train                         train or retrain the specified model
  --num_iterations INTEGER        number of parallel pseudo-Gibbs sampling
                                  iterations
  --sequence_length_ticks INTEGER
                                  length of the generated chorale (in ticks)
  --help                          Show this message and exit.
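The --num_iterations option controls the pseudo-Gibbs sampling procedure described in the paper: generation starts from a random chorale and repeatedly resamples randomly chosen (voice, tick) cells from the model's conditional distributions. A toy, sequential sketch of this loop (the uniform `sample_note` is a stand-in for DeepBach's learned voice-specific networks, and the grid sizes are illustrative assumptions):

```python
# Toy sketch of pseudo-Gibbs resampling over a (voice, tick) grid.
# A trained model would condition on the surrounding notes; here the
# conditional is replaced by a uniform draw for illustration only.
import random

NUM_VOICES, NUM_TICKS, NUM_NOTES = 4, 64, 20  # illustrative sizes

def sample_note(grid, voice, tick):
    # Stand-in for the learned conditional distribution.
    return random.randrange(NUM_NOTES)

def pseudo_gibbs(num_iterations, seed=0):
    random.seed(seed)
    # Start from a random chorale.
    grid = [[random.randrange(NUM_NOTES) for _ in range(NUM_TICKS)]
            for _ in range(NUM_VOICES)]
    # Repeatedly resample randomly chosen cells.
    for _ in range(num_iterations):
        v = random.randrange(NUM_VOICES)
        t = random.randrange(NUM_TICKS)
        grid[v][t] = sample_note(grid, v, t)
    return grid

chorale = pseudo_gibbs(num_iterations=1000)
```

More iterations give the sampler more chances to make the grid locally consistent under the model, which is why --num_iterations trades generation time against quality.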

Examples

You can generate a four-bar chorale with the pretrained model and display it in MuseScore by simply running

python deepBach.py

You can train a new model from scratch by adding the --train flag.
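The length of the generated chorale is set via --sequence_length_ticks. Assuming the paper's sixteenth-note quantization (4 ticks per quarter-note beat, an assumption made explicit below), a four-bar chorale in 4/4 spans 64 ticks:

```python
# Sketch: convert bars to ticks under a sixteenth-note grid.
SUBDIVISIONS_PER_BEAT = 4  # assumption: sixteenth-note quantization
BEATS_PER_BAR = 4          # assumption: 4/4 chorale

def bars_to_ticks(num_bars: int) -> int:
    return num_bars * BEATS_PER_BAR * SUBDIVISIONS_PER_BEAT

print(bars_to_ticks(4))  # → 64
```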

Usage with NONOTO

The command

python flask_server.py

starts a Flask server listening on port 5000. You can then use NONOTO to compose with DeepBach in an interactive way.
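Before pointing NONOTO at the server, you can check that something is listening on port 5000. The sketch below makes no assumption about the server's routes; it only tests TCP reachability:

```python
# Minimal reachability check for the Flask server on port 5000.
import socket

def server_is_up(host: str = "localhost", port: int = 5000) -> bool:
    try:
        with socket.create_connection((host, port), timeout=2):
            return True
    except OSError:
        return False
```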

This server can also be started using Docker with:

docker run -p 5000:5000 -it --rm ghadjeres/deepbach

(CPU version), or with

docker run --runtime=nvidia -p 5000:5000 -it --rm ghadjeres/deepbach

(GPU version, requires nvidia-docker).

Usage within MuseScore

Deprecated

Put deepBachMuseScore.qml file in your MuseScore plugins directory, and run

python musescore_flask_server.py

Open MuseScore and activate the deepBachMuseScore plugin using the Plugin Manager. In the plugin window, enter the server address and press Enter; a list of available models should appear. Select and (re)load a model. You can then click on the Compose button without any selection to create a new chorale from scratch, or select a region in the chorale score and click on the Compose button to regenerate this region using DeepBach.

Issues

Music21 editor not set

music21.converter.subConverters.SubConverterException: Cannot find a valid application path for format musicxml. Specify this in your Environment by calling environment.set(None, '/path/to/application')

Either set it to MuseScore or similar (on a machine with a GUI) or to a dummy command (on a server). See the Installation section.

Citing

Please consider citing this work or emailing me if you use DeepBach in musical projects.

@InProceedings{pmlr-v70-hadjeres17a,
  title =     {{D}eep{B}ach: a Steerable Model for {B}ach Chorales Generation},
  author =    {Ga{\"e}tan Hadjeres and Fran{\c{c}}ois Pachet and Frank Nielsen},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages =     {1362--1371},
  year =      {2017},
  editor =    {Doina Precup and Yee Whye Teh},
  volume =    {70},
  series =    {Proceedings of Machine Learning Research},
  address =   {International Convention Centre, Sydney, Australia},
  month =     {06--11 Aug},
  publisher = {PMLR},
  pdf =       {http://proceedings.mlr.press/v70/hadjeres17a/hadjeres17a.pdf},
  url =       {http://proceedings.mlr.press/v70/hadjeres17a.html},
}

Project details


Download files

Download the file for your platform.

Source Distribution

deepbach_pytorch-0.4.3.tar.gz (46.1 kB)

Uploaded Source

Built Distribution


deepbach_pytorch-0.4.3-py3-none-any.whl (53.0 kB)

Uploaded Python 3

File details

Details for the file deepbach_pytorch-0.4.3.tar.gz.

File metadata

  • Download URL: deepbach_pytorch-0.4.3.tar.gz
  • Upload date:
  • Size: 46.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.9.25

File hashes

Hashes for deepbach_pytorch-0.4.3.tar.gz
Algorithm Hash digest
SHA256 40458a89f62c2fe0b8002bde4bddd1b05814c6d816ffcaf2d012904724a7823d
MD5 d063fe6a2165708e19e9797164317df5
BLAKE2b-256 9742cf88514ea82b536a922d67f4fb0276b03cf0987d074c6ce8e8a91c6cf93a

See more details on using hashes here.
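The SHA256 listed above can be checked against a downloaded file; a minimal sketch using Python's hashlib (the filename is the sdist listed above):

```python
# Sketch: compute a file's SHA256 to compare against a published hash.
import hashlib

EXPECTED_SHA256 = "40458a89f62c2fe0b8002bde4bddd1b05814c6d816ffcaf2d012904724a7823d"

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# sha256_of("deepbach_pytorch-0.4.3.tar.gz") == EXPECTED_SHA256
```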

File details

Details for the file deepbach_pytorch-0.4.3-py3-none-any.whl.

File metadata

File hashes

Hashes for deepbach_pytorch-0.4.3-py3-none-any.whl
Algorithm Hash digest
SHA256 00845492e6f36439cb8487838c07599149a6681a26dc47e1d52a3e6eabe19232
MD5 bd190ba26ad10e831ac920a57b94465b
BLAKE2b-256 74a7603ef61ebdb5fcf07d392ad01407427149d763437778d0a90c7c16bc063e

