Project description

Bonito

A PyTorch Basecaller for Oxford Nanopore Reads.

$ pip install ont-bonito
$ bonito basecaller dna_r10.4_e8.1_sup@v3.4 /data/reads > basecalls.bam

Bonito supports writing aligned/unaligned {fastq, sam, bam, cram}.

$ bonito basecaller dna_r10.4_e8.1_sup@v3.4 --reference reference.mmi /data/reads > basecalls.bam
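
Because basecalls are written to stdout, the output can also be piped straight into downstream tools. For example, a sorted BAM could be produced with samtools (a sketch, assuming samtools is installed; it is not part of Bonito):

$ bonito basecaller dna_r10.4_e8.1_sup@v3.4 --reference reference.mmi /data/reads | samtools sort -o basecalls.sorted.bam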

Bonito will download and cache the basecalling model automatically on first use, but all models can be downloaded with:

$ bonito download --models --show  # show all available models
$ bonito download --models         # download all available models

The default ont-bonito package is built against CUDA 10.2; however, CUDA 11.1 and 11.3 builds are also available.

$ pip install -f https://download.pytorch.org/whl/torch_stable.html ont-bonito-cuda111
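
The CUDA 11.3 build can be installed in the same way (assuming the package follows the same naming pattern as the CUDA 11.1 build):

$ pip install -f https://download.pytorch.org/whl/torch_stable.html ont-bonito-cuda113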

Modified Bases

Modified base calling is handled by Remora.

$ bonito basecaller dna_r10.4_e8.1_sup@v3.4 /data/reads --modified-bases 5mC --reference ref.mmi > basecalls_with_mods.bam

To see the available modified base models, run:
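
$ remora model list_pretrained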

Training your own model

To train a model using your own reads, first basecall the reads with the additional --save-ctc flag and use the output directory as the input directory for training.

$ bonito basecaller dna_r9.4.1 --save-ctc --reference reference.mmi /data/reads > /data/training/ctc-data/basecalls.sam
$ bonito train --directory /data/training/ctc-data /data/training/model-dir
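
Once training completes, the resulting model directory can be passed to the basecaller in place of a pretrained model name (a sketch reusing the paths from the examples above):

$ bonito basecaller /data/training/model-dir /data/reads > basecalls.bam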

In addition to training a new model from scratch, you can also easily fine-tune one of the pretrained models.

$ bonito train --epochs 1 --lr 5e-4 --pretrained dna_r10.4_e8.1_sup@v3.4 --directory /data/training/ctc-data /data/training/fine-tuned-model

If you are interested in method development and don't have your own set of reads, a pre-prepared set is provided.

$ bonito download --training
$ bonito train /data/training/model-dir

All training calls use Automatic Mixed Precision to speed up training. To disable this, pass the --no-amp flag.
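
For example, to train without mixed precision (a sketch combining the flags shown above):

$ bonito train --no-amp --directory /data/training/ctc-data /data/training/model-dir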

Developer Quickstart

$ git clone https://github.com/nanoporetech/bonito.git  # or fork first and clone that
$ cd bonito
$ python3 -m venv venv3
$ source venv3/bin/activate
(venv3) $ pip install --upgrade pip
(venv3) $ pip install -r requirements.txt
(venv3) $ python setup.py develop
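
The bonito command is then available inside the virtualenv, for example to list the pretrained models:

(venv3) $ bonito download --models --show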

Interface

  • bonito view - view a model architecture for a given .toml file and the number of parameters in the network (see the example after this list).
  • bonito train - train a bonito model.
  • bonito evaluate - evaluate a model's performance.
  • bonito download - download pretrained models and training datasets.
  • bonito basecaller - basecaller (.fast5 -> .bam).
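
For example, to inspect a model architecture (the config path below is a hypothetical placeholder):

$ bonito view /path/to/config.toml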

Licence and Copyright

(c) 2019 Oxford Nanopore Technologies Ltd.

Bonito is distributed under the terms of the Oxford Nanopore Technologies, Ltd. Public License, v. 1.0. If a copy of the License was not distributed with this file, You can obtain one at http://nanoporetech.com

Research Release

Research releases are provided as technology demonstrators to provide early access to features or stimulate community development of tools. Support for this software will be minimal and is only provided directly by the developers. Feature requests, improvements, and discussions are welcome and can be implemented by forking and pull requests. However, much as we would like to rectify every issue and piece of feedback users may have, the developers may have limited resources for support of this software. Research releases may be unstable and subject to rapid iteration by Oxford Nanopore Technologies.


Download files

Download the file for your platform.

Source Distribution

ont-bonito-cuda111-0.5.0a0.tar.gz (48.3 kB)

File details

Details for the file ont-bonito-cuda111-0.5.0a0.tar.gz.

File metadata

  • Download URL: ont-bonito-cuda111-0.5.0a0.tar.gz
  • Upload date:
  • Size: 48.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.0 importlib_metadata/4.8.2 pkginfo/1.8.1 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.9.4

File hashes

Hashes for ont-bonito-cuda111-0.5.0a0.tar.gz

  • SHA256: 948cc5c23d7eb877caef805249d0bb252f75443a240d4880cd2c2db3f37a574f
  • MD5: b870433888ce0a3cba0528c0236fdcf7
  • BLAKE2b-256: 79bf17eb8226b121bc06d06717d650351a8527a5bd40e91786b3761a79249bf8
