
Computer vision toolkit based on TensorFlow


[![Build Status](https://travis-ci.org/tryolabs/luminoth.svg?branch=master)](https://travis-ci.org/tryolabs/luminoth) [![codecov](https://codecov.io/gh/tryolabs/luminoth/branch/master/graph/badge.svg)](https://codecov.io/gh/tryolabs/luminoth)

> The Dark Visor is a Visor upgrade in Metroid Prime 2: Echoes. Designed by the Luminoth during the war, it was used by the Champion of Aether, A-Kul, to penetrate Dark Aether’s haze in battle against the Ing.
>
> – [Dark Visor - Wikitroid](http://metroid.wikia.com/wiki/Dark_Visor)

# What is Luminoth?

Luminoth is a computer vision toolkit made with [TensorFlow](https://www.tensorflow.org/) and [Sonnet](https://deepmind.github.io/sonnet/). Our main objective is to create tools and code that make it easy to train and use deep learning models for computer vision problems. We aim to provide:

  • Code that is both easy to understand and easy to extend.

  • Out-of-the-box state of the art models.

  • Straightforward implementations with TensorBoard support.

  • Cloud integration for training and deploying.

> DISCLAIMER: This is currently a pre-pre-alpha release. We decided to open-source it for those inquisitive minds that don’t mind getting their hands dirty with code that still has rough edges.

## Why Luminoth

We started building Luminoth at [Tryolabs](https://tryolabs.com/) after realizing we always ended up rewriting the same TensorFlow boilerplate code and models over and over. Instead of just building a cookie-cutter for TensorFlow, we started to think about what other features we could benefit from, and what an ideal toolkit would look like.

## Why TensorFlow (and why Sonnet)?

It is indisputable that TensorFlow is currently the most mature deep learning framework, and even though we love (truly love) other frameworks as well, especially [PyTorch](http://pytorch.org), our customers demand stable and production-ready machine learning solutions.

[Sonnet](https://deepmind.github.io/sonnet/) fits perfectly with our mission to build code that is easy to follow and to extend. It is tricky to build computation graphs in a way that is abstract enough to compose complex models while still staying close to the underlying operations, and luckily Sonnet is a library that provides just that.

# Installation

Luminoth currently supports Python 2.7 and 3.4–3.6.

If [TensorFlow](https://www.tensorflow.org) and [Sonnet](https://github.com/deepmind/sonnet) are already installed, Luminoth will use those versions.

## Install with CPU support

Just run:

```bash
$ pip install luminoth
```

This will install the CPU versions of TensorFlow & Sonnet if you don’t have them.

## Install with GPU support

  1. [Install TensorFlow](https://www.tensorflow.org/install/) with GPU support.

  2. [Install Sonnet](https://github.com/deepmind/sonnet#installation) with GPU support:

    ```bash
    $ pip install dm-sonnet-gpu
    ```

  3. Install Luminoth from PyPI:

    ```bash
    $ pip install luminoth
    ```

## Install from source

First, clone the repo on your machine and then install with pip:

```bash
$ git clone https://github.com/tryolabs/luminoth.git
$ cd luminoth
$ pip install -e .
```

## Check that the installation worked

Simply run `lumi --help`.

# Supported models

Currently we are focusing on object detection problems, and have a fully functional version of [Faster RCNN](https://arxiv.org/abs/1506.01497). There are more models in progress (SSD and Mask RCNN to name a couple), and we look forward to opening up those implementations.

# Usage

There is one main command line interface, which you can access through the `lumi` command. Whenever you are unsure how to do something, just type:

`lumi --help` or `lumi <subcommand> --help`

and a list of available options with descriptions will show up.

## Datasets

Convert datasets to TensorFlow’s `.tfrecords` format for efficient processing using the computation graphs (and for cloud support).

```
lumi dataset transform --type pascalvoc --data-dir ~/dataset/pascalvoc/ --output-dir ~/dataset/pascalvoc/tf/
```

```
lumi dataset transform --type imagenet --data-dir ~/dataset/imagenet/ --output-dir ~/dataset/imagenet/tf/
```

```
lumi dataset transform --type coco --data-dir ~/dataset/coco/ --output-dir ~/dataset/coco/tf/
```
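The `.tfrecords` files produced above use TensorFlow's simple record framing: each record is an 8-byte little-endian length, a 4-byte CRC of the length, the payload, and a 4-byte CRC of the payload. As a quick sanity check after a transform, you can count records with a minimal pure-Python reader, no TensorFlow required. This is our own sketch (the function name is ours, and it skips CRC validation for brevity), not part of Luminoth:

```python
import struct


def count_tfrecords(path):
    """Count records in a TFRecord file by walking its framing.

    Layout per record: uint64 length (little-endian), uint32 length
    CRC, payload bytes, uint32 payload CRC. CRCs are not verified.
    """
    count = 0
    with open(path, "rb") as f:
        while True:
            header = f.read(8)
            if not header:
                break  # clean end of file
            (length,) = struct.unpack("<Q", header)
            # Skip length CRC (4), payload (length), payload CRC (4).
            f.seek(4 + length + 4, 1)
            count += 1
    return count
```

If the count matches the number of images in your source dataset, the transform likely completed without truncation.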

## Training

Check our [TRAINING.md](./TRAINING.md) on how to train locally or in Google Cloud.

## Visualizing results

We strive to provide useful and understandable summary and graph visualizations. We consider them essential not only for monitoring (duh!), but also for getting a broader understanding of what’s going on under the hood. Just as it is important for code to be understandable and easy to follow, the computation graph should be as well.

By default, summary and graph logs are saved to `/tmp/luminoth`. You can use TensorBoard by running:

` tensorboard --logdir /tmp/luminoth `
