
The Nanonets API


The NanoNets API Python Wrapper


Nanonets provides an easy-to-use API to communicate with its servers, build machine learning models, and make predictions on image data. The models that can be built are -

  1. Image Classification
  2. Multi-label Classification
  3. Object Detection
  4. OCR

Check us out at https://nanonets.com.
To find out about our GUI solution or to get your API key, check out https://app.nanonets.com


Installation

Pip install -

Run the following command from your terminal -

pip install nanonets

Setuptools -

To install using setuptools, run the following commands from your terminal

git clone https://github.com/NanoNets/nanonets-python-wrapper.git
cd nanonets-python-wrapper
python setup.py install --user

Create Models -

To create a new model

  1. Head over to https://app.nanonets.com, login, click on the 'API Keys' tab on the left.
  2. Click on 'COPY KEY'
  3. Create a model using the following python code
from nanonets import ImageClassification

# from nanonets import MultilabelClassification
# from nanonets import ObjectDetection
# from nanonets import OCR

api_key = 'YOUR_API_KEY_GOES_HERE'
categories = ['list', 'of', 'your', 'labels']

model = ImageClassification(api_key, categories)

This will print a Model ID; note it down for future reference. You can also find it by visiting https://app.nanonets.com.


Preparing training data

The training data needs to be put into a dictionary format where

  • for image classification models -
    • keys (str) - filepaths/urls of images
    • values (str) - labels for each image
  • for multilabel classification models -
    • keys (str) - filepaths/urls of images
    • values (List[str]) - list of labels for each image
  • for object detection models -
    • keys (str) - filepaths of images
    • values (str) - annotation paths for each image (XML or JSON)
  • for OCR models -
    • keys (str) - filepaths of images
    • values (str) - annotation paths for each image (XML or JSON)

You can look into the data/annotations directory of the repository to get a better idea.
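For instance, the dictionaries for the four model types might look like this (the file paths, annotation paths, and labels below are placeholders, not files shipped with the library):

```python
# Image classification: one label (str) per image.
ic_dict = {
    'images/cat_001.jpg': 'cat',
    'images/dog_001.jpg': 'dog',
}

# Multi-label classification: a list of labels per image.
mlc_dict = {
    'images/beach_001.jpg': ['beach', 'sunset', 'people'],
    'images/beach_002.jpg': ['beach', 'boat'],
}

# Object detection / OCR: path to an XML or JSON annotation per image.
od_dict = {
    'images/scene_001.jpg': 'annotations/scene_001.xml',
    'images/scene_002.jpg': 'annotations/scene_002.json',
}
```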


Training and model status

To train a model on some training data -

model.train(training_dict, data_path_type='files')

The images will be uploaded and training will be initialised afterwards.

You can check whether the model is trained at any time by running the following from a Python console -

from nanonets import ImageClassification

api_key = 'YOUR_API_KEY_GOES_HERE'
categories = ['list', 'of', 'your', 'labels']
model_id = 'YOUR_MODEL_ID'

model = ImageClassification(api_key, categories, model_id=model_id)
model._check_model_state()

Or you can wait for Nanonets to email you once the training is finished.
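If you would rather poll from a script than wait for the email, a small generic helper can wrap the status check. `wait_until` below is illustrative, not part of the wrapper; pass it a callable adapted to whatever `model._check_model_state()` returns in your version of the library.

```python
import time

def wait_until(check, interval=60, timeout=3600):
    """Call `check()` every `interval` seconds until it returns True.
    Returns True if the check succeeded, False once `timeout` seconds
    have elapsed without success."""
    elapsed = 0
    while True:
        if check():
            return True
        if elapsed >= timeout:
            return False
        time.sleep(interval)
        elapsed += interval
```

You could then call, for example, `wait_until(lambda: model._check_model_state())`, assuming the method's return value is truthy once training has finished.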


Inference

You can run inference on a single image or multiple images. You can use urls as well as local files.

import os

from nanonets import ImageClassification

api_key = 'YOUR_API_KEY_GOES_HERE'
categories = ['list', 'of', 'your', 'labels']
model_id = 'YOUR_MODEL_ID'

model = ImageClassification(api_key, categories, model_id=model_id)

## list of file paths of several test images
img_dir = 'PATH_TO_YOUR_IMAGES_DIR'
imglist = os.listdir(img_dir)
imglist = [img_dir + '/' + x for x in imglist]

## list of urls of several test images
urls = ['LIST', 'OF', 'YOUR', 'IMAGE', 'URLS']

## prediction functions for single file
resp_one_file = model.predict_for_file(imglist[0])
print("IC response - single image: ", resp_one_file)

## prediction functions for multiple files
resp_mul_files = model.predict_for_files(imglist)
print("IC response - multiple images: ", resp_mul_files)

## prediction functions for single url
resp_one_url = model.predict_for_url(urls[0])
print("IC response - single URL: ", resp_one_url)

## prediction functions for multiple urls
resp_mul_urls = model.predict_for_urls(urls)
print("IC response - multiple URLs: ", resp_mul_urls)

NOTE: The data in the data directory was generated randomly and is meant only to demonstrate the training and inference code in the examples directory. Use it to understand the library's format requirements, not to build models.

