
API for accessing HSI datasets

Project description

Install

pip install HSI-Dataset-API

Links to the available HSI datasets

Dataset structure

The dataset should be stored in one of the following structures:

Plain structure (#1)

{dataset_name}
├── hsi
│   ├── 1.npy
│   └── 1.yml
├── masks
│   └── 1.png
└── meta.yaml

Or in a structure like this (produced by data cropping):

Cropped data structure (#2)

{dataset_name}
├── hsi
│   ├── specter_1
│   │   ├── 1.npy
│   │   ├── 1.yml
│   │   ├── 2.npy
│   │   └── 2.yml
│   └── specter_2
│       ├── 1.npy
│       └── 1.yml
├── masks
│   ├── specter_1
│   │   ├── 1.png
│   │   └── 2.png
│   └── specter_2
│       └── 1.png
└── meta.yaml
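The two layouts above differ only in the extra `specter_*` level under `hsi` and `masks`; each `N.npy` cube is paired with an `N.yml` metadata sidecar and an `N.png` mask at the same relative path. As a sketch (not part of the library; `list_samples` is a hypothetical helper), a recursive glob handles both layouts:

```python
import tempfile
from pathlib import Path

def list_samples(dataset_root):
    """Pair every hsi/*.npy cube with its .yml sidecar and mask.

    rglob descends into specter_* subfolders when present (structure #2)
    and stays flat otherwise (structure #1).
    """
    root = Path(dataset_root)
    samples = []
    for npy in sorted(root.joinpath('hsi').rglob('*.npy')):
        rel = npy.relative_to(root / 'hsi')
        yml = npy.with_suffix('.yml')
        mask = root / 'masks' / rel.with_suffix('.png')
        samples.append((npy, yml, mask))
    return samples

# Build a tiny fake dataset in the cropped layout (#2) to demonstrate.
with tempfile.TemporaryDirectory() as tmp:
    for p in ['hsi/specter_1/1.npy', 'hsi/specter_1/1.yml',
              'masks/specter_1/1.png', 'meta.yaml']:
        f = Path(tmp, p)
        f.parent.mkdir(parents=True, exist_ok=True)
        f.touch()
    print(list_samples(tmp))  # one (npy, yml, mask) triple
```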

meta.yaml

In this file you should provide a description of the classes (each class name and its label). You can also store any other helpful information that describes the dataset.

For example:

name: HSI Dataset example
description: Some additional info about dataset
classes:
  cat: 1
  dog: 2
  car: 3
wave_lengths:
- 420.0
- 640.0
- 780.0 
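Assuming `wave_lengths` lists one wavelength (in nanometres) per HSI layer, in layer order, a common use of this field is mapping a physical wavelength to a band index. A minimal sketch (`nearest_band` is a hypothetical helper, not part of the API):

```python
# Per-band wavelengths as listed under `wave_lengths` in meta.yaml
# (assumed: nanometres, one entry per HSI layer, in layer order).
wave_lengths = [420.0, 640.0, 780.0]

def nearest_band(target_nm, wavelengths):
    """Index of the band whose wavelength is closest to target_nm."""
    return min(range(len(wavelengths)),
               key=lambda i: abs(wavelengths[i] - target_nm))

print(nearest_band(700.0, wave_lengths))  # 1 -> the 640 nm band
```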

{number}.yml

In this file you can store HSI-specific information such as the acquisition date, original file name, or humidity.

For example:

classes:
  - potato
height: 512
width: 512
layersCount: 237
original_filename: '210730_134940_'
top_left:
  - 0
  - 0
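The `top_left`, `height`, and `width` fields appear to locate a crop within the original HSI. Assuming `top_left` is the (row, column) of the crop's upper-left corner, the crop's bounding box in original-image coordinates can be recovered like this (a sketch; `crop_box` is a hypothetical helper):

```python
def crop_box(meta):
    """Bounding box (row0, col0, row1, col1) of a crop in the original
    HSI, assuming meta['top_left'] is the (row, col) of the crop's
    upper-left corner and height/width give the crop size in pixels."""
    r0, c0 = meta['top_left']
    return (r0, c0, r0 + meta['height'], c0 + meta['width'])

meta = {'height': 512, 'width': 512, 'top_left': [0, 0]}
print(crop_box(meta))  # (0, 0, 512, 512)
```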

Python API

The API presented in this repository gives you access to the dataset.

Importing

from hsi_dataset_api import HsiDataset, HsiDataCropper

Cropping the data

base_path = '/mnt/data/corrected_hsi_data'
output_path = '/mnt/data/cropped_hsi_data'
classes = ['potato', 'tomato']
selected_folders = ['HSI_1', 'HSI_2']  # Completely optional

cropper = HsiDataCropper(side_size=512, step=8, objects_ratio=0.20, min_class_ratio=0.01)
cropper.crop(base_path, output_path, classes, selected_folders)
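The `objects_ratio` and `min_class_ratio` parameters presumably filter out crops that contain too little of the objects of interest. The exact criterion used by `HsiDataCropper` is not documented here, so the following is only a plausible sketch of such a filter (the `keep_crop` helper and its rules are assumptions, not the library's implementation):

```python
def keep_crop(mask, class_labels, objects_ratio=0.20, min_class_ratio=0.01):
    """Hypothetical crop filter: keep a crop when at least `objects_ratio`
    of its pixels belong to some object class, and every object class
    present covers at least `min_class_ratio` of the pixels.
    `mask` is a 2D list of integer labels (0 = background)."""
    pixels = [p for row in mask for p in row]
    total = len(pixels)
    object_pixels = sum(1 for p in pixels if p in class_labels)
    if object_pixels / total < objects_ratio:
        return False
    for label in set(pixels) & set(class_labels):
        if pixels.count(label) / total < min_class_ratio:
            return False
    return True

mask = [[0, 1], [1, 1]]           # 75% of pixels belong to class 1
print(keep_crop(mask, {1, 2}))    # True
```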

Plot cropped data statistics

cropper.draw_statistics()

Using the data

Create Data Access Object

dataset = HsiDataset('../example/dataset_example', cropped_dataset=False)

The cropped_dataset parameter controls which dataset structure is expected. If the dataset is stored on disk in the cropped structure (#2), set this parameter to True.

Getting the dataset meta information

dataset.get_dataset_description()

Iterating over the shuffled train data using a Python generator

for data_point in dataset.data_iterator(opened=True, shuffle=True):
    hyperspecter = data_point.hsi
    mask = data_point.mask
    meta = data_point.meta
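Each data point appears to bundle the hyperspectral cube, its mask, and its metadata. Running the real iterator requires a dataset on disk, so the sketch below uses a stand-in object with the same three attributes (the shapes and label semantics are assumptions) to show a typical use: counting how many pixels each class occupies in a mask:

```python
from collections import Counter
from types import SimpleNamespace

# Stand-in for one data point yielded by dataset.data_iterator(...);
# hsi would be a (layers, height, width) array and mask a 2D label map.
point = SimpleNamespace(
    hsi=[[[0.1, 0.2], [0.3, 0.4]]],   # 1 spectral layer, 2x2 pixels
    mask=[[0, 1], [1, 2]],            # integer labels as in meta.yaml
    meta={'classes': ['cat', 'dog']},
)

# Count how many pixels each class label occupies in the mask.
pixel_counts = Counter(p for row in point.mask for p in row)
print(dict(pixel_counts))  # {0: 1, 1: 2, 2: 1}
```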

Examples

See the Jupyter notebook example at the following link:

https://nbviewer.org/github/Banayaki/hsi_dataset_api/blob/master/examples/ClassificationMLP.ipynb

Source code

The source code is available at:

https://github.com/Banayaki/hsi_dataset_api
