API for accessing HSI datasets

Project description

Install

pip install HSI-Dataset-API

Links to the available HSI datasets

Dataset structure

The dataset should be stored in one of the following structures:

Plain structure (#1)

{dataset_name}
├── hsi
│   ├── 1.npy
│   └── 1.yml
├── masks
│   └── 1.png
└── meta.yaml

Or in a structure like this (produced when data cropping is used):

Cropped data structure (#2)

{dataset_name}
├── hsi
│   ├── specter_1
│   │   ├── 1.npy
│   │   ├── 1.yml
│   │   ├── 2.npy
│   │   └── 2.yml
│   └── specter_2
│       ├── 1.npy
│       └── 1.yml
├── masks
│   ├── specter_1
│   │   ├── 1.png
│   │   └── 2.png
│   └── specter_2
│       └── 1.png
└── meta.yaml
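The cropped structure (#2) keeps HSI files and masks in parallel per-specter folders, so each `{n}.npy` can be paired with `masks/{specter}/{n}.png` by path alone. A minimal sketch, assuming the layout above (the folder and file names here are illustrative; a tiny copy of the structure is built in a temp dir so the snippet is self-contained):

```python
import tempfile
from pathlib import Path

# Build a tiny copy of the cropped structure (#2) in a temp dir,
# then pair every HSI file with its mask by mirrored path.
root = Path(tempfile.mkdtemp()) / "dataset_name"
for spec, names in {"specter_1": ["1", "2"], "specter_2": ["1"]}.items():
    (root / "hsi" / spec).mkdir(parents=True)
    (root / "masks" / spec).mkdir(parents=True)
    for n in names:
        (root / "hsi" / spec / f"{n}.npy").touch()
        (root / "hsi" / spec / f"{n}.yml").touch()
        (root / "masks" / spec / f"{n}.png").touch()
(root / "meta.yaml").touch()

pairs = []
for npy in sorted(root.glob("hsi/*/*.npy")):
    mask = root / "masks" / npy.parent.name / (npy.stem + ".png")
    assert mask.exists(), f"missing mask for {npy}"
    pairs.append((npy, mask))
print(len(pairs))  # 3
```

The same mirrored-path trick works for the `.yml` sidecars, which sit next to each `.npy`.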

meta.yaml

In this file you should describe the dataset classes (each class name and its label). You can also store any other helpful information that describes the dataset.

For example:

name: HSI Dataset example
description: Some additional info about dataset
classes:
  cat: 1
  dog: 2
  car: 3
wave_lengths:
- 420.0
- 640.0
- 780.0 
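Since `meta.yaml` is plain YAML, it can be read with PyYAML and the `classes` mapping inverted to translate mask labels back to class names. A minimal sketch (the file content is inlined here so the snippet is self-contained; normally you would read it from the dataset root):

```python
import yaml  # PyYAML

# meta.yaml content from the example above, inlined for self-containment
META_TEXT = """\
name: HSI Dataset example
description: Some additional info about dataset
classes:
  cat: 1
  dog: 2
  car: 3
wave_lengths:
- 420.0
- 640.0
- 780.0
"""

meta = yaml.safe_load(META_TEXT)
# Invert {name: label} into {label: name} for decoding mask values
label_to_name = {label: name for name, label in meta["classes"].items()}
print(label_to_name[1])  # cat
```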

{number}.yml

In this file you can store HSI-specific information such as the capture date, original filename, or humidity.

For example:

classes:
  - potato
height: 512
width: 512
layersCount: 237
original_filename: '210730_134940_'
top_left:
  - 0
  - 0
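The `height`, `width`, and `layersCount` fields in the sidecar `.yml` can be checked against the shape of the matching `.npy` cube when loading. A minimal sketch, assuming an `(height, width, layers)` axis order (verify this against your own data; the dimensions here are scaled down from the 512×512×237 example above, and the files are written to a temp dir so the snippet is self-contained):

```python
import tempfile
from pathlib import Path

import numpy as np
import yaml  # PyYAML

# Write one small HSI cube and its sidecar .yml, then load both and
# check that the metadata matches the array shape.
d = Path(tempfile.mkdtemp())
np.save(d / "1.npy", np.zeros((8, 8, 5), dtype=np.float32))
(d / "1.yml").write_text(
    "classes:\n  - potato\nheight: 8\nwidth: 8\nlayersCount: 5\n"
)

hsi = np.load(d / "1.npy")
info = yaml.safe_load((d / "1.yml").read_text())
assert hsi.shape == (info["height"], info["width"], info["layersCount"])
```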

Python API

The API presented in this repo gives you access to the dataset.

Importing

from hsi_dataset_api import HsiDataset, HsiDataCropper

Cropping the data

base_path = '/mnt/data/corrected_hsi_data'
output_path = '/mnt/data/cropped_hsi_data'
classes = ['potato', 'tomato']
selected_folders = ['HSI_1', 'HSI_2']  # Completely optional

cropper = HsiDataCropper(side_size=512, step=8, objects_ratio=0.20, min_class_ratio=0.01)
cropper.crop(base_path, output_path, classes, selected_folders)

Plot cropped data statistics

cropper.draw_statistics()

Using the data

Create Data Access Object

dataset = HsiDataset('../example/dataset_example', cropped_dataset=False)

The cropped_dataset parameter controls the type of the dataset structure. If the dataset is stored on disk in the cropped structure (#2), set this parameter to True.

Getting the dataset meta information

dataset.get_dataset_description()

Getting the shuffled train data using a Python generator

for data_point in dataset.data_iterator(opened=True, shuffle=True):
    hyperspecter = data_point.hsi
    mask = data_point.mask
    meta = data_point.meta
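Inside the loop, each data point pairs an HSI cube with its mask, so a common next step is to pull out the spectra of a single class. A minimal sketch with synthetic stand-ins for `data_point.hsi` and `data_point.mask` (the shapes, the `(height, width, channels)` layout, and the label value are assumptions, not the library's guaranteed format):

```python
import numpy as np

# Synthetic stand-ins for data_point.hsi and data_point.mask.
# Assumed layout: hsi is (height, width, channels); mask holds class labels.
rng = np.random.default_rng(0)
hsi = rng.random((8, 8, 5)).astype(np.float32)
mask = np.zeros((8, 8), dtype=np.uint8)
mask[2:5, 2:5] = 1  # pretend label 1 covers a 3x3 patch

# Boolean indexing selects all pixels of class 1
# as a (num_pixels, channels) spectra matrix
spectra = hsi[mask == 1]
mean_spectrum = spectra.mean(axis=0)
print(spectra.shape)  # (9, 5)
```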

Examples

See the Jupyter notebook example at the following link:

https://nbviewer.org/github/Banayaki/hsi_dataset_api/blob/master/examples/ClassificationMLP.ipynb

Source code

The source code is available at:

https://github.com/Banayaki/hsi_dataset_api
