
Hyperspectral data analysis and machine learning


# hypers
[![Build Status](https://travis-ci.com/priyankshah7/scikit-hyper.svg?token=xX99xZvXU9jWErT5D1zh&branch=master)](https://travis-ci.com/priyankshah7/scikit-hyper)
[![Documentation Status](https://readthedocs.org/projects/scikit-hyper/badge/?version=latest)](http://scikit-hyper.readthedocs.io/en/latest/?badge=latest)
[![Python Version 3.5](https://img.shields.io/badge/Python-3.5-blue.svg)](https://www.python.org/downloads/)
[![Python Version 3.6](https://img.shields.io/badge/Python-3.6-blue.svg)](https://www.python.org/downloads/)
[![PyPI version](https://badge.fury.io/py/scikit-hyper.svg)](https://badge.fury.io/py/scikit-hyper)

Provides an object model for hyperspectral data.

+ Simple tools for exploratory analysis of hyperspectral data
+ Interactive hyperspectral viewer built into the object
+ Allows for unsupervised machine learning directly on the object (using scikit-learn)
+ More features coming soon...

<p align="center"><img src="/docs/images/hyperspectral_image.png" width="300"></p>

## Contents
1. [About](#about)
2. [Installation](#installation)
3. [Features](#features)
4. [Examples](#examples)
5. [Documentation](#documentation)
6. [License](#license)

## About
This package provides an object model for hyperspectral data (similar in spirit to what pandas provides for tabular data). Many commonly used tools are built into the object, including a lightweight interactive GUI for visualizing the data. Importantly, the object also interfaces with `scikit-learn`, so that its cluster and decomposition classes (e.g. PCA, ICA, K-means) can be used directly on the object.

+ [Dataset object](http://scikit-hyper.readthedocs.io/en/latest/source/Dataset/index.html) (`hypers.Dataset`)

This class forms the core of hypers. It provides useful information about the
hyperspectral data and makes machine learning on the data simple.

+ [Interactive hyperspectral viewer](http://hypers.readthedocs.io/en/latest/source/hypview/index.html)

A lightweight PyQt GUI that provides an interactive interface for viewing the
hyperspectral data.

<p align="center"><img src="/docs/source/hypview/hyperspectral_view.png" width="400"></p>

**Please note that this package is currently in pre-release. The first general release will
be v0.1.0**

#### Hyperspectral data
While this package is designed to work with any hyperspectral data of the form

**X[x, y, spectrum]** or
**X[x, y, z, spectrum]**

some of its features are particularly useful for vibrational-scattering hyperspectral data (e.g. Raman micro-spectroscopy), such as the spectral component of the hyperspectral viewer (see the figure above). Either array form can be passed directly to `hypers.Dataset`, as sketched below.
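
A minimal sketch of these two input shapes, mirroring the `hp.Dataset` constructor call used in the Examples section below (the array sizes here are arbitrary):

```python
import numpy as np
import hypers as hp

# 3-d case: two spatial dimensions plus a spectral dimension, X[x, y, spectrum]
X3 = hp.Dataset(np.random.rand(40, 40, 512))

# 4-d case: three spatial dimensions plus a spectral dimension, X[x, y, z, spectrum]
X4 = hp.Dataset(np.random.rand(40, 40, 4, 512))
```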


## Installation
To install using `pip`:
```
pip install hypers
```

The following packages are required:

+ numpy
+ scipy
+ scikit-learn
+ PyQt5
+ pyqtgraph

## Features
Features implemented in ``hypers`` include:

+ [Clustering](http://hypers.readthedocs.io/en/latest/source/cluster/index.html) (e.g. KMeans, Spectral clustering, Hierarchical clustering)
+ [Decomposition](http://hypers.readthedocs.io/en/latest/source/decomposition/index.html) (e.g. PCA, ICA, NMF; a short sketch follows this list)
+ [Hyperspectral viewer](http://hypers.readthedocs.io/en/latest/source/hypview/index.html)
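
As an illustration of the decomposition feature, the sketch below assumes that the `X.decompose(mdl=...)` interface shown in the Examples section also accepts other scikit-learn estimators such as NMF; treat that as an assumption rather than documented behaviour:

```python
import numpy as np
import hypers as hp
from sklearn.decomposition import NMF

# Wrap a small random 3-d array (x, y, spectrum) in a Dataset
X = hp.Dataset(np.random.rand(50, 50, 256))

# Decompose into 4 non-negative components; assumes NMF works through
# the same mdl= keyword used with PCA in the example below
ims, spcs = X.decompose(mdl=NMF(n_components=4))
```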


## Examples

### Hyperspectral dimensionality reduction and clustering
Below is a quick example of using some of the features of the package on a randomized hyperspectral array. For an example using the IndianPines dataset, see the Jupyter notebook in the examples/ directory.

```python
import numpy as np
import hypers as hp
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Generate a random 4-d dataset and create a Dataset instance
# The test dataset here has spatial dimensions (x=200, y=200, z=10) and a spectral dimension (s=1024)
test_data = np.random.rand(200, 200, 10, 1024)
X = hp.Dataset(test_data)

# Use principal component analysis to reduce to the first 5 components
# The variables ims, spcs hold the first 5 principal components for the images and spectra respectively
ims, spcs = X.decompose(
    mdl=PCA(n_components=5)
)

# Cluster using K-means (with and without applying PCA first)
# The cluster method returns the labelled image array and the spectrum for each cluster
lbls_nodecompose, spcs_nodecompose = X.cluster(
    mdl=KMeans(n_clusters=3),
    decomposed=False
)

# Cluster on only the first 5 principal components
lbls_decomposed, spcs_decomposed = X.cluster(
    mdl=KMeans(n_clusters=3),
    decomposed=True,
    pca_comps=5
)
```
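
A small follow-up that inspects whatever the calls above returned, without assuming a particular array layout (the exact return shapes are not documented in this README):

```python
# Print the type and (if present) shape of each returned object
for name, obj in [('ims', ims), ('spcs', spcs),
                  ('lbls_nodecompose', lbls_nodecompose),
                  ('lbls_decomposed', lbls_decomposed)]:
    print(name, type(obj), getattr(obj, 'shape', None))
```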

## Documentation
The docs are hosted [here](http://hypers.readthedocs.io/en/latest/?badge=latest).

## License
hypers is licensed under the OSI approved BSD 3-Clause License.


