Classifying model for Sign Language Recognition System

Project description

gesture-classifying-model
-------------------------

To use, first create a GestureClassifier object::

>>> predictor = GestureClassifier()

This may take a while, as the model weights (approx. 86 MB) are downloaded and loaded.

After creation, you can use the predictor by passing a tensor of frames to the transform function.
All images in the passed NumPy array must have the same number of channels, width, and height.
GestureClassifier expects a tensor of shape (None, 64, 64, 3), i.e.::

>>> input_tensor.shape
(any_num, 64, 64, 3)

where 'any_num' is the number of frames passed. Pixel values should be in the range 0-255.
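As a minimal sketch (plain NumPy, no model required), a batch matching that shape could be assembled like this; the random frames stand in for real cropped-hand images:

```python
import numpy as np

# Hypothetical frames: three 64x64 RGB images with uint8 values in 0-255,
# the range and shape the classifier expects.
frames = [
    np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
    for _ in range(3)
]

# Stack the frames along a new leading axis to get shape (3, 64, 64, 3).
input_tensor = np.stack(frames)
```

In a real pipeline you would resize each cropped-hand image to 64x64 (e.g. with OpenCV or Pillow) before stacking.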

Finally, you can predict letter probabilities for frames of cropped hands (letters are indexed in the order shown below)::

>>> predictor.transform(input_tensor_of_images)

Letters order::

'abcdefghiklmnopqrstuvwxy'  # 24 letters in alphabetical order, excluding the letters that require movement, 'j' and 'z'
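To turn a predicted probability vector back into a letter, you can index this string with the argmax of the probabilities. A small sketch using a hypothetical probability vector (the real one would come from predictor.transform):

```python
import numpy as np

# Index-to-letter mapping: 24 static letters, no 'j' or 'z'.
LETTERS = 'abcdefghiklmnopqrstuvwxy'

# Hypothetical output for a single frame; here the model is assumed
# to be fully confident the gesture is the letter 'b'.
probabilities = np.zeros(len(LETTERS))
probabilities[LETTERS.index('b')] = 1.0

# Pick the most probable class and map it to its letter.
predicted_letter = LETTERS[int(np.argmax(probabilities))]
```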

Download files


Source Distribution

gesture_classifying_model-0.1.10.tar.gz (3.2 kB)



Hashes for gesture_classifying_model-0.1.10.tar.gz:

- SHA256: ddbb20dc2f23659b6ca6803167f52d39b2e0c5560621d631557eb33d67873e31
- MD5: 9793b34073d603b2f1a386069b8b032c
- BLAKE2b-256: 03f9d415a3012dc53ee2f0eed7fa2eb155f94b95fe8580ad19133099b21c9488

