A package for quickly training and predicting head gestures with your webcam

Project description

Requirements

  • Anaconda Python >= 3.7

Quickstart

Quickly train the model on four head gestures. Press the UP, DOWN, RIGHT, and LEFT arrow keys to label each gesture in real time. After 30 seconds you'll be prompted to save (append) the new training data, and you'll immediately see a cross-validation score for the fitted data. Initialize, train, and predict in under 60 seconds using your webcam.

import head_controller.db as db
import head_controller.Camera as Camera

# Initialize gesture training data
db.setup_db()

# Capture webcam gestures with live arrow-key labelling
Camera.capture_review_submit_labels()

# Predict webcam gestures in real time
Camera.check_video_frame_data_predict()
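
For readers curious what the capture-and-label workflow could look like under the hood, here is a minimal sketch using OpenCV and scikit-learn. It is not head_controller's actual implementation: the helper names (extract_features, capture_labelled_frames, predict_live), the arrow-key codes, and the choice of KNeighborsClassifier are illustrative assumptions.

import time

import cv2
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical mapping from arrow-key codes to gesture labels.
# Key codes vary by platform; these match many Linux builds of OpenCV.
ARROW_KEYS = {82: "up", 84: "down", 83: "right", 81: "left"}

def extract_features(frame):
    """Placeholder feature extractor: a downsampled grayscale frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    small = cv2.resize(gray, (16, 16))
    return small.flatten() / 255.0

def capture_labelled_frames(duration=30):
    """Capture webcam frames for `duration` seconds, labelling each frame
    with whichever arrow key is pressed at that moment."""
    cap = cv2.VideoCapture(0)
    X, y = [], []
    start = time.time()
    while time.time() - start < duration:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("label gestures with the arrow keys", frame)
        key = cv2.waitKey(1) & 0xFF
        if key in ARROW_KEYS:
            X.append(extract_features(frame))
            y.append(ARROW_KEYS[key])
    cap.release()
    cv2.destroyAllWindows()
    return np.array(X), np.array(y)

def predict_live(clf, duration=10):
    """Predict a gesture for each webcam frame for `duration` seconds."""
    cap = cv2.VideoCapture(0)
    start = time.time()
    while time.time() - start < duration:
        ok, frame = cap.read()
        if not ok:
            break
        print(clf.predict([extract_features(frame)])[0])
    cap.release()

if __name__ == "__main__":
    X, y = capture_labelled_frames()
    if len(set(y)) >= 2:
        clf = KNeighborsClassifier(n_neighbors=3)
        # Cross-validation score on the freshly labelled data, mirroring
        # the score reported after training.
        print("CV accuracy:", cross_val_score(clf, X, y, cv=3).mean())
        clf.fit(X, y)
        predict_live(clf)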

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Files for head-controller, version 0.1.9
Filename                          Size     File type  Python version
head_controller-0.1.9-py3.7.egg   18.7 kB  Egg        3.7
head_controller-0.1.9.tar.gz      5.4 kB   Source     None
