
A package for an eye-tracking algorithm enabling the development of gaze-controlled computer interfaces

Project description

EYEGESTURES

EyeGestures is an open-source eye-tracking library that works with native webcams and phone cameras. Its aim is to make eye tracking and eye-driven interfaces accessible without requiring expensive hardware.

Our Mission!


💜 Sponsors:

📢📢 We are looking for business partnerships and sponsors! 📢📢

For enterprises that want to avoid GPL-3.0 licensing, a commercial license is available!

We offer custom integration and managed services. For businesses requiring invoices, contact us at contact@eyegestures.com.

Sponsor us and we can add your link, banner, or other promotional materials!

🔨 Projects built with EyeGestures:

Subscribe and get access to our software:

Subscribe on Polar

💻 Install

python3 -m pip install eyeGestures

[!WARNING] Some users report that mediapipe, scikit-learn, or opencv does not install together with eyegestures. To fix this, install the missing packages manually with pip, as shown below.
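
If a dependency is missing, installing it explicitly usually resolves the issue (note that opencv is published on PyPI as opencv-python):

python3 -m pip install mediapipe scikit-learn opencv-python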

⚙️ Try

python3 examples/simple_example.py
python3 examples/simple_example_v2.py

🔧 Build your own:

Using EyeGesture Engine V2 - Machine Learning Approach:

from eyeGestures.utils import VideoCapture
from eyeGestures.eyegestures import EyeGestures_v2

# Initialize gesture engine and video capture
gestures = EyeGestures_v2()
cap = VideoCapture(0)
calibrate = True
screen_width = 500
screen_height = 500

# Process each frame
while True:
  ret, frame = cap.read()
  if not ret:
    continue  # skip frames the camera failed to deliver

  event, cevent = gestures.step(frame,
    calibrate,
    screen_width,
    screen_height,
    context="my_context")

  if event:
    cursor_x, cursor_y = event.point[0], event.point[1]
    fixation = event.fixation
    # calibration_radius: radius for data collection during calibration
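
Calibration runs for as long as calibrate is True; once enough samples have been collected, switch it to False so the engine only tracks. A minimal sketch of one possible stopping rule (the 600-frame budget here is an arbitrary assumption, not a library constant):

frames_processed = 0
CALIBRATION_FRAMES = 600  # assumption: roughly 20 s at 30 fps, tune for your setup

while True:
  ret, frame = cap.read()
  if not ret:
    continue

  # Calibrate until the frame budget is spent, then track only
  calibrate = frames_processed < CALIBRATION_FRAMES
  frames_processed += 1

  event, cevent = gestures.step(frame,
    calibrate,
    screen_width,
    screen_height,
    context="my_context")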

Customize:

You can customize the calibration points/map to fit your solution. Simply copy the snippet below and place your calibration points on the x,y plane, with coordinates from 0.0 to 1.0. They will then be automatically scaled to your display.

gestures = EyeGestures_v2()
gestures.uploadCalibrationMap([[0,0],[0,1],[1,0],[1,1]])
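
For denser coverage you can generate a regular grid instead of listing the points by hand. A sketch in the same list-of-lists format as above (the 5x5 grid size is an arbitrary choice):

# Build a 5x5 grid of calibration points spanning the unit square
n = 5
calibration_map = [[x / (n - 1), y / (n - 1)] for y in range(n) for x in range(n)]
gestures.uploadCalibrationMap(calibration_map)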

V2 is a two-stage tracker. It runs V1 under the hood, uses it as a feature extractor for the V2 machine-learning component, and combines both outputs to generate a new gaze point. You can control how much V1 affects the output:

gestures.setClassicImpact(N)  # N = 2 worked best in our testing

This averages the sample obtained from V2 with N copies of the sample from V1, so V2 contributes 1/(N+1) of the output and V1 contributes N/(N+1).
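
As a worked illustration of that weighting (the gaze points below are made-up values, not library output):

# With N = 2 the blend is (v2 + N * v1) / (N + 1)
N = 2
v1_point = (320.0, 240.0)  # hypothetical V1 gaze estimate
v2_point = (350.0, 270.0)  # hypothetical V2 gaze estimate

blended = tuple((v2 + N * v1) / (N + 1)
                for v1, v2 in zip(v1_point, v2_point))
print(blended)  # (330.0, 250.0): V1 contributes 2/3, V2 contributes 1/3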

It is also worth knowing that you can enable hidden calibration for V1 (the same calibration used when running V1 alone, but now invisible to the user):

gestures.enableCNCalib()

Using EyeGesture Engine V1 - Model-Based Approach:

from eyeGestures.utils import VideoCapture
from eyeGestures.eyegestures import EyeGestures_v1

# Initialize gesture engine with RoI parameters
gestures = EyeGestures_v1()

cap = VideoCapture(0)
ret, frame = cap.read()
calibrate = True
screen_width = 500
screen_height = 500

# Obtain estimations from camera frames
event, cevent = gestures.estimate(
    frame,
    "main",
    calibrate,  # set calibration - switch to False to stop calibration
    screen_width,
    screen_height,
    0, 0, 0.8, 10
)

if event:
  cursor_x, cursor_y = event.point[0], event.point[1]
  fixation = event.fixation
  # calibration_radius: radius for data collection during calibration

Feel free to copy and paste the relevant code snippets for your project.

🔥 Web Demos:

Rules of use:

If you are building a publicly available product and do not have a commercial license, please mention us somewhere in your interface.

📇 Find us:

Troubleshooting:

  1. Some users report that mediapipe, scikit-learn, or opencv does not install together with eyegestures. To fix this, install the missing packages manually with pip (see the Install section above).

💻 Contributors

💵 Support the project

We will be extremely grateful for your support: it helps keep the server running and fuels our brains with coffee.

Support the project on Polar (supporters get access to alpha versions and premium content!): Subscribe on Polar

Star History Chart

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

eyegestures-2.7.4.tar.gz (46.1 kB view details)

Uploaded Source

Built Distribution

eyegestures-2.7.4-py3-none-any.whl (49.6 kB view details)

Uploaded Python 3

File details

Details for the file eyegestures-2.7.4.tar.gz.

File metadata

  • Download URL: eyegestures-2.7.4.tar.gz
  • Upload date:
  • Size: 46.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.10.12

File hashes

Hashes for eyegestures-2.7.4.tar.gz

Algorithm    Hash digest
SHA256       4f7f01024bb8123499aeeced695b5e8ad7ae0590c51e9cc099fffa34d2c6a1a0
MD5          991e76efeaeb161531a321c931151239
BLAKE2b-256  9b2fd680134ed7af6f21a7f9731e4f961670e5681f7a7ae7d1d28d7f2ffed149

See more details on using hashes here.

File details

Details for the file eyegestures-2.7.4-py3-none-any.whl.

File metadata

  • Download URL: eyegestures-2.7.4-py3-none-any.whl
  • Upload date:
  • Size: 49.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.10.12

File hashes

Hashes for eyegestures-2.7.4-py3-none-any.whl

Algorithm    Hash digest
SHA256       547dee61676929a605bd333349fad411eefa287d8c4d796a1496c4ff638fbe25
MD5          48c2ffdee770fd73bfbcb6828a14ccb7
BLAKE2b-256  d2def85dab8fa6b65f06fe470ea84d02c32c536433e74fbe3bc9ad593ac8d6ed

See more details on using hashes here.
