A package for eye-tracking algorithms enabling the development of gaze-controlled computer interfaces

Project description

EYEGESTURES

EyeGestures is an open-source eye-tracking software library that uses native webcams and phone cameras. The aim of the library is to make eye tracking and eye-driven interfaces accessible without requiring expensive hardware.

Our Mission!

[!IMPORTANT]
EyeGestures is a fully volunteer-based project and exists thanks to your donations and support.

Donation

💜 Sponsors:

📢📢 We are looking for business partnerships and sponsors! 📢📢

For enterprises that want to avoid GPL-3.0 licensing, a commercial license is available!

We offer custom integrations and managed services. For businesses requiring invoices, contact us at contact@eyegestures.com.

Sponsor us and we can add your link, banner, or other promo materials!

🔨 Projects built with EyeGestures:

Subscribe and get access to our software:

Subscribe on Polar

💻 Install

python3 -m pip install eyeGestures

[!WARNING] Some users report that mediapipe, scikit-learn, or opencv do not install together with eyegestures. To fix this, install them manually with pip, as shown below.
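A minimal fix, assuming the standard PyPI package names (opencv is published on PyPI as opencv-python):

python3 -m pip install mediapipe scikit-learn opencv-python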

⚙️ Try

The tracker works best when your camera or laptop is at arm's length, similar to how you would typically use it. If you are further away, it may be less responsive for now; we are currently working on solving this issue.

python3 examples/simple_example_v2.py
python3 examples/simple_example.py [legacy tracker, will become obsolete]

🔧 Build your own:

Using EyeGesture Engine V3 - Faster, smaller, better:

from eyeGestures.utils import VideoCapture
from eyeGestures import EyeGestures_v3

# Initialize gesture engine and video capture
gestures = EyeGestures_v3()
cap = VideoCapture(0)
calibrate = True
screen_width = 500
screen_height = 500

# Process each frame
while True:
  ret, frame = cap.read()
  event, cevent = gestures.step(frame,
    calibrate,
    screen_width,
    screen_height,
    context="my_context")

  if event:
    cursor_x, cursor_y = event.point[0], event.point[1]
    fixation = event.fixation
    saccadess = event.saccadess # saccade movement detector
    # calibration_radius: radius for data collection during calibration
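
To finish calibration, switch the calibrate flag to False once enough samples have been collected. A minimal sketch, assuming a fixed number of calibration frames (the frame count is illustrative, not a library default):

CALIBRATION_FRAMES = 300  # illustrative; tune for your setup
frame_count = 0

while True:
  ret, frame = cap.read()
  calibrate = frame_count < CALIBRATION_FRAMES  # stop calibrating after enough frames
  frame_count += 1
  event, cevent = gestures.step(frame,
    calibrate,
    screen_width,
    screen_height,
    context="my_context")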

Using EyeGesture Engine V2 - Machine Learning Approach:

from eyeGestures.utils import VideoCapture
from eyeGestures import EyeGestures_v2

# Initialize gesture engine and video capture
gestures = EyeGestures_v2()
cap = VideoCapture(0)  
calibrate = True
screen_width = 500
screen_height = 500

# Process each frame
while True:
  ret, frame = cap.read()
  event, cevent = gestures.step(frame,
    calibrate,
    screen_width,
    screen_height,
    context="my_context")

  if event:
    cursor_x, cursor_y = event.point[0], event.point[1]
    fixation = event.fixation
    # calibration_radius: radius for data collection during calibration
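
The fixation value can drive simple dwell-to-click interactions. A minimal sketch, assuming event.fixation is a 0.0-1.0 stability estimate (values near 1.0 meaning a steady gaze; the exact semantics are not documented here) and using a hypothetical on_click handler:

FIXATION_THRESHOLD = 0.8  # assumed threshold; tune for your setup

def on_click(x, y):
  # Hypothetical handler: wire this to your UI toolkit of choice.
  print(f"click at ({x}, {y})")

if event and event.fixation > FIXATION_THRESHOLD:
  on_click(event.point[0], event.point[1])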

Customize:

You can customize your calibration points/map to fit your solution. Simply copy the snippet below and place your calibration points on the x,y plane, with coordinates from 0.0 to 1.0. They will then be automatically scaled to your display.

gestures = EyeGestures_v2()
gestures.uploadCalibrationMap([[0,0],[0,1],[1,0],[1,1]])
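
For denser coverage you can generate the map programmatically. A sketch, assuming a uniform 3x3 grid of normalized points (any list of [x, y] pairs in the 0.0-1.0 range works the same way):

import numpy as np

# Build a 3x3 grid of normalized calibration points in [0.0, 1.0]
xs, ys = np.meshgrid(np.linspace(0, 1, 3), np.linspace(0, 1, 3))
calibration_map = np.column_stack([xs.ravel(), ys.ravel()]).tolist()

gestures.uploadCalibrationMap(calibration_map)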

V2 is a two-stage tracker. It runs V1 under the hood, uses it as a feature extractor for V2's machine-learning component, and combines both outputs to generate a new gaze point. You can control how much V1 affects V2 with:

gestures.setClassicImpact(N) # N = 2 worked best in our testing

This averages the sample obtained from V2 with N copies of the sample from V1, so V2 contributes 1/(N+1) of the output and V1 contributes N/(N+1).
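
In other words, the blended gaze point behaves like the weighted average below (an illustrative sketch of the formula, not the library's internal code):

def blend(v1_point, v2_point, n):
  # output = (v2 + n * v1) / (n + 1)
  x = (v2_point[0] + n * v1_point[0]) / (n + 1)
  y = (v2_point[1] + n * v1_point[1]) / (n + 1)
  return (x, y)

# With n = 2, V1 contributes 2/3 and V2 contributes 1/3:
print(blend((100, 100), (130, 160), 2))  # (110.0, 120.0)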

It is also worth knowing that you can enable hidden calibration for V1 (the same calibration as when using V1 alone, but now invisible to the user):

gestures.enableCNCalib()

Using EyeGesture Engine V1 - Model-Based Approach [not recommended]:

from eyeGestures.utils import VideoCapture
from eyeGestures import EyeGestures_v1

# Initialize gesture engine with RoI parameters
gestures = EyeGestures_v1()

cap = VideoCapture(0)  
ret, frame = cap.read()
calibrate = True
screen_width = 500
screen_height = 500

# Obtain estimations from camera frames
event, cevent = gestures.estimate(
    frame,
    "main",
    calibrate,  # set calibration - switch to False to stop calibration
    screen_width,
    screen_height,
    0, 0, 0.8, 10
)

if event:
  cursor_x, cursor_y = event.point[0], event.point[1]
  fixation = event.fixation
  # calibration_radius: radius for data collection during calibration

Feel free to copy and paste the relevant code snippets for your project.

🔥 Web Demos:

Rules of use

If you are building a publicly available product without a commercial license, please mention us somewhere in your interface.

📇 Find us:

Troubleshooting:

  1. Some users report that mediapipe, scikit-learn, or opencv do not install together with eyegestures. To fix this, install them manually with pip (see the Install section above).

💻 Contributors

💵 Support the project

We will be extremely grateful for your support: it helps keep the server running and fuels my brain with coffee.

Support the project on Polar (if you want to help, we provide access to alpha versions and premium content!):

Subscribe on Polar

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

eyegestures-3.2.4.tar.gz (48.8 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

eyegestures-3.2.4-py3-none-any.whl (51.7 kB view details)

Uploaded Python 3

File details

Details for the file eyegestures-3.2.4.tar.gz.

File metadata

  • Download URL: eyegestures-3.2.4.tar.gz
  • Upload date:
  • Size: 48.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.10.12

File hashes

Hashes for eyegestures-3.2.4.tar.gz

  • SHA256: e4d57e33abd9d377284151553a4b42bcc354cd671a1039fbd25fbcd81a8f97b0
  • MD5: 6e5ff6366e87e274a3d9324ab818280d
  • BLAKE2b-256: ded911c7fe2e5f8ece2c15bb5ec61af38caa5621cd4881c6556299584958c95f

See more details on using hashes here.
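
For example, a downloaded file can be checked against the SHA256 digest above with Python's standard library (a sketch; adjust the file path to where you saved the archive):

import hashlib

expected = "e4d57e33abd9d377284151553a4b42bcc354cd671a1039fbd25fbcd81a8f97b0"
with open("eyegestures-3.2.4.tar.gz", "rb") as f:
  digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "MISMATCH")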

File details

Details for the file eyegestures-3.2.4-py3-none-any.whl.

File metadata

  • Download URL: eyegestures-3.2.4-py3-none-any.whl
  • Upload date:
  • Size: 51.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.10.12

File hashes

Hashes for eyegestures-3.2.4-py3-none-any.whl

  • SHA256: d75f5fbf52da161eae3a70f697a90200375ca3caa4dd3d49b8f03bde7c4f28c6
  • MD5: e5a93ff7ed29327035e1104f5e26fb65
  • BLAKE2b-256: 1f9de38ef66d7768afdaad6e0c8bc4dfd2037b23a45a53c6c0920f8ccc7600aa

See more details on using hashes here.
