Python package providing an eye-tracking algorithm for building gaze-controlled computer interfaces
Project description
EYEGESTURES
EyeGestures is an open-source eye-tracking software library that uses native webcams and phone cameras. The aim of the library is to make eye tracking and eye-driven interfaces accessible without requiring expensive hardware.
Our Mission!
💜 Sponsors:
For enterprises avoiding GPL-3.0 licensing, a commercial license is available!
We offer custom integration and managed services. For businesses requiring invoices, message us at contact@eyegestures.com.
Sponsor us and we will add your link, banner, or other promo materials!
💻 Install
python3 -m pip install eyeGestures
⚙️ Run
python3 examples/simple_example.py
🪟 Run Windows App
python3 apps/win_app.py
Or download it from releases
🔧 How to use [WiP - adding Engine V2]:
Using EyeGesture Engine V2 - Machine Learning Approach:
from eyeGestures.utils import VideoCapture
from eyeGestures.eyegestures import EyeGestures_v2

# Initialize gesture engine and video capture
gestures = EyeGestures_v2()
cap = VideoCapture(0)

screen_width, screen_height = 1920, 1080  # your display resolution
calibrate = True  # switch to False to stop calibration

# Process each frame
ret, frame = cap.read()
point, calibration_point, blink, fixation, acceptance_radius, calibration_radius = gestures.step(
    frame, calibrate, screen_width, screen_height)

# point: x, y position of the cursor
# calibration_point: x, y position of the current calibration point
# blink: blink detection flag
# fixation: fixation estimate for the current gaze
# acceptance_radius: precision required for calibration
# calibration_radius: radius for data collection during calibration
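One way to use the returned values during calibration is to stop calibrating once the estimated point lands within `acceptance_radius` of the current calibration point. This is not part of the library, just an illustrative sketch using a plain Euclidean-distance check:

```python
import math

def within_acceptance(point, calibration_point, acceptance_radius):
    """Return True when the estimated gaze point is close enough to
    the current calibration point to count as accepted."""
    dx = point[0] - calibration_point[0]
    dy = point[1] - calibration_point[1]
    return math.hypot(dx, dy) <= acceptance_radius

# Example: gaze at (110, 205), calibration target at (100, 200), radius 50
print(within_acceptance((110, 205), (100, 200), 50))  # True
```

You could flip `calibrate` to False once this check has passed for each calibration point.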
Using EyeGesture Engine V1 - Model-Based Approach:
from eyeGestures.utils import VideoCapture
from eyeGestures.eyegestures import EyeGestures_v1

# Initialize gesture engine with RoI parameters
gestures = EyeGestures_v1()
cap = VideoCapture(0)

screen_width, screen_height = 1920, 1080  # your display resolution

# Obtain estimations from camera frames
ret, frame = cap.read()
event = gestures.estimate(
    frame,
    "main",
    True,  # set calibration - switch to False to stop calibration
    screen_width,
    screen_height,
    0, 0, 0.8, 10
)
cursor_x, cursor_y = event.point[0], event.point[1]
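Raw gaze estimates tend to jitter. A common trick before moving the pointer (not part of EyeGestures, just a hedged sketch) is to smooth successive points with an exponential moving average:

```python
class GazeSmoother:
    """Exponential moving average over successive gaze points.

    alpha close to 1.0 follows the raw signal; closer to 0.0 smooths harder.
    """
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self._x = None
        self._y = None

    def update(self, x, y):
        # First sample initializes the state; later samples blend in.
        if self._x is None:
            self._x, self._y = float(x), float(y)
        else:
            self._x += self.alpha * (x - self._x)
            self._y += self.alpha * (y - self._y)
        return self._x, self._y

smoother = GazeSmoother(alpha=0.3)
# Feed successive cursor estimates, e.g. from event.point
for raw in [(100, 100), (104, 98), (180, 160)]:
    smooth_x, smooth_y = smoother.update(*raw)
```

Tune `alpha` to trade responsiveness against stability for your setup.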
Feel free to copy and paste the relevant code snippets for your project.
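Gaze-driven interfaces often trigger clicks by dwell time. Below is a library-independent sketch (the class and its parameters are illustrative assumptions, not EyeGestures API) of a detector that reports a "click" once the gaze stays within a fixed radius for a minimum duration; timestamps are supplied by the caller:

```python
import math

class DwellClicker:
    """Fire a click when the gaze stays within `radius` px for `dwell_s` seconds."""
    def __init__(self, radius=40, dwell_s=0.8):
        self.radius = radius
        self.dwell_s = dwell_s
        self._anchor = None  # (x, y) where the current dwell started
        self._start = None   # timestamp of the dwell start

    def update(self, x, y, t):
        """Feed one gaze sample; returns True exactly when a dwell completes."""
        if self._anchor is None or math.hypot(
                x - self._anchor[0], y - self._anchor[1]) > self.radius:
            self._anchor = (x, y)  # gaze moved: restart the dwell timer
            self._start = t
            return False
        if t - self._start >= self.dwell_s:
            self._anchor = None    # reset so one dwell yields one click
            return True
        return False
```

In a real loop you would feed it the (smoothed) cursor position each frame, e.g. with `time.monotonic()` as the timestamp.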
🔥 Web Demos:
Rules of use:
If you are building a publicly available product and have no commercial license, please mention us somewhere in your interface.
Promo Materials:
https://github.com/NativeSensors/EyeGestures/assets/40773550/4ca842b9-ba32-4ffd-b2e4-179ff67ee47f
https://github.com/NativeSensors/EyeGestures/assets/40773550/6a7c74b5-b069-4eec-bc96-3a6bb4159b37
📇 Find us:
- RSS
- discord
- email: contact@eyegestures.com
Follow us on Polar (it costs nothing, but you help the project!):
📢 Announcements:
💻 Contributors
💵 Support the project
Project details
Release history
Download files
Download the file for your platform.
Source Distribution
Built Distribution
Hashes for eyegestures-2.0.0-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 041709bc0004df6bb06d8781e37f48f58b7524902d39982e475758a48ce255df
MD5 | 155e1f790001d1f38c7b30d897706c78
BLAKE2b-256 | 66ab7098f0dffaf46ef8271f8ce61abfacf0e09b8471f8b9ba2b00c9474dbfe6