Gaming for Everyone. An AI-powered assistive tool that allows users to control games using only their eye movements.
Playable was developed as a university project with a clear mission: to make gaming accessible to people with motor disabilities or limited mobility.
Using advanced Computer Vision and Deep Learning, the system captures the user's webcam feed, analyzes their gaze direction in real time, and simulates the corresponding key presses (WASD or arrow keys). This creates a hands-free interface that can control virtually any game or software.
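As a rough illustration, here is a minimal sketch of the capture-and-classify half of that pipeline. It assumes the `ultralytics` YOLO API, a classification-task model at `classification/best.pt`, and illustrative lowercase class names; the actual scripts in this repository may differ.

```python
import cv2
from ultralytics import YOLO

# Load the custom gaze-classification model shipped with the project.
model = YOLO("classification/best.pt")

cap = cv2.VideoCapture(0)  # default webcam
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Classify the whole frame and read the top-1 gaze label,
        # e.g. "up", "down", "left", "right", or "center".
        result = model(frame, verbose=False)[0]
        gaze = result.names[result.probs.top1]
        print(gaze)
except KeyboardInterrupt:
    pass
finally:
    cap.release()
```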
- Powered by a custom-trained YOLO (You Only Look Once) model located in `classification/best.pt`.
- Classifies gaze into 5 distinct states: Up, Down, Left, Right, Center (Neutral).
- High-performance inference tailored for real-time usage.
- Keyboard Simulation: Uses `PyAutoGUI` to translate gaze classification into physical key presses (sketched after this list).
- Configurable Inputs: Designed to map to standard arrow keys or custom configurations.
- Low Latency: Optimized pipeline to minimize delay between eye movement and game action.
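A hedged sketch of the key-simulation side, using PyAutoGUI's `keyDown`/`keyUp` calls; the mapping dictionary and the hold-until-center behaviour are illustrative choices, not necessarily what `main_app.py` does:

```python
import pyautogui

# Illustrative mapping from gaze label to key; swap in "w"/"a"/"s"/"d" for WASD games.
KEY_MAP = {"up": "up", "down": "down", "left": "left", "right": "right"}

_held_key = None  # key currently held down, if any


def apply_gaze(gaze: str) -> None:
    """Hold the key for the current gaze; release it when gaze returns to center."""
    global _held_key
    target = KEY_MAP.get(gaze)      # None for "center" or unrecognised labels
    if target == _held_key:
        return                      # gaze unchanged, nothing to do
    if _held_key is not None:
        pyautogui.keyUp(_held_key)  # release the previously held key
    if target is not None:
        pyautogui.keyDown(target)   # press and hold the new key
    _held_key = target
```

Calling `apply_gaze(gaze)` once per classified frame keeps a key held only while the user looks in that direction, which avoids flooding the game with repeated key events.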
- Object Detection Module: Initial version using detection bounding boxes (`object_detection/playable_yolo_v1.py`); see the contrast sketch after this list.
- Classification Module: Enhanced version using direct classification for better accuracy (`classification/playable_yolo_v2.py`).
- Main App: The central controller that integrates the model with the user interface (`main_app.py`).
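For contrast with the classification approach, here is a hedged sketch of how a detection-based variant could derive the gaze label from the highest-confidence bounding box. The weights path is a hypothetical placeholder and the post-processing is not taken from `playable_yolo_v1.py`:

```python
import cv2
from ultralytics import YOLO

# Hypothetical detection weights path; the repo's actual file may differ.
det_model = YOLO("object_detection/weights.pt")

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()

if ok:
    result = det_model(frame, verbose=False)[0]
    if len(result.boxes):
        # Use the class of the highest-confidence detection as the gaze direction.
        best = int(result.boxes.conf.argmax())
        gaze = result.names[int(result.boxes.cls[best])]
        print(gaze)
```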
- Clone the Repository

  ```bash
  git clone https://github.com/USERNAME/playable.git
  cd playable
  ```

- Install Dependencies (a virtual environment is recommended)

  ```bash
  pip install -r requirements.txt
  ```

  Note: Requires `ultralytics`, `opencv-python`, and `pyautogui`.

- Run the Application

  ```bash
  python main_app.py
  ```

  Ensure your webcam is connected and positioned at eye level.
Developed by: @nathanhgo @iamthewalrusz @PichuFV