
An assistive technology tool powered by AI (YOLO) that converts head rotation into keyboard inputs, enabling hands-free gaming for people with motor disabilities. 👁️🎮


Playable ๐Ÿ‘๏ธ๐ŸŽฎ

Gaming for Everyone. An AI-powered assistive tool that allows users to control games using only their eye movements.

💡 About the Project

Playable was developed as a university project with a clear mission: to make gaming accessible to people with motor disabilities or limited mobility.

Using Computer Vision and Deep Learning, the system captures the user's webcam feed, analyzes their gaze direction in real time, and simulates the corresponding keyboard presses (WASD or Arrows). This creates a hands-free interface that can control virtually any game or software.

๐Ÿ› ๏ธ Tech Stack

Python · YOLO · OpenCV · PyAutoGUI

✨ Key Features

🧠 AI Gaze Detection

  • Powered by a custom-trained YOLO (You Only Look Once) model located in classification/best.pt.
  • Classifies gaze into 5 distinct states: Up, Down, Left, Right, Center (Neutral).
  • High-performance inference tailored for real-time usage.
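The classification step can be sketched roughly as follows. This is a minimal illustration, not the repo's actual code: the helper names are hypothetical, and the `probs`/`names` attributes assume the ultralytics classification-results API.

```python
from typing import Dict, List

def top1_label(names: Dict[int, str], scores: List[float]) -> str:
    """Return the name of the highest-scoring gaze class."""
    best = max(range(len(scores)), key=scores.__getitem__)
    return names[best]

def classify_frame(model, frame) -> str:
    """Classify one webcam frame into Up/Down/Left/Right/Center.

    `model` is a loaded ultralytics classification model, e.g.
    YOLO("classification/best.pt"); result.probs.data holds the raw
    per-class scores for the five gaze states.
    """
    result = model(frame, verbose=False)[0]
    return top1_label(result.names, result.probs.data.tolist())
```

In the real app this runs once per captured frame inside the main loop.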

🎮 Seamless Control

  • Keyboard Simulation: Uses PyAutoGUI to translate gaze classifications into simulated key presses.
  • Configurable Inputs: Designed to map to standard arrow keys or custom configurations.
  • Low Latency: Optimized pipeline to minimize delay between eye movement and game action.

๐Ÿ–ฅ๏ธ Architecture

  • Object Detection Module: Initial version using detection bounding boxes (object_detection/playable_yolo_v1.py).
  • Classification Module: Enhanced version using direct classification for better accuracy (classification/playable_yolo_v2.py).
  • Main App: The central controller that integrates the model with the user interface (main_app.py).

🚀 How to Run

  1. Clone the repository

git clone https://github.com/USERNAME/playable.git
cd playable

  2. Install dependencies (using a virtual environment is recommended)

pip install -r requirements.txt

Note: Requires ultralytics, opencv-python, and pyautogui.

  3. Run the application

python main_app.py

Ensure your webcam is connected and positioned at eye level.




Developed by: @nathanhgo @iamthewalrusz @PichuFV
