This project implements a gesture-based game control system using a webcam. By tracking body landmarks with MediaPipe's pose estimation model, players can trigger in-game actions such as Move Forward, Hard Kick, Upper Punch, and High Punch with simple body movements. No hardware sensors or controllers are needed: just your webcam and some moves!
Insert gif or link to demo (COMING SOON)
| Gesture | Action Triggered | Key Pressed |
|---|---|---|
| 🤝 Join Hands | Start control mode | — |
| 🦵 Leg spread > 0.40 | Hard Kick | J |
| ➡️ Foot x-distance > 0.20 | Move Forward | D |
| 🙌 Wrist above shoulder | Upper Punch | L |
| 👉 Arm extended sideways | High Punch | I |
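The thresholds above are evaluated on MediaPipe's normalized landmark coordinates (x and y values in the 0 to 1 range). As a rough illustration only, the checks might look like the sketch below; the helper name `detect_action`, the exact landmarks compared, the ordering of the checks, and the sideways-arm threshold are assumptions, not the project's actual code.

```python
# Illustrative sketch of the gesture checks from the table above.
# Landmark choices, check order, and the sideways-arm threshold are assumptions.
import mediapipe as mp

PL = mp.solutions.pose.PoseLandmark

def detect_action(lm):
    """Map a list of normalized pose landmarks to (move name, key) or (None, None)."""
    # Leg spread: horizontal distance between the ankles
    if abs(lm[PL.LEFT_ANKLE].x - lm[PL.RIGHT_ANKLE].x) > 0.40:
        return "Hard Kick", "j"
    # Foot x-distance: one foot stepped out along the x-axis
    if abs(lm[PL.LEFT_FOOT_INDEX].x - lm[PL.RIGHT_FOOT_INDEX].x) > 0.20:
        return "Move Forward", "d"
    # Wrist above shoulder: smaller y means higher up in image coordinates
    if lm[PL.RIGHT_WRIST].y < lm[PL.RIGHT_SHOULDER].y:
        return "Upper Punch", "l"
    # Arm extended sideways: wrist far from the shoulder horizontally
    # (0.30 is a placeholder threshold, not taken from the project)
    if abs(lm[PL.RIGHT_WRIST].x - lm[PL.RIGHT_SHOULDER].x) > 0.30:
        return "High Punch", "i"
    return None, None
```

With a helper like this, the main loop only needs to press the returned key and draw the returned move name on the frame.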
- Uses MediaPipe Pose to detect body landmarks (wrists, ankles, shoulders).
- Calculates distances and relative positions between landmarks to interpret gestures.
- Uses Python's `keyboard` library to simulate the keypresses mapped to each game action.
- Displays the currently detected move on screen using OpenCV (a minimal sketch of this loop follows below).
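Putting the pieces together, an end-to-end loop might look like the following. This is written purely from the description above, so the window name, confidence values, and the single rule shown (the leg-spread kick) are illustrative assumptions rather than the contents of `gesture_game.py`.

```python
# Minimal sketch: webcam -> MediaPipe Pose -> gesture rule -> simulated keypress.
import cv2
import keyboard          # simulates keypresses (may need admin/root privileges)
import mediapipe as mp

mp_pose = mp.solutions.pose
PL = mp_pose.PoseLandmark

cap = cv2.VideoCapture(0)  # default webcam
with mp_pose.Pose(min_detection_confidence=0.5, min_tracking_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break

        # MediaPipe expects RGB input; OpenCV captures BGR
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        move = "No move"

        if results.pose_landmarks:
            lm = results.pose_landmarks.landmark
            # Example rule only: leg spread beyond the threshold triggers a kick
            if abs(lm[PL.LEFT_ANKLE].x - lm[PL.RIGHT_ANKLE].x) > 0.40:
                move = "Hard Kick"
                keyboard.press_and_release("j")

        # Overlay the currently detected move on the video feed
        cv2.putText(frame, move, (10, 40), cv2.FONT_HERSHEY_SIMPLEX,
                    1.0, (0, 255, 0), 2)
        cv2.imshow("Gesture Game", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

cap.release()
cv2.destroyAllWindows()
```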
- Clone the repository:

  ```bash
  git clone https://github.com/SUJALGOYALL/Gesture_game.git
  cd Gesture_game
  ```

- Install dependencies (make sure you're using Python 3.7+):

  ```bash
  pip install opencv-python mediapipe numpy keyboard
  ```

🚀 How to Run

```bash
python gesture_game.py
```

This project is open-source and available under the MIT License.