Active Development

Control interfaces with hand gestures

A machine learning system for gesture-controlled interaction with holographic interfaces. Uses MediaPipe for hand tracking and TensorFlow for real-time gesture recognition.

Project Status:
Hand Tracking
Gesture Data Collection
Machine Learning Model
Real-time Gesture Prediction
Holographic Interface Integration
Terminal
$ git clone https://github.com/Neeraj-x0/gestureControl.git
$ cd gestureControl && pip install -r requirements.txt
$ python main.py
✋ Gesture tracking initialized...

Capabilities

Advanced hand tracking and gesture recognition for next-gen interfaces.

3D Hand Landmark Tracking

Real-time tracking of 21 hand landmarks with precise x, y, and z coordinates using MediaPipe.
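The 21 (x, y, z) landmarks MediaPipe reports are commonly flattened into a fixed-length feature vector before classification. The sketch below shows one typical normalization, wrist-relative and scale-invariant; the function name and normalization choice are assumptions for illustration, not necessarily what this repo's code does.

```python
import numpy as np

def landmarks_to_features(landmarks: np.ndarray) -> np.ndarray:
    """Flatten 21 (x, y, z) hand landmarks into a 63-value feature vector.

    Translates all points so the wrist (landmark 0) sits at the origin, then
    scales by the largest wrist distance so the result is roughly invariant
    to hand size and position in the frame.
    """
    assert landmarks.shape == (21, 3)
    relative = landmarks - landmarks[0]           # wrist-relative coordinates
    scale = np.linalg.norm(relative, axis=1).max()
    if scale > 0:
        relative = relative / scale
    return relative.flatten()                     # shape (63,)
```

A vector like this feeds directly into the classifier described below, regardless of where the hand appears in the camera frame.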

Neural Network Recognition

TensorFlow/Keras-based gesture classification with automated training pipeline.
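A gesture classifier of this kind is often a small dense network over the flattened landmark features. This is a minimal Keras sketch of that idea; the layer sizes, dropout rate, and the five-gesture class count are placeholder assumptions, not the repo's actual architecture.

```python
import tensorflow as tf

NUM_FEATURES = 63   # 21 landmarks x (x, y, z) -- assumption based on MediaPipe output
NUM_GESTURES = 5    # placeholder class count; depends on the recorded dataset

def build_model() -> tf.keras.Model:
    """A small dense network for gesture classification from landmark features."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(NUM_FEATURES,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(NUM_GESTURES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Sparse categorical cross-entropy lets the recorded samples carry plain integer labels rather than one-hot vectors, which keeps the training CSVs simple.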

Multi-Hand Support

Simultaneous tracking and recognition for multiple hands in the frame.

Depth Visualization

Color-coded depth feedback for intuitive understanding of hand position in 3D space.
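In MediaPipe Hands, each landmark's z value is depth relative to the wrist, with more negative values closer to the camera. A color-coded feedback scheme like the one described can map that z onto a near/far gradient; the bounds and red-to-green choice below are display assumptions, not library constants.

```python
def depth_to_bgr(z: float, z_near: float = -0.1, z_far: float = 0.1) -> tuple:
    """Map a landmark's relative depth to a BGR color for drawing with OpenCV.

    Closer points (more negative z) render red, farther points green;
    z_near/z_far are assumed display bounds clamped at the extremes.
    """
    t = (z - z_near) / (z_far - z_near)   # 0.0 = nearest, 1.0 = farthest
    t = min(max(t, 0.0), 1.0)
    return (0, int(255 * t), int(255 * (1 - t)))  # BGR: red -> green
```

The returned tuple can be passed straight to OpenCV drawing calls such as `cv2.circle` when rendering each landmark.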

Real-time Prediction

Low-latency gesture recognition for seamless interactive experiences.
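Per-frame classifier output tends to flicker on borderline frames. A common low-latency fix is majority-vote smoothing over a short window of recent predictions; this class is a hedged sketch of that technique, not code from the repository.

```python
from collections import Counter, deque

class GestureSmoother:
    """Majority-vote smoothing over the last N per-frame predictions.

    Suppresses single-frame misclassifications; the window size is a
    latency/stability trade-off (larger = steadier but slower to react).
    """
    def __init__(self, window: int = 8):
        self.history = deque(maxlen=window)

    def update(self, label: str) -> str:
        self.history.append(label)
        return Counter(self.history).most_common(1)[0][0]
```

Calling `update()` once per frame with the raw prediction yields a stable label stream suitable for driving interface actions.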

Data Recording

Built-in capability to record and save gesture data for model training.
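Recorded samples for training are typically stored as one labeled row per frame. The helper below sketches a plausible CSV layout (label followed by 63 coordinate values); the function name and on-disk format are assumptions, not the repo's exact recording format.

```python
import csv

def record_sample(path: str, label: str, landmarks) -> None:
    """Append one labeled gesture sample (label + 63 coordinates) as a CSV row.

    `landmarks` is an iterable of 21 (x, y, z) tuples, as obtained by
    iterating a MediaPipe landmark list.
    """
    row = [label]
    for x, y, z in landmarks:
        row.extend([x, y, z])
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(row)
```

Appending keeps a recording session resumable: each captured frame adds one row, and the resulting file loads directly into a training pipeline.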

Tools & Technologies

Built with industry-standard ML and computer vision tools.

MediaPipe - Hand tracking library - Complete
TensorFlow/Keras - ML framework - Complete
OpenCV - Video processing - Complete
Unity 3D - Holographic apps - Planned
Leap Motion - Advanced tracking - Planned

Requirements

What you need to get started.

Python 3.8 or higher
Webcam or camera device
MediaPipe
TensorFlow / Keras
OpenCV
NumPy
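The installation step below reads these dependencies from a requirements.txt; based on the list above, that file presumably contains package names along these lines (Keras ships with TensorFlow, so it needs no separate entry; any version pins are omitted here):

```
mediapipe
tensorflow
opencv-python
numpy
```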

Installation

Get the gesture recognition system running locally.

1. Clone the repository
   git clone https://github.com/Neeraj-x0/gestureControl.git

2. Navigate to the directory
   cd gestureControl

3. Install dependencies
   pip install -r requirements.txt

4. Run the application
   python main.py

Vision

The long-term goal is a system where users intuitively control and manipulate digital objects with natural gestures, fully integrated with holographic environments for a seamless interactive experience.

Next Steps: Expanding the gesture dataset, integrating with Unity 3D for holographic prototypes, and implementing multi-gesture sequence recognition.

Contribute to the Future

Help build the next generation of gesture-controlled interfaces.