AI-Powered Hand Tracker for Sign Language Recognition
P. Vani Manikyam¹, Pragada Tejaswini², Raavi Harika³, Kummaripalli Tirumalesh⁴, Surimalla
Dileep⁵, Padidapu Karthik⁶
¹Assistant Professor, ²³⁴⁵⁶Students
¹²³⁴⁵⁶Department of Computer Science and Engineering
Visakha Institute of Engineering and Technology
ABSTRACT: Communication feels simple for most people, but for those who rely on sign language it can quickly become difficult when others do not understand their gestures. This project aims to reduce that gap by building an AI-powered hand-tracking system for sign language recognition. The system uses a webcam to capture live video and applies MediaPipe to detect hand landmarks, focusing only on key points such as fingertips and joints rather than the entire image, which makes processing lighter and more efficient. Because gestures unfold over time, these landmarks are collected across multiple frames and then passed to a TensorFlow/Keras model for prediction. The model identifies the gesture and displays the result on the screen in real time; it also stores previous predictions, which makes the system feel more complete. The response is usually quick, though small delays or changes in lighting can affect accuracy. The system currently supports a limited set of gestures, so it works best under controlled conditions. Still, it shows that combining hand tracking with deep learning can produce a practical, usable solution, and with more data and further improvements it can be extended.

Keywords: Sign Language Recognition, Hand Tracking, MediaPipe, Deep Learning, Gesture Recognition, Computer Vision, Real-Time System, TensorFlow
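The abstract describes collecting hand landmarks over multiple frames before classification. A minimal sketch of that buffering step is shown below, assuming MediaPipe Hands' standard 21 landmarks with (x, y, z) coordinates; the window length of 30 frames and the `LandmarkBuffer` helper are illustrative assumptions, not details taken from the paper.

```python
from collections import deque

import numpy as np

# Assumed settings: MediaPipe Hands reports 21 landmarks per hand,
# each with (x, y, z); a 30-frame window is a hypothetical choice.
NUM_LANDMARKS = 21
COORDS = 3
WINDOW = 30


class LandmarkBuffer:
    """Accumulates per-frame hand landmarks until a full window is ready."""

    def __init__(self, window=WINDOW):
        # deque with maxlen acts as a sliding window over recent frames
        self.frames = deque(maxlen=window)

    def add_frame(self, landmarks):
        # landmarks: array-like of shape (21, 3) from one video frame
        arr = np.asarray(landmarks, dtype=np.float32)
        assert arr.shape == (NUM_LANDMARKS, COORDS)
        self.frames.append(arr.ravel())  # flatten to a 63-value vector

    def ready(self):
        return len(self.frames) == self.frames.maxlen

    def as_input(self):
        # Stack into shape (1, WINDOW, 63) — the batch shape a Keras
        # sequence model (e.g. an LSTM) would typically expect.
        return np.stack(self.frames)[None, ...]


# Usage with dummy frames standing in for MediaPipe output:
buf = LandmarkBuffer()
for _ in range(WINDOW):
    buf.add_frame(np.zeros((NUM_LANDMARKS, COORDS)))
print(buf.ready(), buf.as_input().shape)  # True (1, 30, 63)
```

Once `buf.ready()` is true, the stacked array could be passed to a trained model via `model.predict(buf.as_input())`; the actual model architecture is not specified in the abstract.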