AI Hand Gesture and Voice Controlled Interface
K. Esha¹, N. Bhavani², S. Hemanth³, D. Bhaskar⁴
Supervisor: B. Rajasekharam, M.Tech (Ph.D.), Assistant Professor, Dept. of CSE, VIET
¹⁻⁴Department of CSE (AIML), Visakha Institute of Engineering and Technology, Andhra Pradesh, India
Abstract - This paper presents an AI Hand Gesture and Voice Controlled Interface that enables users to interact with computers through real-time hand gestures and spoken commands, eliminating dependency on traditional input devices. The system uses computer vision via OpenCV and MediaPipe for hand landmark detection, and the SpeechRecognition library for processing voice commands. Recognized inputs are mapped to system actions using PyAutoGUI, enabling operations such as application control, media playback, scrolling, and volume adjustment. The system achieves an average gesture recognition accuracy of 90–95% with a response time of 50–100 ms, and a voice recognition accuracy of 88–93% with a 1–2 second response window. Tested under varied conditions, the system demonstrates reliable, real-time performance on standard hardware without cloud dependency, making it suitable for assistive technologies, smart environments, and contactless computing.
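To illustrate the pipeline the abstract describes, the sketch below shows the gesture-to-action mapping stage in isolation. In the full system the finger states would be derived from MediaPipe hand landmarks and the resulting action dispatched through PyAutoGUI; here the finger states are plain booleans and the gesture set and action names are illustrative placeholders, not the paper's exact vocabulary.

```python
# Minimal sketch of the gesture-to-action mapping stage (assumed design).
# Each key is a tuple of finger-extended flags (thumb..pinky); in the real
# system these flags would be computed from MediaPipe hand landmarks.
GESTURE_ACTIONS = {
    (False, False, False, False, False): "pause_media",  # closed fist
    (True, True, True, True, True): "play_media",        # open palm
    (False, True, False, False, False): "scroll_up",     # index only
    (False, True, True, False, False): "scroll_down",    # index + middle
}

def map_gesture(fingers):
    """Return the system action for a tuple of finger-extended flags,
    or None when the pose is not a recognized gesture."""
    return GESTURE_ACTIONS.get(tuple(fingers))

if __name__ == "__main__":
    print(map_gesture([True, True, True, True, True]))    # open palm
    print(map_gesture([False, True, False, False, False]))
```

A lookup table like this keeps recognition and action dispatch decoupled, so new gestures or voice commands can be added without touching the detection code.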
Key Words: Artificial Intelligence, Hand Gesture Recognition, Voice Control, Human-Computer Interaction, Computer Vision, Speech Recognition.