This project demonstrates a hand-gesture-controlled system that lets users interact with their computer through hand signals. It uses computer vision and machine learning to recognize hand gestures and translate them into mouse actions, providing an intuitive way to control a computer.
- Mouse Actions:
  - Left Click
  - Right Click
  - Double Click
  - Drag and Drop
- System Controls:
  - Adjust system brightness
  - Control system volume
- File Selection:
  - Select multiple files
- Mouse Movement:
  - Move the mouse pointer to the desired on-screen location by tracking hand gestures (see the sketch below).
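
A minimal sketch of how the cursor-movement and click features might fit together, assuming MediaPipe's standard 21-landmark hand model: the index fingertip drives the pointer and a thumb-index pinch triggers a left click. The landmark indices, pinch threshold, and absence of smoothing are illustrative assumptions, not the project's exact gesture mapping.

```python
import math

import cv2
import mediapipe as mp
import pyautogui

screen_w, screen_h = pyautogui.size()
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)                      # mirror for natural movement
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)    # MediaPipe expects RGB input
    result = hands.process(rgb)

    if result.multi_hand_landmarks:
        lm = result.multi_hand_landmarks[0].landmark
        index_tip, thumb_tip = lm[8], lm[4]         # MediaPipe landmark IDs: index tip, thumb tip

        # Map the normalized index-fingertip position to screen coordinates.
        pyautogui.moveTo(int(index_tip.x * screen_w), int(index_tip.y * screen_h))

        # A pinch (thumb close to index tip) triggers a left click; 0.04 is an assumed threshold.
        pinch = math.hypot(index_tip.x - thumb_tip.x, index_tip.y - thumb_tip.y)
        if pinch < 0.04:
            pyautogui.click()

    cv2.imshow("Gesture Mouse", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

Right click, double click, and drag-and-drop would follow the same pattern, mapping other finger combinations to `pyautogui.rightClick()`, `pyautogui.doubleClick()`, and `pyautogui.mouseDown()`/`mouseUp()`.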
- Programming language: Python
- Libraries:
  - OpenCV: For image processing and video capture.
  - NumPy: For numerical computations.
  - MediaPipe: For hand landmark detection and tracking.
  - PyAutoGUI: To simulate mouse and keyboard actions.
  - math (standard library): For geometric calculations such as distances between landmarks.
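
As one hedged example of how the math module and PyAutoGUI could combine for the system controls, the spread between thumb and index fingertips can be mapped to volume key presses. The key names, thresholds, and the choice of media keys (rather than a dedicated audio or brightness library) are assumptions for illustration.

```python
import math

import pyautogui

def adjust_volume(thumb_tip, index_tip, grow_thresh=0.15, shrink_thresh=0.05):
    """Nudge system volume based on fingertip spread.

    thumb_tip / index_tip are MediaPipe landmarks with normalized .x/.y values;
    thresholds are illustrative assumptions.
    """
    spread = math.hypot(index_tip.x - thumb_tip.x, index_tip.y - thumb_tip.y)
    if spread > grow_thresh:
        pyautogui.press("volumeup")     # wide spread -> louder
    elif spread < shrink_thresh:
        pyautogui.press("volumedown")   # tight pinch -> quieter
```

Media-key support varies by operating system, and brightness control typically requires an additional platform-specific library, so the exact backend used here may differ.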