Python

This Python application uses real-time facial tracking to control the music volume through facial gestures. Specifically:

  • Volume increases as the user’s mouth opens wider.

  • Music stops when the eyes are closed.

  • The interface displays a live webcam feed with facial landmarks, volume percentage, and a vertical volume bar.

  • Uses MediaPipe for face mesh detection, PyQt5 for the GUI, and OpenCV for image processing.

  • Compatible with macOS (using osascript for volume control). Windows may require additional setup. A minimal sketch of the landmark-to-volume pipeline follows this list.
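
Below is a minimal, illustrative sketch (not the project's actual code) of the core loop: read a webcam frame, extract MediaPipe Face Mesh landmarks, derive a mouth-opening ratio and an eye-openness ratio, and map the mouth ratio to the macOS output volume via osascript. The PyQt5 GUI is omitted, and the landmark indices and thresholds are common choices assumed for illustration, not values taken from this repository.

```python
# Sketch only: landmark indices (13/14 inner lips, 159/145 and 386/374 eyelids,
# 33/133 and 362/263 eye corners, 10/152 forehead/chin) and the thresholds are
# assumptions, not values from this project.
import subprocess

import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)
cap = cv2.VideoCapture(0)


def set_system_volume(percent: int) -> None:
    """Set the macOS output volume (0-100) with osascript."""
    subprocess.run(
        ["osascript", "-e", f"set volume output volume {percent}"], check=False
    )


while True:
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # Mouth opening: vertical gap between the inner lips, normalized by face height.
        face_height = max(abs(lm[10].y - lm[152].y), 1e-6)
        mouth_ratio = abs(lm[13].y - lm[14].y) / face_height
        # Eye openness: eyelid gap relative to eye width, averaged over both eyes.
        left = abs(lm[159].y - lm[145].y) / max(abs(lm[33].x - lm[133].x), 1e-6)
        right = abs(lm[386].y - lm[374].y) / max(abs(lm[362].x - lm[263].x), 1e-6)
        eyes_open = (left + right) / 2 > 0.15  # rough threshold, tune per camera
        if not eyes_open:
            set_system_volume(0)  # stand-in for "stop the music" when eyes close
        else:
            # 0.4 roughly corresponds to a wide-open mouth; clamp to 0-100%.
            set_system_volume(int(min(mouth_ratio / 0.4, 1.0) * 100))
    cv2.imshow("Face volume control", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

In practice the osascript call would be throttled (for example, only when the volume changes by a few percent), since spawning a process on every frame is wasteful.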

This project is intended for educational and experimental purposes. In real-world applications, it can be extended to:

  • Monitor eye state to detect drowsiness (see the sketch after this list).

  • Track mouth-opening frequency, which can be useful in areas such as:

    • Automatic warning systems

    • Statistics on students sleeping or talking during class

    • Monitoring focus in training or workplace environments
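
A combined sketch of these two extensions is shown below: it flags drowsiness when the eyes stay closed for a run of consecutive frames and counts mouth-opening events on rising edges. The per-frame inputs (whether the eyes and mouth are open) would come from landmark ratios like those above; the frame threshold is an assumption for illustration.

```python
# Illustrative extension, not part of the project: drowsiness alert after a
# sustained run of closed-eye frames, plus a counter of mouth-opening events.
CLOSED_FRAMES_FOR_ALERT = 30  # about one second at 30 FPS (assumed frame rate)


class AttentionMonitor:
    def __init__(self) -> None:
        self.closed_streak = 0        # consecutive frames with eyes closed
        self.mouth_open_events = 0    # total number of times the mouth opened
        self._mouth_was_open = False

    def update(self, eyes_open: bool, mouth_open: bool) -> bool:
        """Feed one frame's state; return True when a drowsiness alert should fire."""
        self.closed_streak = 0 if eyes_open else self.closed_streak + 1
        if mouth_open and not self._mouth_was_open:
            self.mouth_open_events += 1  # rising edge: one new mouth-opening event
        self._mouth_was_open = mouth_open
        return self.closed_streak >= CLOSED_FRAMES_FOR_ALERT
```

Feeding plain per-frame booleans keeps the monitor independent of the tracking backend, so the same counters could drive a warning sound, a classroom statistic, or a focus report.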

This application demonstrates the potential of contactless interactive systems, especially in education, healthcare, and occupational safety.