Gesture Mouse

A program that lets you control the mouse and keyboard with head movements and facial gestures. This project is based on the Google MediaPipe library (https://ai.google.dev/edge/mediapipe/solutions/guide).

Experimental

This software is experimental and not yet suitable for professional use. If you encounter a bug, please file an issue.

Installation instructions

Tested with Python 3.10 and PySide6 6.4.3.

  1. Clone the repository
  2. Execute the following commands in the repository folder:
pip install poetry
poetry config virtualenvs.in-project true
poetry install --no-root

Running Gesture Mouse

Release

  1. Extract the zip file
  2. Run gesture-mouse.exe (Windows) or ./gesture-mouse (Linux)

Development

poetry run python gui.py

Hotkeys

  • Ctrl+Alt+v: Start/Stop video and tracking
  • Ctrl+Alt+g: Enable/Disable gestures
  • Ctrl+Alt+m: Enable/Disable mouse movement
  • Ctrl+Alt+e: Enable/Disable gestures and mouse movement
  • Shift+Alt+m: Change mouse movement mode
  • Shift+Alt+r: Change mouse tracking mode
  • Shift+Alt+c: Center mouse
  • Shift+Alt+s: Switch primary screen for mouse movement
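
All hotkeys act system-wide, i.e. they work regardless of which window has focus. Purely as an illustration (not the project's actual implementation), the sketch below shows how such global shortcuts could be registered with the third-party pynput library; the handler names are hypothetical.

# Illustrative sketch only -- not the gesture_mouse implementation.
# Assumes the third-party "pynput" package (pip install pynput).
from pynput import keyboard

def toggle_tracking():
    # Hypothetical handler: start/stop video and tracking.
    print("toggle video and tracking")

def toggle_gestures():
    # Hypothetical handler: enable/disable gestures.
    print("toggle gestures")

# Register the combinations system-wide and listen in a background thread.
hotkeys = keyboard.GlobalHotKeys({
    "<ctrl>+<alt>+v": toggle_tracking,
    "<ctrl>+<alt>+g": toggle_gestures,
})
hotkeys.start()
hotkeys.join()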

Creating a release

Use the GitHub Action to create a deployment file for each platform; see .github/workflows/pyinstaller_windows.yml. Optionally, you can also create a release on GitHub with the deployment file attached.
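
The workflow essentially runs PyInstaller on gui.py. For a local test build, an invocation along the following lines can be used (the exact options live in the workflow file; the command below is only an assumption):

poetry run pyinstaller --name gesture-mouse gui.py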

Algorithms

The gesture calculation (e.g. eye blink) uses the MediaPipe facial landmark detection in combination with a modified eye aspect ratio (EAR) algorithm. The EAR makes the gestures largely invariant to head movements and rotations; a simplified sketch of the computation is shown below.

Calculation of the eye aspect ratio (EAR)
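
The classic EAR compares the vertical and horizontal extent of the eye, EAR = (|p2 - p6| + |p3 - p5|) / (2 * |p1 - p4|), where p1 and p4 are the eye corners and p2/p6 and p3/p5 are vertical landmark pairs. As an illustration only, the following sketch computes this ratio on MediaPipe face-mesh landmarks; the landmark indices and the blink threshold are assumptions, and the project's modified variant may differ.

# Simplified sketch of the classic EAR computation on MediaPipe face-mesh
# landmarks. NOT the project's exact code; the indices below are commonly
# used example values for one eye and may not match gesture_mouse.
import numpy as np

# Six landmarks around one eye: the two corners (p1, p4) and two vertical
# pairs (p2/p6, p3/p5).
EYE_INDICES = [33, 160, 158, 133, 153, 144]  # assumed example indices

def eye_aspect_ratio(landmarks: np.ndarray, eye: list[int]) -> float:
    """landmarks: (N, 3) array of face-mesh points (x, y, z) for one frame."""
    p1, p2, p3, p4, p5, p6 = landmarks[eye]
    vertical = np.linalg.norm(p2 - p6) + np.linalg.norm(p3 - p5)
    horizontal = np.linalg.norm(p1 - p4)
    return vertical / (2.0 * horizontal)

# A blink gesture can then be triggered when the EAR drops below a threshold
# for a few consecutive frames (e.g. ear < 0.2; the threshold is an assumption).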

Links and Credits

The work on GestureMouse was carried out at the UAS Technikum Wien in the course of the R&D projects WBT (MA23 project 26-02) and Inclusion International (MA23 project 33-02), which were supported by the City of Vienna.

Have a look at the AsTeRICS Foundation homepage and our other Open Source AT projects:

  • AsTeRICS: AsTeRICS framework homepage, AsTeRICS framework GitHub: The AsTeRICS framework provides much higher flexibility for building assistive solutions. The FLipMouse is also AsTeRICS-compatible, so its raw input data can be used in a different assistive solution.

  • FABI: Flexible Assistive Button Interface GitHub: The Flexible Assistive Button Interface (FABI) provides basically the same control methods (mouse, clicking, keyboard, ...), but the input is limited to simple buttons. This makes the interface very inexpensive (if you buy the Arduino Pro Micro from China, it can cost under $5).

  • FLipMouse: The FLipMouse controller: a highly sensitive finger-/lip-controller for computers and mobile devices with minimal muscle movement.

  • FLipPad: The FLipPad controller: a flexible touchpad for controlling computers and mobile devices with minimal muscle movement.

  • AsTeRICS Grid: AsTeRICS Grid AAC Web-App: an open-source, cross-platform communicator / talker for Augmentative and Alternative Communication (AAC).

Support us

Please support the development by donating to the AsTeRICS Foundation.