An open-source project built at TAMUHack 2020 with the goal of assisting visually impaired individuals in their daily lives.
This project uses the Google Cloud APIs for converting between speech and text, together with TensorFlow 2. Built entirely in Python 3, it can run on a CPU with minimal lag.
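As a minimal sketch of what the speech-to-text leg of that pipeline can look like with the `google-cloud-speech` client: the function name, audio format, and credential setup below are illustrative assumptions, not the project's exact code.

```python
from google.cloud import speech  # pip install google-cloud-speech

def transcribe_command(wav_bytes: bytes) -> str:
    """Send a short voice command (e.g. "find my water bottle") to
    Google Cloud Speech-to-Text and return the transcript.

    Assumes 16 kHz, 16-bit mono LINEAR16 audio and that the
    GOOGLE_APPLICATION_CREDENTIALS environment variable points at a
    service-account key; the project's actual audio settings may differ.
    """
    client = speech.SpeechClient()
    config = speech.RecognitionConfig(
        encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
        sample_rate_hertz=16000,
        language_code="en-US",
    )
    audio = speech.RecognitionAudio(content=wav_bytes)
    response = client.recognize(config=config, audio=audio)
    if not response.results:
        return ""
    return response.results[0].alternatives[0].transcript
```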
Using this tool, visually impaired users can treat their hands as a substitute for their eyes: they vocally tell the system what they are looking for, and the audio pipeline emits a sequence of beeps whose frequency corresponds to the distance between the individual's hand and the desired object.
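A rough illustration of that distance-to-pitch mapping is sketched below: it linearly interpolates a beep frequency from an estimated hand-to-object distance and plays a short tone. The distance range, frequency band, and the `sounddevice` audio backend are assumptions for the sake of a self-contained example, not the project's calibrated values.

```python
import numpy as np
import sounddevice as sd  # assumed audio backend; any playback library works

def distance_to_frequency(distance_m, min_d=0.05, max_d=1.0,
                          low_hz=220.0, high_hz=1760.0):
    """Map hand-to-object distance (metres) to a beep pitch.

    Closer distances produce higher-pitched beeps; the constants here
    are illustrative, not the project's tuned values.
    """
    d = np.clip(distance_m, min_d, max_d)
    # Linear interpolation: max_d -> low_hz, min_d -> high_hz.
    t = (max_d - d) / (max_d - min_d)
    return low_hz + t * (high_hz - low_hz)

def beep(frequency_hz, duration_s=0.15, sample_rate=44100):
    """Synthesize and play a single sine-wave beep."""
    t = np.linspace(0, duration_s, int(sample_rate * duration_s), endpoint=False)
    tone = 0.3 * np.sin(2 * np.pi * frequency_hz * t)
    sd.play(tone.astype(np.float32), sample_rate)
    sd.wait()

# Example: beeps rise in pitch as the hand approaches the target.
for d in (0.8, 0.5, 0.2, 0.1):
    beep(distance_to_frequency(d))
```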
Items used in the demonstration include an empty plastic bottle, an open MacBook lid, and an iPhone.