The "Open Eyes" project aims to develop an app that gives blind users a better understanding of nearby objects and text, easing everyday inconveniences through object and document recognition on the Apple Watch.
In particular, by delivering braille notifications through vibration, the smartwatch lets blind users receive short pieces of information in braille while out and about. To compensate for the limitations of vibration-based braille, the braille and VoiceOver functions are integrated so that external information still reaches the user.
The core technologies of this project are OCR with summarization, object detection, and a braille implementation combined with a TTS function.
When the user photographs a document such as a notice or a medicine bottle, OCR converts the picture into text. If the text is long, the summarization module condenses the original so that the document's key information can be delivered to the user briefly.
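A minimal sketch of the summarization step, assuming a simple frequency-based extractive approach; the project's actual summarization model is not specified here, and the `summarize` function and its scoring are illustrative only:

```python
import re
from collections import Counter

def summarize(text, max_sentences=2):
    """Pick the highest-scoring sentences by word frequency (extractive sketch)."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    if len(sentences) <= max_sentences:
        return text.strip()
    freq = Counter(re.findall(r"\w+", text.lower()))
    # Score each sentence by the summed corpus frequency of its words.
    ranked = sorted(
        range(len(sentences)),
        key=lambda i: -sum(freq[w] for w in re.findall(r"\w+", sentences[i].lower())),
    )
    keep = sorted(ranked[:max_sentences])  # preserve original sentence order
    return " ".join(sentences[i] for i in keep)
```

A short summary like this is also what keeps the braille/haptic output practical, since vibrating long passages letter by letter would be slow.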
Object recognition through the camera helps the user understand the surrounding situation: by identifying obstacles or the number of people nearby, visually impaired users can respond more smoothly to what is around them.
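One way the detection results could be turned into a short announcement is sketched below. The `(label, confidence)` input format and the `describe_scene` function are assumptions for illustration; they stand in for the output of whatever on-device detection model the app uses:

```python
from collections import Counter

def describe_scene(detections, min_confidence=0.5):
    """Turn raw (label, confidence) detections into a short summary string
    suitable for braille or TTS output. Naive pluralization on purpose."""
    counts = Counter(label for label, conf in detections if conf >= min_confidence)
    if not counts:
        return "no objects detected"
    parts = [f"{n} {label}{'s' if n > 1 else ''}" for label, n in counts.most_common()]
    return ", ".join(parts)
```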
The braille implementation represents braille cells through haptics and vibration on the Apple Watch, which gives visually impaired users better privacy than a voice read-out when they obtain information from their phone.
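A sketch of how text could be mapped to braille cells and then to a haptic pulse sequence. The six-dot cells below follow standard braille for a-e, but the long/short pulse encoding is an assumption of this sketch, not the app's specification (on the watch itself this would drive WatchKit haptics rather than return strings):

```python
# Standard 6-dot braille cells for a few letters (dots numbered 1-6,
# 1-2-3 down the left column, 4-5-6 down the right). Demo subset only.
BRAILLE = {
    "a": {1},
    "b": {1, 2},
    "c": {1, 4},
    "d": {1, 4, 5},
    "e": {1, 5},
}

def to_haptic_pattern(text):
    """Encode text as pulse strings: scan dots 1-6 per letter and emit a
    long pulse ("L") for a raised dot, a short one ("s") for a flat dot."""
    pattern = []
    for ch in text.lower():
        cell = BRAILLE.get(ch)
        if cell is None:
            continue  # skip characters outside the demo subset
        pattern.append("".join("L" if d in cell else "s" for d in range(1, 7)))
    return pattern
```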
The TTS function supplements the limitations of providing information through braille: where vibration-based braille is too slow or ambiguous, the same information is read aloud so it reaches the user more clearly.
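The braille/TTS complement described above could be routed as follows. The length threshold and the `choose_output` function are purely illustrative assumptions; the real app's routing policy is not stated in this document:

```python
def choose_output(message, braille_limit=20):
    """Route a message to braille haptics or TTS.

    Braille via vibration is slow per character, so this sketch assumes only
    short messages (e.g. "door ahead") are vibrated, while longer ones fall
    back to speech. Returns (channel, message)."""
    if len(message) <= braille_limit:
        return ("braille", message)
    return ("tts", message)
```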
| Technology | |
|---|---|
| Application | |
| Backend | |
| Model | |
| Others | |
| Name | Role |
|---|---|
| 👩🏻💻김지우 | Summarization |
| 👩🏻💻박정연 | Object Detection, Backend |
| 🧑🏻💻서정덕 | Application |
| 👩🏻💻신수인 | OCR, Backend |