This repository has been archived by the owner on Jan 12, 2019. It is now read-only.
Hi, I want to have a go at implementing accessibility for blind users. I want to give voice instructions as per my suggestion in card-io/card.io-iOS-SDK#202 (comment).
I am just starting out so maybe this is not possible, but the plan is to:
Find where in the code the rectangles are detected.
Expose the position of the largest detected rectangle, relative to the screen, to the iOS / Android specific clients.
In the iOS client, translate that relative position into localised voice instructions.
Read out the commands using an accessibility announcement.
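Steps 2 and 3 of the plan above could be sketched as a small pure function. This is only an illustration, not part of the card.io codebase: the function name, the normalised-coordinate convention, and the thresholds are all assumptions about how the detector's output might be exposed.

```cpp
#include <string>

// Hypothetical helper (not a card.io API): map the detected card
// rectangle's centre (cx, cy in normalised 0..1 screen coordinates)
// and its width as a fraction of the screen width to a short
// guidance string. Thresholds are illustrative guesses.
std::string guidanceFor(double cx, double cy, double widthFraction) {
    const double tol = 0.1;                     // tolerated off-centre drift
    if (widthFraction < 0.6)  return "move closer";
    if (widthFraction > 0.95) return "move further away";
    if (cx < 0.5 - tol) return "move right";    // card sits left of centre
    if (cx > 0.5 + tol) return "move left";
    if (cy < 0.5 - tol) return "move down";
    if (cy > 0.5 + tol) return "move up";
    return "hold steady";
}
```

On iOS, the resulting (localised) string could then be announced with `UIAccessibilityPostNotification(UIAccessibilityAnnouncementNotification, text)`, which reads the text aloud when VoiceOver is running.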
My questions are:
Is there anything that will make this super hard?
Is the card rectangle still detected if it's slightly offscreen?
Where in the code should I start out?
@rsaunders100 I'm no longer close enough to the code to directly answer your questions. But we've added a Documentation section to our README file, which includes a link to a video you should find helpful.