I've created a custom rich text editor using Core Text and conforming to the UITextInput protocol. I'm trying to figure out how to make it accessible to VoiceOver.
From my reading of the Accessibility Programming Guide and the UIAccessibility protocol and associated classes, it looks like I should make my custom view conform to the UIAccessibilityContainer protocol and then create UIAccessibilityElements for the text. My question is - what level of text granularity do I return? Should a UIAccessibilityElement represent a line, a word, a character, or something else?
Playing with the Notes app, I see that VoiceOver lets you navigate its text by line, word, character, and more. UIAccessibilityContainer/UIAccessibilityElement seem to have no way to convey this granularity. So, I feel like I'm missing something. Any suggestions or pointers are very much appreciated.
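For context, here is a minimal sketch of the container approach I describe above, exposing one UIAccessibilityElement per laid-out line. The helpers `-lineCount`, `-textForLine:`, and `-frameForLine:` are hypothetical stand-ins for whatever the Core Text layout provides:

```objc
// The custom view acts as an accessibility container, not a leaf element.
- (BOOL)isAccessibilityElement {
    return NO;
}

- (NSInteger)accessibilityElementCount {
    return [self lineCount]; // hypothetical helper: number of laid-out lines
}

- (id)accessibilityElementAtIndex:(NSInteger)index {
    UIAccessibilityElement *element =
        [[UIAccessibilityElement alloc] initWithAccessibilityContainer:self];
    element.accessibilityLabel = [self textForLine:index];
    // accessibilityFrame must be in screen coordinates.
    element.accessibilityFrame =
        [self convertRect:[self frameForLine:index] toView:nil];
    element.accessibilityTraits = UIAccessibilityTraitStaticText;
    return element;
}

- (NSInteger)indexOfAccessibilityElement:(id)element {
    // Assumes the elements above are cached in an NSArray ivar in real code;
    // shown here only to complete the UIAccessibilityContainer protocol.
    return [self.cachedAccessibilityElements indexOfObject:element];
}
```

This exposes each line as a static-text element, but by itself it does not give VoiceOver the word- and character-level reading that Notes has, which is the crux of my question.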
asked 8 November 2011 at 18:11
To make the content readable the way the Notes app does, you must also implement the UIAccessibilityReadingContent protocol on your custom view. See the UIAccessibilityReadingContent protocol reference for details. For more background, try to find the WWDC 2011 iOS Accessibility session video.
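A rough sketch of the four UIAccessibilityReadingContent methods, assuming hypothetical Core Text helpers `-lineIndexAtPoint:`, `-textForLine:`, `-frameForLine:`, and `-fullText` on your custom view:

```objc
// UIAccessibilityReadingContent lets VoiceOver read the view line by line
// (and drive finer word/character navigation) without per-line elements.
- (NSInteger)accessibilityLineNumberForPoint:(CGPoint)point {
    // Return NSNotFound if no line lies under the point.
    return [self lineIndexAtPoint:point];
}

- (NSString *)accessibilityContentForLineNumber:(NSInteger)lineNumber {
    return [self textForLine:lineNumber];
}

- (CGRect)accessibilityFrameForLineNumber:(NSInteger)lineNumber {
    // Must be returned in screen coordinates.
    return [self convertRect:[self frameForLine:lineNumber] toView:nil];
}

- (NSString *)accessibilityPageContent {
    // The entire text of the view, used for continuous reading.
    return [self fullText];
}
```

With these in place, VoiceOver handles the line/word/character granularity itself, so you don't need to model that granularity in your UIAccessibilityElement hierarchy.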