Improve accessibility of the Terminal View by making Reading Mode navigation more intuitive and add accessibility actions for the context menu #4104
base: master
Conversation
Added a second patch which adds two Accessibility Actions to announce (i.e. speak aloud) the cursor position and the full line the cursor is on, respectively. As of right now, there is nothing on the Android UI side to tell you this. In the meantime, I'm continuing to explore more "intended" ways to represent a text area with a terminal-style cursor, but I keep hitting roadblocks and dead ends; hence these two accessibility actions. (This commit also reordered the actions and changed some resource strings and IDs added in the previous commit; I can shift the diff chunks around or squash the whole branch, just let me know what works best for reviewers.)
Added a couple of new commits for extra keys (Esc, End, Ctrl, Alt, etc.):

Force-pushed from ec3ccd7 to d6685e9 (Compare)
Improved reading mode navigation in `TerminalView`, allowing char/word/paragraph navigation to work intuitively (e.g. *activate* -> *paragraphs* -> *swipe up* will take you to the last line of output). Navigate by *Controls* or Touch to get out of the Terminal View. Added *Copy Screen*, *Paste* and *Termux Menu* accessibility actions on the `TerminalView`, which *TalkBack* will now report.
There's a bit of a discrepancy between where the "accessibility" cursor is and where the true terminal cursor is. So I added a couple of actions to help figure out where your text will go, or what it is you've just typed. Without them, there is currently nothing on the Android UI side to tell you where the cursor is.
…he character under cursor

In the absence of something better, "Speak Cursor Position" doing its best to read what is under the cursor, in combination with "Speak Cursor Line", should give you a reasonable idea of where your input will go and what you might want to delete. If anyone brings up "TextEdit works, use that", be aware we're dealing with something like 3-5 different cursors (the terminal's, the selection handles, and the "where was I reading" accessibility cursor, which may have a beginning and an end). If you need a line spelled out, these two accessibility actions should help you get your "what am I saying" accessibility cursor onto the right paragraph and ask it to spell out what's there.
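As a rough illustration of what speaking "the character under the cursor" could sound like, here is a minimal, hypothetical helper (the class and method names are mine, not the PR's code): it combines the printable form, the U+ code point, and the Unicode character name from `Character.getName`.

```java
public final class CursorSpeech {
    /**
     * Build a spoken description of the character under the cursor,
     * e.g. "A, U+0041, LATIN CAPITAL LETTER A".
     * Hypothetical sketch; not the PR's actual implementation.
     */
    public static String describeCodePoint(int codePoint) {
        String name = Character.getName(codePoint); // null for unassigned code points
        if (name == null) name = "UNKNOWN CHARACTER";
        String printable = Character.isISOControl(codePoint)
                ? "control character"
                : new String(Character.toChars(codePoint));
        return String.format("%s, U+%04X, %s", printable, codePoint, name);
    }
}
```

A screen reader could then be handed that string via `View.announceForAccessibility`.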
This makes the additional soft keys behave like soft keyboard keys (if Android and TalkBack are recent enough). There is a setting where text entry keys may be activated on touch up, activated on double tap, or a mix of both. This makes the soft arrow keys easier to use, since there's less tapping.
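The "behave like soft keyboard keys" part maps onto the platform's text-entry-key concept. A minimal sketch of what that looks like for a hypothetical key view of our own (the override placement is illustrative, not the PR's exact code); `AccessibilityNodeInfo.setTextEntryKey` requires API 29:

```java
@Override
public void onInitializeAccessibilityNodeInfo(AccessibilityNodeInfo info) {
    super.onInitializeAccessibilityNodeInfo(info);
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
        // Tell services like TalkBack this view behaves like a key on a
        // soft keyboard, so it can honour the user's touch-up/double-tap setting.
        info.setTextEntryKey(true);
    }
}
```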
…s (CTRL, ALT, etc)

Force-pushed from d6685e9 to 94dc606 (Compare)
Rebased to latest master. Added an extension to the second commit in the series (the one which adds "Speak Cursor Position" and "Speak Cursor Line") which also speaks a Unicode description of the character under the cursor, in order to properly explain where your input will go; that can otherwise be rather unclear, since the emulated terminal's cursor position and the various Android cursors and accessibility cursors aren't in sync, and there's nothing to sync them right now.

Overall changes up to this point, all applicable when an accessibility service is running:

tl;dr: it's a lot easier to read the output of the last command, and to get to the copy/paste/more… actions.
Problem
It's a bit iffy to navigate the wall of text in the terminal view. As things stand right now with content description, accessibility services see the `TerminalView` as an opaque blob with a wall-of-text description. Since it's a description, there is nothing locking you in the control when using reading controls (like you would be in a `TextView`). Additionally, if I read the documentation of `setContentDescription` right, it is really meant to offer a description of non-textual elements, or of irritatingly opaque content; not to be the content itself. This is probably irrelevant, as the goal was to make it easier to get to the output of the last command.
It is also nigh impossible to notice the popup menu (copy/paste/more…) if you don't… see it. It ends up at the end of the control stack and it is not announced (if I understand correctly, you have to issue an accessibility focus event, but the accessibility of the selection handle thing is a whole different problem to the one I'm trying to solve here).

Solution
The goal was to make the `TerminalView` behave more like a text view:

- …what the system guesses, within the terminal view itself
- …too much (e.g. by word)
- …after activating the view (since it no longer "escapes")
- …lets you flick between the view, soft keys (CTRL, ALT, etc.), the copy/paste/menu popup, your soft keyboard, etc.; reading controls allow you to navigate a wall of text within a control, i.e. the terminal view
- …actions; note: these are only apparent if TalkBack is running
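At the View API level, making the terminal behave like a text view boils down to a handful of framework overrides. A simplified sketch (the override signatures are Android framework; the bodies and the resource id/string are illustrative placeholders, not the PR's exact code):

```java
@Override
public void onInitializeAccessibilityNodeInfo(AccessibilityNodeInfo info) {
    super.onInitializeAccessibilityNodeInfo(info);
    // Present the terminal as a multiline text container,
    // not opaque eye candy that may be skipped over.
    info.setMultiLine(true);
    info.setText(getText());
    // Custom actions TalkBack can list under its Actions reading mode.
    info.addAction(new AccessibilityNodeInfo.AccessibilityAction(
            R.id.action_copy_screen,
            getContext().getString(R.string.action_copy_screen)));
}

@Override
public void onPopulateAccessibilityEvent(AccessibilityEvent event) {
    super.onPopulateAccessibilityEvent(event);
    // Ship the text along with TYPE_VIEW_TEXT_CHANGED events, which the view
    // fires via sendAccessibilityEvent(...) whenever the screen changes.
    event.getText().add(getText());
}
```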
Code Changes
Actual changes, on `TerminalView`:

- Set the important-for-accessibility flag: the view shows things which are important to accessibility, and it is the main focus of the whole app. The documentation of that method asks to set the flag to `YES` if a view does any of the things below.
- `sendAccessibilityEvent(TYPE_VIEW_TEXT_CHANGED)` when text changes, and implement `onPopulateAccessibilityEvent` to let accessibility services know the text has changed (and what it changed to).
- Implement `onInitializeAccessibilityNodeInfo`, which is called by accessibility services to detect what is on screen; this gives us a chance to explain that this `TerminalView` is a multiline text container and not some random eye candy that may be skipped over.
- Removed the content description in favour of communicating by way of `AccessibilityNodeInfo` and `AccessibilityEvent`; replaced with all of the above.
- Added accessibility actions to copy the screen, paste, and open the termux menu (the More… menu). Accessibility Actions are something an Accessibility Service may query and present in a way that makes sense to the user. In the case of TalkBack, it has an Actions "reading mode".
- Added new resource strings to `strings.xml` and added 3 new IDs to `ids.xml` to support the custom accessibility actions.
- In `onTouchEvent`, branch on `BUTTON_TERTIARY`, to use it for the accessibility paste action.
- In `getText()`, check `mEmulator` exists, otherwise return blank; TalkBack might get there first.

Limitations
The Copy Screen action is a compromise since `TerminalView` doesn't report on the cursor. I'm still going through TalkBack's code trying to understand what it actually expects for things to work right. There are actions which are hard-coded to the class being `android.widget.TextEdit` (source1, source2); but I'm still trying to understand what's going on there and in the braille code; also, that's for TalkBack 14, and there are many versions of TalkBack in use out there. In any case, the correct way to track the cursor between the terminal emulator's scrollback and the accessibility view of the text is a whole separate opinionated discussion, which could be left for later…
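If someone does pick this up later, one conceivable direction (purely a sketch whose behaviour I have not verified against TalkBack; the variable names are mine) would be to report the emulator's cursor through the node info's text selection:

```java
// Hypothetical: map the terminal cursor (row, col) onto an offset into the
// text exposed via AccessibilityNodeInfo.setText(), assuming one trailing
// newline per screen row.
int cursorOffset = cursorRow * (columns + 1) + cursorCol;
info.setTextSelection(cursorOffset, cursorOffset);
```

Whether services actually read that back for a non-`EditText` class is exactly the open question above.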
Something ought to be done about the side panel, since you rather have to
know it's there. But since the More... menu is now easily discoverable,
someone with TalkBack might stumble onto the Help panel and read that there's
a secret side panel on the left. That wouldn't be much different to the way
I myself discovered that sessions side panel on the left :-)
It would be worth adding landmarks for the terminalview, button bar and
side panel, but that's a different PR, since it probably needs to be done
in the Activity, not here.
There's also something iffy about the fact that `AccessibilityManager.isEnabled()` is only checked at startup. Maybe it should be documented somewhere that accessibility is checked only at startup, for performance reasons. For people who turn TalkBack on after launching the app, it might appear broken, even though it really isn't. I didn't want to touch that behaviour since it was something requested in the review of !344; I am merely stating my dissent. To add to `mAccessibilityEnabled`: the a11y-related method overrides are simply not called if there's no active accessibility service, hence no checks are needed there.
Someone also mused about having the terminal speak out text as it comes in,
but that's also a different PR, since it rather requires interplay between
the TerminalEmulator code and TerminalView (providing accessibility), plus
setting up a virtual view that is live; and I assume people might want to
turn that off / control verbosity, which is a lot of work, so... not in this
PR.
It is theoretically possible to do partial text updates in `onPopulateAccessibilityEvent` (instead of blasting away the whole screen each time you type a character), but I'm not convinced it actually works the way I hope, and it requires a lot of work to get there (see previous paragraph). So… not in this PR.
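For the record, a partial update means filling in the event's from-index / removed-count / added-count (the values `AccessibilityEvent.setFromIndex`, `setRemovedCount`, and `setAddedCount` expect) instead of resending everything. The index arithmetic is the fiddly part; a minimal, hypothetical helper (plain Java, names are mine) that derives those three values from the old and new screen text:

```java
public final class TextDiff {
    /**
     * Returns { fromIndex, removedCount, addedCount } for a single
     * contiguous change between oldText and newText — the shape an
     * AccessibilityEvent partial text update wants. Hypothetical sketch.
     */
    public static int[] diff(String oldText, String newText) {
        // Length of the common prefix.
        int start = 0;
        int maxStart = Math.min(oldText.length(), newText.length());
        while (start < maxStart && oldText.charAt(start) == newText.charAt(start)) start++;
        // Length of the common suffix, not overlapping the prefix.
        int oldEnd = oldText.length(), newEnd = newText.length();
        while (oldEnd > start && newEnd > start
                && oldText.charAt(oldEnd - 1) == newText.charAt(newEnd - 1)) {
            oldEnd--;
            newEnd--;
        }
        return new int[] { start, oldEnd - start, newEnd - start };
    }
}
```

This only describes one contiguous edit; a full-screen redraw still degenerates to "everything changed", which is part of why it's not in this PR.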
But this PR makes interacting with the TerminalView itself much nicer
if you can't see what you're doing, and there's no reason to require people
to see what they're doing in this case.