@storm @lorabe @danielst @devinprater @wizzwizz4 @csepp @pvagner I just donated 50 USD to https://puri.sm/fund-your-app/ to fund "Software Optimizations: Accessibility support/screen reader support with phosh/gtk4" @purism I think we need to find ways to fund a11y support in GTK4; sticking with older unsupported versions is not going to be sustainable in the long term.
@storm @lorabe @danielst @devinprater @wizzwizz4 @csepp @pvagner @purism I just talked to someone at Purism and they are positive about supporting it, as it aligns with their goals. They are asking me for a list of priorities. I suggested a screen reader, but if you all, who need this more than I do, can create a prioritized list of accessibility features, then I can share it with them.
@praveen @storm @lorabe @danielst @devinprater @wizzwizz4 @csepp In relation to the @purism #librem5: the most prominent and most difficult feature to implement would be #accessibility-aware touch input support. To be productive, we need to be able to explore the screen content before activating touch controls.
@pvagner @praveen @storm @lorabe @danielst @devinprater @csepp @purism What would the UI for that be like? "Single tap reads, double tap activates"? (Would there be a clicking noise when you tap something, or does it just read straight away?)
From what I can tell, the stuff I've described wouldn't be that hard to implement, assuming a correct AT-SPI2 implementation in the application. In Firefox, you'd be able to "see through walls" (be told about things in hidden tabs) until that bug is fixed.
@wizzwizz4 @praveen @storm @lorabe @danielst @devinprater @csepp @purism Single tap / touch / hover would read what's under the finger, provided there is enough text / accessibility support within the underlying control. Double tap should activate. There should also be a way to assign other touch gestures to screen reader actions, such as text review commands.
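The explore-by-touch model described above (tap/hover announces, a quick second tap on the same control activates) can be sketched as plain dispatch logic, independent of any toolkit. This is a minimal illustration, not phosh or AT-SPI2 code; the `announce` and `activate` callbacks and the double-tap window are hypothetical stand-ins for whatever the screen reader would actually wire up.

```python
import time

# Assumed double-tap threshold in seconds; a real implementation would
# take this from the platform's gesture settings.
DOUBLE_TAP_WINDOW = 0.35

class ExploreByTouch:
    """Sketch of explore-by-touch: the first tap on a control announces
    it; a quick second tap on the same control activates it.
    `announce` and `activate` are hypothetical callbacks."""

    def __init__(self, announce, activate):
        self.announce = announce
        self.activate = activate
        self._last_target = None
        self._last_time = 0.0

    def on_tap(self, target, now=None):
        now = time.monotonic() if now is None else now
        if (target == self._last_target
                and now - self._last_time < DOUBLE_TAP_WINDOW):
            self.activate(target)        # second quick tap: activate
            self._last_target = None
        else:
            self.announce(target)        # first tap / hover: read aloud
            self._last_target = target
            self._last_time = now
```

A real version would also handle hover (finger dragging across controls) by calling `announce` for each newly entered control without arming activation.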
@pvagner
Can you file an issue at https://gitlab.gnome.org/World/Phosh/phosh/ and add these details? That will make it easier for them to follow up.
@wizzwizz4 @storm @lorabe @danielst @devinprater @csepp @purism
@pvagner @wizzwizz4 @storm @lorabe @danielst @devinprater @csepp @purism Looks like https://gitlab.gnome.org/World/Phosh/phosh/-/issues/47 was closed by mistake; it has been reopened now. Please add your ideas there.
@praveen @wizzwizz4 @storm @lorabe @danielst @devinprater @csepp @purism I have added the ideas discussed here to that issue. Thanks for coordinating.
@pvagner @praveen @wizzwizz4 @storm @lorabe@floss.social @devinprater@dragonscave.space @csepp @purism
(1/4) While Purism is overwhelmed, understaffed and underfunded, I could actually imagine that GTK4 makes a11y simpler in the long run. Why? Purism created libhandy, now libadwaita in GTK4, providing consistent, complex, advanced, themeable controls, automatically adapting whole dialogs between mobile and desktop form factors.
@pvagner @praveen @wizzwizz4 @storm @lorabe@floss.social @devinprater@dragonscave.space @csepp @purism
(2/4) libadwaita controls know about their state, e.g. the settings dialog knows it's currently in the WiFi sub-dialog, even if the menu is hidden on mobile. Apps using those controls automatically benefit from all improvements there, be it default gestures or screen reader integration.
@pvagner @praveen @wizzwizz4 @storm @lorabe@floss.social @devinprater@dragonscave.space @csepp @purism
(3/4) Question: is tap-to-read, double-tap-to-activate really a good approach? It implies you have to tap around a lot to find things. With libadwaita it should be possible to do something like "read out the top-level items". For gnome-settings in desktop mode, it would read "menu" and "content: WiFi", indicating that WiFi is the selected menu item.
@pvagner @praveen @wizzwizz4 @storm @lorabe@floss.social @devinprater@dragonscave.space @csepp @purism
(4/4) In the mobile view, only the menu or the content is visible at a time, starting with the menu. Thus, it would instead directly read out the available items, possibly assisted by gestures, e.g. tap: stop and re-read the current item; swipe up: read the previous item; swipe down: continue reading; swipe right: select. It would then continue by reading the top-level items, either settings or groups of settings, inside WiFi.
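The gesture-driven linear reading described in (4/4) can be sketched as a small state machine walking a flat list of top-level items. This is an illustrative assumption, not phosh or Orca code; the gesture names and the returned strings are hypothetical.

```python
class LinearReader:
    """Sketch of the linear reading model: the reader walks a flat list
    of top-level items; gestures move through the list or select the
    current item. Gesture names are illustrative assumptions."""

    def __init__(self, items):
        self.items = items
        self.index = 0        # start at the first item
        self.selected = None

    def on_gesture(self, gesture):
        if gesture == "tap":            # stop and re-read current item
            return self.items[self.index]
        if gesture == "swipe_down":     # continue reading: next item
            self.index = min(self.index + 1, len(self.items) - 1)
            return self.items[self.index]
        if gesture == "swipe_up":       # read the previous item
            self.index = max(self.index - 1, 0)
            return self.items[self.index]
        if gesture == "swipe_right":    # select the current item
            self.selected = self.items[self.index]
            return f"selected {self.selected}"
        return None
```

In a real screen reader, "select" would descend into that item's own list of children (e.g. the settings inside WiFi) and start reading again, and repeated quick swipes could skip several items at once, as described in the follow-up post.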
@danielst @praveen @wizzwizz4 @storm @lorabe @devinprater @csepp @purism Hover or single tap to explore and double tap to activate is the typical interaction model on iOS and Android so far. I may very well be misunderstanding, but what you are suggesting reads the whole screen at once and lets users influence that reading.
@pvagner @praveen @wizzwizz4 @storm @lorabe@floss.social @devinprater@dragonscave.space @csepp @purism
For example, the label of an input field is not independent, but directly related.
In my example, you could also swipe down five times very quickly to skip reading five items if you already have a clue. And of course a keyboard or speech recognition could be used to jump directly to relevant items. But the structural info "this is a menu" should help no matter what your preferred approach is.
@danielst
I think continuing this discussion at https://gitlab.gnome.org/World/Phosh/phosh/-/issues/47 will be better; if any info we discussed here is missing there, please add it.
@pvagner @wizzwizz4 @storm @lorabe @devinprater @csepp @purism
@danielst @pvagner @praveen @storm @lorabe @devinprater @csepp @purism Yeah, it already does that; that's standard for AT-SPI2 screen readers (i.e. Orca, the only AT-SPI2 screen reader). This is talking about additional behaviour for touchscreens.