I have an odd use case. I’d like to use Amethyst for audio games targeted at blind/visually-impaired players. One feature I’d like to add is speech access to Amethyst UI components. Essentially, this will involve making the UI’s event model a bit richer, intercepting UI events, and speaking some meaningful text. For instance, hovering over a button labelled OK might speak “OK: button” via an as-yet-unwritten text-to-speech crate.
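To make that concrete, here’s roughly the event-to-speech mapping I’m imagining. `UiEventType` is Amethyst’s existing UI event enum; `utterance_for` is a hypothetical helper, and the hard-coded “button” role is purely illustrative, since today’s events don’t carry role information (that’s part of what I mean by making the event model richer):

```rust
use amethyst::ui::UiEventType;

/// Hypothetical mapping from a widget's label and a UI event to spoken text.
fn utterance_for(label: &str, event_type: &UiEventType) -> Option<String> {
    match event_type {
        // Hovering a labelled widget would announce its label and role.
        UiEventType::HoverStart => Some(format!("{}: button", label)),
        // Activation gets its own announcement.
        UiEventType::Click => Some(format!("{} activated", label)),
        // Everything else stays silent for now.
        _ => None,
    }
}
```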
I’d like to make this work as generic and easy to integrate as possible. It looks like the current UI generates a handful of events. I imagine most of those will be handled in states specific to the UI (i.e. a launch menu will intercept events and act on them), but what I’d really like is a way to globally intercept all UI events outside of a state, then act on those events in ways that don’t have any side effects. I’d also like to make it as easy and as set-and-forget as possible. In other words, I don’t want to have to integrate accessibility into every UI state. I’d rather grab the UI event channel on startup, plug in my handler, and have every UI become automatically accessible.
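Here’s a minimal sketch of the kind of global handler I mean: one system that drains every `UiEvent` from the shared channel, assuming Amethyst’s `EventChannel`/`ReaderId` APIs roughly as in recent releases. `speak` is just a stub standing in for the unwritten TTS crate:

```rust
use amethyst::{
    ecs::{Read, ReadStorage, System, SystemData, World},
    shrev::{EventChannel, ReaderId},
    ui::{UiEvent, UiEventType, UiText},
};

// Stand-in for the as-yet-unwritten text-to-speech crate.
fn speak(utterance: &str) {
    println!("(tts) {}", utterance);
}

#[derive(Default)]
pub struct SpeechSystem {
    reader: Option<ReaderId<UiEvent>>,
}

impl<'s> System<'s> for SpeechSystem {
    type SystemData = (
        Read<'s, EventChannel<UiEvent>>,
        ReadStorage<'s, UiText>,
    );

    fn run(&mut self, (events, texts): Self::SystemData) {
        let reader = self.reader.as_mut().expect("setup not called");
        for event in events.read(reader) {
            // Announce hover for any widget that carries a UiText label;
            // clicks, focus changes, etc. could be handled the same way.
            if let UiEventType::HoverStart = event.event_type {
                if let Some(label) = texts.get(event.target) {
                    speak(&format!("{}: button", label.text));
                }
            }
        }
    }

    fn setup(&mut self, world: &mut World) {
        Self::SystemData::setup(world);
        self.reader = Some(
            world.fetch_mut::<EventChannel<UiEvent>>().register_reader(),
        );
    }
}
```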
Is there any way to achieve this? I don’t think systems are what I’d want, since I’d then need to integrate the handler into every system. Likewise with states. Ideally I could just create an amethyst_accessibility crate that exports a resource which, when added to the world, automatically injects accessibility helpers that work across states and systems from the moment the game launches. But I don’t know whether this is currently achievable, or what I’d need to request to make it possible.
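For anyone who knows the engine internals better: is something like this expressible today? As a sketch under the same assumptions as above, the closest existing hook I can see is a `SystemBundle` (rather than the bare resource I described) that wires the speech system in once:

```rust
use amethyst::{
    core::bundle::SystemBundle,
    ecs::{DispatcherBuilder, World},
    error::Error,
};

/// Hypothetical public surface of an amethyst_accessibility crate.
pub struct AccessibilityBundle;

impl<'a, 'b> SystemBundle<'a, 'b> for AccessibilityBundle {
    fn build(
        self,
        _world: &mut World,
        builder: &mut DispatcherBuilder<'a, 'b>,
    ) -> Result<(), Error> {
        // Registers the global speech system once; it then sees UI events
        // from every state for the lifetime of the game.
        // `SpeechSystem` is the system from the sketch above.
        builder.add(SpeechSystem::default(), "speech_system", &[]);
        Ok(())
    }
}
```

A game would then opt in with a single `with_bundle(AccessibilityBundle)` call on its `GameDataBuilder`, and every state’s UI would be spoken automatically. If that’s not achievable today, I’d love to know what the right request would be.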