I notice this so much in companies, big companies, who continuously have to justify their accessibility teams. "A screen reader isn't just for blind people, people who are outside in the sun and can't see the screen clearly use this!" Maybe, but that's not its main use case. That's basically sidelining blind people. Making our disability not even count. But it does. I have to live with it every minute of every day.

@devinprater When they put people who have, maybe, a 15-minute disability over someone who has been disabled all their lives, it's a slap in the face to all actually disabled people. And then they hire non-disabled people for customer support, and they ask for a video showing a Braille issue? As if the words of a disabled person aren't enough, and a video showing a bunch of dots moving around will be any use to a sighted person. That's Google I'm talking about there, specifically.

@devinprater So yeah, I mean, big companies need to learn to check their privilege. But big FOSS projects do too. I've pointed out the accessibility issues in Logseq, Mesh Central, and quite a few other projects that I either rely on, should be using, or that my boss uses, so it'd be so much easier if I could use them too. But no. Why listen to a lone user with weird needs that no one else has?


@devinprater If you got some edge-case report from a sighted person that your app isn't working on their size of monitor, would you flat out ignore them? No, you'd probably check and make sure your app scales to that size. Simple crap like that. What? You regularly test to make sure your app scales? Then why the cat don't you regularly check accessibility?

@devinprater@devin.masto.host Because checking accessibility is way harder than quickly resizing a window.

Also, implementing is even harder.

Case in point: my current game has no accessibility support other than being deliberately color-blind friendly by design. Being a platforming game, vision is certainly required, and that is not negotiable. However, screen reader support could still be great - e.g. it would help people who can understand but not read the text (e.g. young children). Too bad existing screen reader APIs really wouldn't work well for it.

At the very least though I added visual cues to parts that had relied on audio. Should have thought of that earlier though...

@devinprater@devin.masto.host Yeah, in games it is extra tricky. One person's accessibility feature is another person's cheat.

It then helps to decide what the game is about (and thus cannot be removed), and which sources of difficulty are intentional.

@divVerent @devinprater What kind of screen-reader API do you think would be useful for your game?

@wizzwizz4@fosstodon.org @devinprater@devin.masto.host I would like two things:

- Events that can trigger TTS in game. E.g. when picking up an item or "reading" a "note".
- Clicking objects in the game (or wall textures) could read them out.

Basically, if I just had a function to say a string out loud, with a good voice etc., I would be happy. But it should honor accessibility settings, i.e. not be active for everyone but only those who want it.

Oh, and it should, while saying the string, turn down volume of the game, and somehow handle too many events coming in.

And I would want it cross platform and not just Android exclusive.
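
To make that concrete, here's roughly the API shape I'm imagining, as a Go sketch - every name in it is made up for illustration, not any real library:

```go
// Hypothetical TTS surface for the game; all names are invented.
package tts

// Settings mirrors the player's accessibility preferences: speech must
// stay off unless it was explicitly enabled.
type Settings struct {
	Enabled    bool    // only speak for players who opted in
	DuckFactor float64 // how far to lower game audio while speaking, e.g. 0.3
}

// Speaker is roughly the whole API surface the game would need.
type Speaker interface {
	// Speak says s out loud with a good voice. Implementations must
	// make it a no-op when Settings.Enabled is false, and duck the
	// game audio by DuckFactor while speech is playing.
	Speak(s string) error
}
```

The point is how small the surface is: one call, gated by the player's own settings.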

@divVerent @devinprater What would be the proper way of handling too many events? Would a priority queue type thing make sense, where low-priority events are discarded?

What about something that required your program's explicit cooperation (e.g. the "push event" function returned a handle, or an error value when the buffer was full, and you had to use the handles to cancel pending events to make room)?
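
Something like this, maybe - a rough Go sketch where every name is invented for illustration:

```go
// Sketch of an explicit-cooperation speech queue; nothing here is a
// real library, all names are invented.
package speech

import "errors"

// ErrQueueFull tells the caller to cancel something to make room.
var ErrQueueFull = errors.New("speech: queue full")

// Handle identifies a queued utterance so the caller can cancel it.
type Handle uint64

type utterance struct {
	h    Handle
	text string
}

// Queue holds at most limit pending utterances; the program decides
// what to drop, the queue never discards anything on its own.
type Queue struct {
	next    Handle
	limit   int
	pending []utterance
}

func NewQueue(limit int) *Queue { return &Queue{limit: limit} }

// Push queues text and returns a handle, or ErrQueueFull when the
// buffer is full and the caller must Cancel something first.
func (q *Queue) Push(text string) (Handle, error) {
	if len(q.pending) >= q.limit {
		return 0, ErrQueueFull
	}
	q.next++
	q.pending = append(q.pending, utterance{h: q.next, text: text})
	return q.next, nil
}

// Cancel removes a pending utterance by handle and reports whether it
// was still queued.
func (q *Queue) Cancel(h Handle) bool {
	for i, u := range q.pending {
		if u.h == h {
			q.pending = append(q.pending[:i], q.pending[i+1:]...)
			return true
		}
	}
	return false
}
```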

What do you think of SAPI for this purpose? docs.microsoft.com/en-us/previ

@wizzwizz4@fosstodon.org @devinprater@devin.masto.host My game is cross-platform and written in Go, so SAPI, which is Microsoft-specific, is not an option.

@wizzwizz4@fosstodon.org @devinprater@devin.masto.host Having said that, a priority queue is nice but not really needed - if I can just issue utterances and monitor whether they're still ongoing, I can build my own queue around that.

I'd generally have two kinds of events: wall writing, and notes. I want a note to always be played after whatever is currently playing finishes, while wall writings are OK to skip if something else is ongoing. In other words, I only need a current and a next utterance, and for the next one, notes have priority over wall writing.

As for what triggers them: notes would be triggered by walking on them in game; wall writing would be triggered by walking over it too (with the above queue-ish behavior) or by touching it with a finger (which should cancel ongoing utterances and force it right away). The game itself doesn't use touch input except for a game controller overlay, so touching on-screen text seems useful to have it read out.

This does mean I'd need:

- Issue an utterance
- Get notified when an utterance is done (polling is OK)
- Cancel an utterance (if possible without a clicking noise, but at least let the word or syllable finish first)
- Two voices for English for the MVP; i18n support would need the same later
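
With just those primitives, the queue behavior I described is only a few lines. A sketch in Go, assuming a hypothetical Engine interface that stands in for whatever TTS backend ends up existing (all names made up):

```go
// Sketch of the current+next utterance queue; the Engine interface is
// a stand-in for a real TTS backend, all names are invented.
package game

type Engine interface {
	Speak(text string) // start an utterance
	Speaking() bool    // poll: is an utterance still ongoing?
	Cancel()           // stop the current utterance
}

type Kind int

const (
	WallWriting Kind = iota // OK to skip if something else is ongoing
	Note                    // always plays after the current utterance
)

type Narrator struct {
	tts      Engine
	next     string
	nextKind Kind
	hasNext  bool
}

// Trigger is called when the player walks over a note or wall writing.
func (n *Narrator) Trigger(kind Kind, text string) {
	if !n.tts.Speaking() {
		n.tts.Speak(text)
		return
	}
	// Single "next" slot: a note may replace queued wall writing,
	// never the other way around.
	if !n.hasNext || kind >= n.nextKind {
		n.next, n.nextKind, n.hasNext = text, kind, true
	}
}

// Touch force-reads on-screen text right away: cancel whatever is
// playing and speak immediately; a queued note still plays afterwards.
func (n *Narrator) Touch(text string) {
	n.tts.Cancel()
	n.tts.Speak(text)
}

// Update runs once per frame and starts the queued utterance as soon
// as the current one finishes.
func (n *Narrator) Update() {
	if n.hasNext && !n.tts.Speaking() {
		n.tts.Speak(n.next)
		n.hasNext = false
	}
}
```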

@wizzwizz4@fosstodon.org @devinprater@devin.masto.host SAPI also doesn't integrate very well, as I don't own the event loop (the Ebitengine library does), so I can't handle the WM_USER messages.

It might be usable by polling, though.

@devinprater I asked them directly many times, in many places. It's because the number of (dollars paid by) blind users is not material enough to change their priorities. I.e. why would they diminish the immediate comfort of sighted people so that blind people are able to participate? They don't care, because blind people can't force them with buying power.

@devinprater As a hobbyist FOSS Android dev, I personally ran into the problem that the documentation on how Android Talkback works is very minimal and unclear.

When the Blind Android Users community over on blindandroidusers.com/ made a video showcasing Raise To Answer, I was shocked at how difficult it seemed to use. I joined their Telegram, and they helped me understand how to improve and fix things. I've learned a lot since.

So, I think many developers sadly still need education on how to do this.

@devinprater That's not to say it's your responsibility to teach people; it is not. And companies the size of Google have no excuse at all.

I'll try to keep explaining the things I've learned to others, but if you have any specific resources you find really good, I would love to know. As a sighted person, it would be ridiculous to think that I would understand the best design patterns better than a blind person.

I will just keep doing my best for my apps, that's all I can do :)

@SylvieLorxu That's all I would *want* you to do, just consider us. I'm always looking for good accessibility material to share. But you're right, Google has no excuse for bad dev documentation for accessibility.
