Totally blind user looking for collaborators on a project to make Zynthian accessible to totally blind users

Hi everyone, hope you’re doing well and staying safe :slight_smile: My name is Trey and I’m totally blind, from the UK. I’m an electronic musician, currently using a combination of hardware synthesisers, a hardware step sequencer and macOS to make electronic music in my studio. For a while now I’ve really been interested in getting a hardware sampler that I can use to play live, but all of the currently available options I’ve tried haven’t suited my needs. I came across Zynthian after posting on Reddit; link here:
https://www.reddit.com/r/synthdiy/comments/1btjtqb/totally_blind_enquiring_about_starting_a_project/

I really think that Zynthian could be a great option for me and other totally blind users who are looking for a good groove box and sampler option.
I have no coding experience myself, and I am unable to do any DIY electronics due to being totally blind and having the use of only one hand because of cerebral palsy. However, I am looking for developers from this community to collaborate with me on a project to make Zynthian accessible to totally blind users via a text-to-speech interface. Some of you may be wondering about haptic feedback via motors; I believe this would be expensive, and if the motors break it could lead to other problems. Some of you may also be wondering about the use of braille in an accessible interface; braille would be difficult to implement, and not all blind musicians are fluent enough in braille to take advantage of such an interface. With all this being said, I’m looking for collaborators to help make Zynthian accessible to myself and other blind users.
If anybody is interested in collaborating, please drop me a message or comment on this thread. Thank you very much for taking the time to read this post, everyone, and I look forward to collaborating with you all :slight_smile:


Hi @soundwarrior20! A very warm welcome to our wonderful community. You raise a subject that we have been discussing for some time but have not yet implemented all the elements required to make Zynthian as accessible as we would like.

Our high-level plan is to provide a separate audio output that carries information about the user interface. This would be mixed with the main audio output for a user monitor feed, separate from the main outputs used for front of stage. It would give audible information to the user about the current state, including spoken word and tones.
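Purely as an illustration (not our implementation, and assuming espeak-ng and aplay are available on the device), a speech layer for that monitor feed could be as simple as synthesising each announcement and playing it on a dedicated output:

```python
# Minimal sketch of spoken UI feedback on a dedicated monitor output.
# "monitor_out" is a placeholder device name, not an existing Zynthian port.
import subprocess

MONITOR_DEVICE = "monitor_out"  # hypothetical ALSA device feeding the user's monitor mix

def announce(text: str) -> None:
    """Synthesise `text` with espeak-ng and play it on the monitor device only."""
    speech = subprocess.run(
        ["espeak-ng", "--stdout", text],
        check=True, capture_output=True
    )
    subprocess.run(
        ["aplay", "-q", "-D", MONITOR_DEVICE],
        input=speech.stdout, check=True
    )

announce("Mixer: channel 1 level 75 percent")
```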

User input can be via various methods, including the rotary encoders and switches, and a computer keyboard. There could be a set of commands that a user could type to drive the Zynthian.
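To make that concrete, here is a rough, hypothetical sketch of such a command layer. The command names and functions are invented for the example and are not an existing Zynthian API:

```python
# Hypothetical typed-command layer: every command gives spoken feedback.
# None of these names exist in Zynthian; they only illustrate the idea.
from typing import Callable, Dict

def speak(text: str) -> None:
    # Stand-in for audible feedback (e.g. the espeak-ng sketch above).
    print(f"[speech] {text}")

COMMANDS: Dict[str, Callable[[], None]] = {
    "status":     lambda: speak("Engine running, chain one active"),
    "next-chain": lambda: speak("Selected chain two"),
    "volume-up":  lambda: speak("Master volume 80 percent"),
}

def handle(line: str) -> None:
    """Dispatch one typed command, with spoken feedback for unknown input."""
    cmd = line.strip().lower()
    action = COMMANDS.get(cmd)
    if action:
        action()
    else:
        speak(f"Unknown command: {cmd}")

if __name__ == "__main__":
    for line in iter(input, "quit"):  # type "quit" to finish
        handle(line)
```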

I am keen to avoid interpreting the graphical user interface, e.g. with screen readers. We have the opportunity to implement a properly focused interface for blind and partially sighted users rather than try to emulate a user interface that is built for fully sighted users.

I also want to provide as much accessibility as possible, not just for visually impaired users but also physically impaired users, etc.

FYI I am a member of the MIDI Association Special Interest Group for Accessibility.


Hi, this is interesting! How far along are you with this process? What help do you need?

We did some proof-of-concept testing with speech generation and audio mixing. We paused because the ability to drive the Zynthian core directly required a more advanced API and separation of the GUI from core functionality. Much of that was done during the development of Oram, the next version of Zynthian, but much remains to be completed. The API and separation of UI from core was not the primary aim during that development, but knowing it was required we did as much as was sensible during the refactor.
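As a rough illustration of the kind of separation we are aiming for (all names here are invented for the example and are not the real Zynthian API): the core publishes its state changes as events, and any front end, graphical or spoken, can subscribe to them without the core needing to know which front ends exist.

```python
# Illustration only: event-driven separation of core and front ends.
# Class and method names are invented; this is not Zynthian code.
from typing import Callable, List

class Core:
    """Minimal core that publishes state-change events."""
    def __init__(self) -> None:
        self._subscribers: List[Callable[[str], None]] = []

    def subscribe(self, callback: Callable[[str], None]) -> None:
        self._subscribers.append(callback)

    def _notify(self, event: str) -> None:
        for callback in self._subscribers:
            callback(event)

    def select_engine(self, name: str) -> None:
        # ... the real work would happen here ...
        self._notify(f"Engine {name} selected")

class SpokenFrontEnd:
    """Front end that turns core events into speech (print as a stand-in)."""
    def attach(self, core: Core) -> None:
        core.subscribe(self.on_event)

    def on_event(self, event: str) -> None:
        print(f"[speech] {event}")  # a TTS engine would be called here

core = Core()
SpokenFrontEnd().attach(core)
core.select_engine("Pianoteq")
```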

I think that improved accessibility can be driven by workflows and user stories. The technology changes required to facilitate this can be done iteratively. If we wait until all the building blocks are in place it may never happen.

Even help with language can be invaluable. Some people may use inappropriate terms whilst others may be concerned about using inappropriate language which may inhibit their willingness or ability to help.

So, something that you can do to help is describe some workflows that you currently do or aspire to do, with any commentary on the challenges and opportunities in implementing such workflows. It can often be simple things that we overlook that cause significant issues, and a developer without a disability does not necessarily comprehend the challenges, nor the skills and abilities, that a user with a disability may have.


I don’t want to repeat myself here but have written a post in the riban modular forum about improved access.

I know this post relates to improved access for blind users, but we want to consider all access.

There is a thread on our forum where we discussed accessibility for sight impaired users. It is quite long, with a variety of people giving various ideas with varying amounts of experience: Accessibility for sight impaired users.

I wonder whether we should merge these threads. @lewisalexander2020 was very vocal in that thread. Unfortunately we didn’t have the capacity to complete the work started there, but we really should.

I’m still here; I just haven’t been involved further in this because I abandoned Linux accessibility due to two major problems, one concerning a developer, the other concerning someone conning me out of money for work supposedly being undertaken, with nothing in return. So I’ve been away from all this.

I’ve been working for Modartt since December 2022 to make Organteq 2 accessible for blind organists and, as such, I’m building a physical console from scratch, which is sponsored by a number of parties, manufacturers, etc.

It would be a huge help, but also a challenge, to make Zynthian accessible with Linux’s built-in screen reader. Orca is not as stable as some think, and it is worse on a Raspberry Pi due to key mappings, as an example; there’s a lot wrong that needs fixing.

All the best, and yes, please do merge this into a suitable thread or area of the forum and keep the work going. If I can support this somehow, OK, but as I have no test environment for this, there’s not much I can do other than advise.

lew


Hi @lewisalexander2020!

It is good to hear from you again. It sounds like you have had some bad luck and bad experiences since we last chatted, but it is great that you are working with our friends at Modartt. I love what they do and they are a lovely bunch of people.

As we discussed before, Zynthian’s core is not suitable for a screen reader interface and we plan to improve accessibility for all with embedded solutions. The module in Zynthian that does benefit from a screen reader is webconf, its web-based configuration tool. I suspect there are parts of this that don’t work well for all users, e.g. there may be parts where screen readers fail, that are inaccessible from the keyboard, or screens that have poor contrast. I believe there may be tools that can test some of this automatically and we should run webconf through such tools. Any advice from anyone with experience in this field is appreciated.
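For example (purely a sketch, assuming the Node-based pa11y checker is installed, and with a placeholder URL for a local Zynthian), an automated audit of the webconf pages could look something like this:

```python
# Sketch: run the pa11y accessibility checker against webconf and list findings.
# The URL is a placeholder; pa11y must be installed separately (npm install -g pa11y).
import json
import subprocess

WEBCONF_URL = "http://zynthian.local"  # placeholder address of a local Zynthian

def audit(url: str) -> list:
    """Return pa11y's accessibility findings for `url` as a list of dicts."""
    result = subprocess.run(
        ["pa11y", "--reporter", "json", url],
        capture_output=True, text=True
    )
    return json.loads(result.stdout or "[]")

for issue in audit(WEBCONF_URL):
    print(f"{issue.get('code')}: {issue.get('message')}")
```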

I am interested in your physical console. I would suggest that we keep the two threads separate now that we have a link to the old thread. Maybe you could update the old thread with your progress on the console. I think that fits better there, being slightly off-topic (in that it isn’t directly Zynthian related) but super interesting to us here.

Good luck with your various projects. Keep in touch and let us know if there is anything that we can help you with too.

Cheers!
