Do they take sugar…?

  • How can electronic musical instruments like Zynthian and riban modular be made more accessible to all users, including those with impaired sight, reduced mobility, etc.? This includes temporary reductions in access, e.g. a broken arm.

A good question and worth its own thread.

As with the device definitions, there are far more methods of interaction out there than we ever realised.

We are attempting to refine the actions within the GUI environment to a specific set that allows movement between our stateful bits and pieces, and we are looking at the extra grammar that sooperlooper, the zynmix and zynpad are introducing to this area.
From the inconvenienced user’s point of view, they need a clearly defined route to all offered functionality, reachable via a minimal set of events. Getting down to 4 encoders and 4 switches certainly helped refine this, and in true zynth fashion v5 has opened it all up again!

We are lucky that it’s possible to use our device in high-pressure (live, with an audience) situations, and as such it’s possibly beneficial to consider the absolute essentials whilst we still have a GUI that is responding…
Again, we have trimmed it all down pretty tightly.

Quite what gets used as the controlling element is down to the inconvenienced individual. My grandfather survived polio with a small movement in one hand and a moveable head, which he used to paint. He had the time…

I like to think he could have operated a zynth, and it’s a good starting point.

The other side is confirming that the action has been received, is being acted on and (if relevant) completed.
Once again, quite how this info gets back to the user is very contextual, but our status display certainly seems to be the basic starting point. The green heart embodies this for me, although I know it can be seen as a callous waste of GUI territory.

The idea of a zynthian being used as the audio event feedback for its own GUI has a rather surreal, meta feel that might actually unify the whole area of conscious feedback with mapped context in one chain…

Ding Dong!

We talked about various methods of improving accessibility.

For users with reduced or no sight we may add audio feedback, e.g. speech mixed into the headphone feed that describes status. We could also provide a computer-keyboard-driven CLI to allow the device to be driven from keyboard commands without reference to the GUI. We are close to being able to create a completely headless implementation (no GUI), and such a CLI with spoken feedback could be a boon for visually impaired users.
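
As a minimal sketch of the spoken-feedback idea, the snippet below pipes short status descriptions through espeak-ng, a common Linux text-to-speech engine. The `on_screen_change` callback and its event names are assumptions for illustration only; they stand in for wherever the GUI would report a status change, and are not part of the current Zynthian codebase.

```python
# Minimal sketch, assuming espeak-ng is installed (e.g. apt install espeak-ng).
import subprocess

def speak(text: str, speed: int = 160) -> None:
    # Popen rather than run(): speech must not block the GUI thread.
    subprocess.Popen(["espeak-ng", "-s", str(speed), text])

def on_screen_change(screen_name: str) -> None:
    # Hypothetical hook: announce the screen the user has just landed on.
    speak(f"Screen: {screen_name}")

on_screen_change("mixer")
```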

For users with reduced mobility, remote control devices such as handheld controllers may be advantageous. Users with more severely constrained mobility may need other technology to drive it, but that may be user-dependent, i.e. each user may have different control technology available to them. The CLI mentioned above might facilitate access, as might driving an API directly.
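
As a sketch of what such keyboard-only control could look like, here is a tiny command loop that forwards typed commands to the device as OSC messages via the python-osc library. The host, port and OSC paths below are illustrative assumptions rather than a documented Zynthian API.

```python
# Minimal sketch, assuming the python-osc package (pip install python-osc).
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("zynthian.local", 1370)  # hypothetical endpoint

COMMANDS = {
    # typed command -> hypothetical OSC path
    "up": "/cuia/arrow_up",
    "down": "/cuia/arrow_down",
    "select": "/cuia/switch_select_short",
    "back": "/cuia/switch_back_short",
}

def repl() -> None:
    # Plain stdin loop: usable from any keyboard, screen reader or
    # assistive switch device that can emit text, with no GUI needed.
    while True:
        cmd = input("zyn> ").strip().lower()
        if cmd in ("quit", "exit"):
            break
        path = COMMANDS.get(cmd)
        if path:
            client.send_message(path, [])
        else:
            print(f"unknown command: {cmd}")

if __name__ == "__main__":
    repl()
```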

Use of larger icons within the GUI may assist users with reduced vision or conditions such as dyslexia. A more visual interface can help many users, including those who want a more vivid indication of state.

Users can already change colour (which can help with colour blindness and those who struggle with low contrast) and text size.


A slightly different take on a synth interface…
Possibly more accessible.
