I would prefer we design the interface to align with touch / mouse workflows rather than simply adding the four buttons, which does nothing to address the four rotary encoders. An example of workflow-based design is the mixer I am working on: it allows direct control of the gain, balance and mute of the selected channel, and change of selection, via encoders / switches, whilst allowing direct control of any channel's gain and mute with touch / mouse, with a pop-up for balance. Each interface is thus optimised for the user workflow based on the available input method. (I do something similar in the step sequencer.)

I am worried that precious screen space may be occupied by buttons that may not even be relevant to that screen. Also, there may already be a touch / mouse workflow on some screens, which would make the buttons redundant. I also worry that we may need to modify code to support the buttons, which is yet another code-maintenance overhead. Sorry for sounding negative, but it is worth sharing concerns alongside compliments.
If we do have buttons, maybe they could be combined with the top bar somehow to reduce screen usage, or only appear when relevant.