How much do you adjust parameters in UI?

Zynthian’s core user interface includes the ability to adjust engine parameters using the four rotary encoders. Each engine’s parameters are displayed in pages of four, one per encoder. If there are cascaded effects, their parameter pages are appended to the same list. The order of the parameters is dictated by the upstream project and is not necessarily consistent or logical within the context of Zynthian. There has been much discussion on how best to order these parameters, but may I ask some more fundamental questions?

  • How often do you actually use these pages to adjust parameters?
  • Do you adjust many parameters or just a few common ones, e.g. volume?
  • How irksome or easy do you find this part of the interface?

I am not currently looking for ideas on how to improve the UI. I want to understand how the community is coping with the current design. (I personally find it too difficult to find and adjust parameters in a meaningful or useful way.) A better understanding of how people are using the UI, and why, may inform future changes.

I am particularly interested in how heavy users and system architects (@jofemodo - that is you!) get on, as they must have some strong opinions, but please, everyone, let me know your opinion.

1 Like

If I am changing a parameter more than once, it is mapped to a controller.

I use TAL NoiseMaker more than most other synths or plugins, which has waaaaaay too many pages to make it fun to use the default interface. As a result, I haven’t really explored its full potential, as I find flipping through the pages to find the correct parameter EXCEEDINGLY frustrating when I’ve got an idea I’m trying to get out.

Irrespective of the synth or filter I’m using, I’ve almost always got filter cutoff, resonance, and volume mapped to the same encoders on my controller, with other parameters depending on the synth.

I hate mapping MIDI controllers too tho, and would love to see the ability to auto-map synths and plugins to predetermined values - even if the Zynth simply remembered the values from the last time an instance of that plugin was used.
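That “remember the last values” idea could be sketched quite simply - assuming a hypothetical per-plugin store (the file name and structure here are made up for illustration, not anything Zynthian currently has):

```python
import json
from pathlib import Path

# Hypothetical store: remembers the CC-to-parameter map used the last
# time each plugin was loaded, so a new instance starts pre-mapped.
MAP_FILE = Path("plugin_midi_maps.json")

def save_mapping(plugin_name, mapping):
    """Persist a {cc_number: parameter_name} map for one plugin."""
    maps = json.loads(MAP_FILE.read_text()) if MAP_FILE.exists() else {}
    maps[plugin_name] = mapping
    MAP_FILE.write_text(json.dumps(maps, indent=2))

def load_mapping(plugin_name):
    """Return the last mapping used for this plugin, or an empty dict."""
    if not MAP_FILE.exists():
        return {}
    return json.loads(MAP_FILE.read_text()).get(plugin_name, {})
```

So loading a second instance of, say, TAL NoiseMaker could call `load_mapping("TAL NoiseMaker")` and restore the previous knob assignments instead of starting from scratch.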

1 Like

There is a certain number of parameters I am willing to deal with, and fluidsynth gets it about right (3 pages). I ALWAYS want overall engine volume in the same place, because that is the control that gets a very high degree of attention. Hunting down which of the five or six ADSR sets in some of the more involved engines will soften the attack or lengthen a sustain is not something I tend to go for (although in truth the amount of actual sound making has been in second place behind implementation recently), and once you have adjusted a parameter away from default it’s difficult to know if you are dealing with your subtly edited masterwork or the bog standard default setting… which is worthy of note.

Given the engines can in some cases be a little bit zippy as you change parameters, I don’t consider these as performance controls in the full-on VCS3 style, but if someone is operating that way then I’d love to hear their experiences.
The Alpha Juno 2 with its alpha dial did feel the need to pull a couple of parameters out onto the front panel (brilliance, I think, was one), and the CS80 turned the limitations into a real singular feature by making the process as visibly obvious and tactile as they could.

A favourites approach (i.e. “I’ve altered this one, therefore it’s important”) might help unify some of the problems, but it’s the old list/order problem presented another way. I’ve used coloured wedges in interfaces to differentiate different sort methods (age, name, type, number of times touched, category…). I suspect there are probably as many sort orders and parameter filter modes as there are performers, but I suspect that they wouldn’t change their overall approach once they settled on it, which would make it very personalized, and play nicely into the rule of least surprise once a user had put enough time into it to develop an approach.
The methodical filer of piano sounds is not the same beast as the whanger of filters, but both could probably be accommodated because the overall approach in Zynthian is pretty generic.

1 Like

This.

1 Like

I’ve some ideas about implementing a kind of “favorites” mechanism for control parameters, so you can easily mark the parameters that are useful for you and get a simplified list of them.
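A minimal sketch of what such a favorites mechanism might look like (the class and the parameter names are illustrative, not Zynthian code):

```python
class ParamFavorites:
    """Sketch of a 'favorites' filter over an engine's parameter list."""

    def __init__(self, all_params):
        self.all_params = list(all_params)  # full list, in engine page order
        self.favorites = set()

    def toggle(self, name):
        """Mark or unmark a known parameter as a favorite."""
        if name in self.favorites:
            self.favorites.discard(name)
        elif name in self.all_params:
            self.favorites.add(name)

    def simplified_list(self):
        """Favorites only, preserving the engine's original order."""
        return [p for p in self.all_params if p in self.favorites]
```

The UI could then show `simplified_list()` by default and fall back to the full paged list on demand.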

Also, having the possibility of easily sharing favorites, snapshots, sequencer patterns, etc. could be really interesting, but first we should implement some kind of registration mechanism, perhaps based on the forum’s identity. There is a Python library that allows connecting to the Discourse API…
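For reference, the Discourse REST API authenticates with `Api-Key` and `Api-Username` HTTP headers, so even without a dedicated library a connection can be sketched with just the standard library (the endpoint and credentials below are placeholders):

```python
import urllib.request

def discourse_request(host, path, api_key, api_username):
    """Build (but do not send) an authenticated Discourse API request.

    Discourse's REST API authenticates via the Api-Key and Api-Username
    headers; most resources are reachable by appending .json to a path.
    """
    req = urllib.request.Request(f"https://{host}{path}")
    req.add_header("Api-Key", api_key)
    req.add_header("Api-Username", api_username)
    return req

# Usage (placeholder credentials):
#   urllib.request.urlopen(discourse_request(
#       "discourse.zynthian.org", "/session/current.json", key, user))
```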

Regards,

6 Likes

Hi All! I use my Zynthian at rehearsals so I don’t have to carry a lot of gear. My setups are Linuxsampler samples of my piano, ep, and organs. I sometimes use OBx-d for Pads, Brass and Lead. My most common controller adjustments are:

Volume
Octave
Split
LFO Speed
Filter cutoff
Filter Envelope Amount
Attack
Release
Chorus or Leslie Speed
Delay Time
Delay Feedback
Reverb level

I adjust volume, octave, split and effects using the Zynthian.

OBx-d is also mapped using the Zynthian.

I use the screen to make adjustments, then I map one or two parameters, e.g. filter cutoff or Leslie speed, to a knob on a keyboard.

(An OBx-d controller map file would be great here…)

I code the other controls directly into Linuxsampler sfz files.

The USB port of my nanoKontrol broke, so I had to buy a new knob controller. Needless to say, I use OBx-d a bit less now…

I am patiently waiting for Zynthian kits to re-stock, as the touch screen and an uncased Pi 4 may be too fragile to toss into my carry bag.

I have been very happy with the Linuxsampler implementation in the Zynthian!

Sam in NJ USA

1 Like

Which Sequential module is that? :thinking:

Hi @samriccijr!

FYI, zynthian kits are available again on the shop :wink:

Regards,

1 Like

P6.

I have not yet mapped the knobs to OBx-d. The knobs transmit NRPN and CC. I think the LFO knobs send only NRPN. I might have to map the sub osc and distortion knobs to LFO parameters on the OBx-d.
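As background on the NRPN side: unlike a plain CC, an NRPN parameter change is a short run of CC messages (CC 99/98 select the 14-bit parameter number, CC 6/38 carry the 14-bit value), which is why some hosts can’t map it as easily. A minimal encoder sketch:

```python
def nrpn_messages(channel, param, value):
    """Encode an NRPN parameter change as four 3-byte CC messages.

    channel is 0-15; param and value are 14-bit (0-16383).
    CC 99/98 select the parameter number, CC 6/38 carry the value.
    """
    status = 0xB0 | (channel & 0x0F)          # Control Change status byte
    return [
        (status, 99, (param >> 7) & 0x7F),    # NRPN MSB
        (status, 98, param & 0x7F),           # NRPN LSB
        (status, 6,  (value >> 7) & 0x7F),    # Data Entry MSB
        (status, 38, value & 0x7F),           # Data Entry LSB
    ]
```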

This leads to two trouble spots for me on the Zynthian:

1: Layers in Stage mode. I have trouble cloning Midi 2 OB Cloud Strings to Midi 1 LS piano. It is easier to layer thru a P6 pad to a Zynthian piano.

The other workaround is to add a pad layer to a piano sfz in Linuxsampler. This is relatively easy.

2: Splits in Stage mode. If I want a left hand piano up 2 octaves to Middle C and a right hand OB Lead down one octave on the top keys, I never get the clone to work.

One workaround is to use a left hand piano sfz up to C4 and use the P6 on the top keys.

The other workaround is to merge a LS file with left hand piano to a LS file with right hand lead. This requires a lot of editing.

I use Stage mode because most keyboards send program changes; I can load LS piano on Midi 1, LS Dx on Midi 2, LS Organ on Midi 3; LS preloads them and they play nicely; and it is easy to assign program changes that just work.

Maybe I am missing something with the splits. I hope this helps!

Sam

I would like to avoid talking about solutions yet and garner user experience. There are improvements to the current UI that may be implemented, but it will also be useful to hear how users are actually using the devices and why. A blinkered horse tends to run in the same direction, missing opportunities for a better path!

Keep the feedback coming. This is useful.

2 Likes

distant sound of a raven flying into a tree and swearing profusely . . .

P6, that surely looks (and must sound even more) awesome!
But let’s not get off-topic :smile:

For me the current workflow is:

If I use a synth a lot, I map most if not all of its parameters to an external MIDI controller, which becomes painful and almost always means that I’m only using the Zynthian with one synth at a time, since I store said mappings as snapshots. I’ll not go into further detail, as I do think my approaches to this topic from different angles are well known :wink: :wink:

I wouldn’t worry, I can speak with some authority about banging on about something and it producing results.

A lot of this stuff is careful definition of objects in an environment, and the particular rendering that makes it to the eyeball or ear or haptic pad is merely that.

It’s one moment in time you are observing, and from there we move on to actions. We recognise an entity we wish to alter, and perform actions to move that to the next event in time, and so on.
It’s a history and it needs storing in a repository as any good coder kno’s . . .

If we don’t get the object model of our moment in time correct and coherent then it won’t interact with other object concepts, like engines or layers, and our model becomes indistinct.

So suggest away. It’s how it seems to work. I constantly restate positions to refine them for myself, and it also, one hopes, improves one’s style. Reacting to someone’s theme, be it repeated or spontaneous, is how ideas form in these sorts of processes, so every view is valid and reviewed by the audience!

Bit like musical performance really!

We need performance interface disaster stories, and if they have . . .

Course the first person to present a :face_with_monocle: of an interface-generated disaster will earn a place in the records . . .

My first,

Lack of a stop-all on a MIDI-controlled lighting rig for a gazebo led to the ambient environment taking on the nature of a rave . . .

Ok, if I get this right, you’re asking for use cases?

I’m using a Novation LaunchpadX to control parameters, both in MOD-UI and the Zynthian user interface. It even works with the same CC simultaneously.

The LPX has a custom mode where you can define buttons and sliders. The cool thing about the sliders is that they are touch sensitive. When you hit the target value hard, the value is reached quickly; if you press softly, it gradually climbs to that value - perfect for transitions.
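That behaviour can be modelled as a velocity-dependent ramp. Novation hasn’t published the actual firmware algorithm, so the step rule below is purely illustrative:

```python
def slider_steps(current, target, velocity, max_step=16):
    """Simulate a touch-sensitive fader ramping toward its target.

    A hard touch (velocity near 127) jumps to the target almost at
    once; a soft touch approaches it one small step at a time.
    Returns the list of intermediate values that would be sent out.
    """
    step = max(1, int(max_step * velocity / 127))  # harder touch -> bigger steps
    values = []
    while current != target:
        delta = target - current
        current += max(-step, min(step, delta))  # clamp each move to +/- step
        values.append(current)
    return values
```

A hard touch reaches the target in one or two sends, while a soft touch emits a long smooth run of values - exactly what you want for transitions.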

There is an app from Novation which spits out sysex files for the custom modes.

Pretty much. What characteristics of our GUI and overall interface will be most useful in live situations?

What characteristics of devices are both useful and sub optimal in a live situation?

This appears to be the tighter set of use cases than that provided by configuration, setup & modification in a studio environment.

Obviously both positive and negative experiences are of interest. . .

For instance here is a performance that did not go well . . .

The former description is pretty much a best use case for me. The controller is lit, so even in complete darkness I can see exactly where my parameters are. Kinda foolproof (and I need that one, believe you me).

I’ve been using the infamous V-Machine before the Zynthian. There I used the Android app Midi-Commander, which you can customize a fair bit.

And that seems to me the most important aspect of a control surface (be it physical or screenical): customization. For that reason I really never touch the Zynthian for anything other than the occasional preset change.

EDIT
Oh, forgot the equally important thing: visual feedback!

I’d like to use the UI as much as possible.

I generally find a limit because my favorite parameters are sometimes far away in the list, with not-so-obvious names and classification. I sometimes regret that there is no equivalent to “push select” to scroll up parameters (when I mistakenly scrolled down one step more than expected). “Bold select” allows using the rotary encoder to scroll up and down, but it requires 2 actions…

Of course, 80% of the time I just need to manage volume, mute, pan,… for several layers. Having these basic parameters centralized somewhere (like a mixer) could be a great help.

3 Likes

Hello !
Really glad you are considering any idea that could come up from any user!
When I first discovered Zynthian I thought it would be the perfect expander, so I started to use it without encoders as a jam/groovebox, hooked to a BeatStep Pro:
1 drum rack and 2 synths, with 16 encoders mapped to the most often used controls - cutoff, res, LFO amount/rate, delay freq shelf, volume and PAN (I generally use the PAN knob as a send/return loop: the left audio channel as dry sound, feeding the right audio channel thru external FX… kind of 2 mono groups).
I faced some troubles with stuck MIDI notes and glitches, so I forgot my Zynthian for a while and checked out LMMS with an Open Stage Control MIDI layer to get easy access to the most used controls (but eventually dropped this idea because the mapping isn’t properly stored and has to be re-mapped each time the software is launched, and the tactile screen was definitely too small).
Came back to Zynthian a year later to make an update “to see”…
WHAT??? This thing now has an embedded sequencer… I salvaged 4 encoders and a phone powerbank and it is now my main drafting machine (when cooking, chilling in the garden, taking a dump, having a break at work…).
I get frustrated sometimes when I have to switch from cutoff (page 2) to LFO settings (page 8) to reverb to delay (which is reeeeaaaally far down below!), and I sometimes get confused since I don’t know if I’m controlling the right layer.
I got to a point where I have the bass layer on the cutoff page, the lead on delay, pads on reverb and chords on LFO.
I’m really thinking of adding a few encoders and some buttons to have easy access to the same parameters depending on what layer is selected and a general page with macros and layers volume.
I definitely think a few buttons to provide an easy SHIFT / ALT function would be a good start: instant access to the sequencer, or a user OpenStageControl menu/pads auto-mapping to the encoders… I find hotkeys and shortcuts more convenient than menu diving. E.g. the bold press on the back button to go to the main menu, then selecting the sequencer and then the corresponding pattern works, but takes quite a long time, and I never use the bold select option - I prefer banging furiously on that switch until I am at page 23… or was it 19? The touch screen is still my shepherd when I get lost!
Anyway, you guys ROCK!

2 Likes