Effect Chain implementation

Hi!
I just upgraded my Zynthian from DAC+ to DAC+ADC (replaced the HiFiBerry, soldered the 40-pin connector, drilled additional holes in the case) and now I have a quick question regarding the implementation of the effect chain.

Is it possible to create an empty effect chain that just routes audio-in directly to audio-out? Or could an empty (dummy) effect be auto-loaded?
I find it a bit counterintuitive that I either get no audio-through (i.e. when there is no effect layer) or that I only get the processed audio routed through the FX chain.
Is it supposed to be working like that or am I missing something?

I imagine it could be more intuitive if, when starting a new effect layer, it automatically connected audio-in to audio-out. Any additional effects added by the user would then be inserted in between that connection, as sketched below.
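
To make concrete what I mean by an "empty chain": at the JACK level it would just be the physical capture ports wired straight to the playback ports. A minimal sketch of that idea, assuming the jackclient-python package and the usual `system:capture_*` / `system:playback_*` port names (the actual names depend on the soundcard):

```python
import jack

# An "empty" effect chain is just capture wired straight to playback.
client = jack.Client('thru_chain', no_start_server=True)
client.activate()

# Note: physical capture ports are JACK *outputs* (they feed audio into
# the graph), playback ports are JACK *inputs*.
capture = client.get_ports(is_physical=True, is_output=True, is_audio=True)
playback = client.get_ports(is_physical=True, is_input=True, is_audio=True)

for cap, play in zip(capture, playback):
    try:
        client.connect(cap, play)
    except jack.JackError:
        # Connection already exists or failed; ignore for this sketch.
        pass
```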

Related to this is the function of the move upchain / move downchain actions. Currently, they only appear as an option if there are three or more effects. This is unexpected; I expected to have these options already with two entries in the effect chain (where they would basically swap the two effects).
Also, replace effect works only from the second entry onwards, but not for the first effect (probably because that one serves as the anchor of the chain).

So if the first entry were automatically just the direct connection from audio-in to audio-out, it would again make sense to have move upchain / downchain only from the third entry on.

An additional benefit: if one starts with an empty effect chain that routes audio directly from in to out, it would be easier to replace effects without having to destroy the effect chain over and over again (in the case where there is only a single effect in the chain).

Please say if I’m just misunderstanding some concept here.
Or would it actually make sense to change the implementation here a little bit? :thinking:

regards

It’s kind of been a feature. Using audio-in is, in Zynthian terms, a fairly recent development: in the early days you were restricted to the Audio Injector card, which was not part of the main hardware kit trunk, plus a certain amount of alsamixer manipulation. So the mechanism was that you needed to add something to the effects chain to get it to fly, and the small group of users were pleased that it worked…

The sudden interest in effect patching and routing has, as with any successful piece of development, thrown up all kinds of issues, and perhaps we are progressing a little too fast in some areas (yes, visuals are great fun and I for one have played a lot with them). But the audio routing is something we REALLY need to get right if we are to make it intuitive and simple, which has felt like the primary drive of the GUI.

The problem is probably visualization. Just to be able to grasp the complexities of what goes where, I think we do need some form of visual representation of the system to make this all clear. Patchage certainly gets you a lot of the way, but it doesn’t really fit with our intuitive and simple approach.

1/ Take a standard Roadie,
2/ Soak said roadie in beer & substances . . .
3/ Leave for three hours pre-gig to simmer
4/ Place roadies at side of stage with dry ice machine
5/ Launch into startling intro (all pre-prepared)
6/ Move into tender ballad about the drummer’s 67th girlfriend . . .
7/ Realise something’s not quite right in the middle of an atmospheric keyboard solo with both hands
8/ Glare at roadie making “what’s wrong?” type gestures with heavy frowning . . .
9/ Expect roadie to:
a) have laptop
b) have network connection
c) have PuTTY loaded
d) have password
e) have Linux experience
f) have patchage
g) have understanding of patchage

Well you get the idea…
So to address how we organise JACK audio routing, I suggest we should first consider how we want to visualize it.
Can we squeeze it onto the GUI, given somebody will REALLY use it, way beyond our wildest dreams?
Put it on the Webconf, requiring a separate access device?
Put it onto the generally unused HDMI connector?
A completely separate OSC-driven app?

And once we have that, what should it look like? MIDI & audio separate? Visualization of mono/stereo…? Colours? Names…? Linked to the parameter menu from the touchscreen?

Even if we don’t address all of that, I think it will help us define what we do and don’t want in this area, with the same degree of attention we have turned to other areas…

As it’s nearly one o’clock and time for lunch, I am off to cut the lawn . . . :smiley:

Until we get all that implemented, you might just use Gain2x2 as your effect layer.

Yes I probably should have said that in the middle of my epic … :smiley:

I use a Zynthian with a USB soundcard as a guitar soft-amp, and it works quite nicely. I indeed throw a Gain in as the first effect, after which I can reorder the rest of the chain fairly flexibly. The mono/stereo thing is mostly worked out as well.

OK, good hint with the Gain2x2 plugin :slight_smile:
@wyleu I wasn’t aware of all the complexity. Now I appreciate even more what’s been achieved so far, and I think the FX capability is where Zynthian can really play to its strengths in the future! :grinning:

Just my 5c: the kind of patchbay matrix view that Ardour uses should have at least a prayer of a chance of both fitting on the Zynthian screen and being usable with its native controls. In webconf for sure.

The matrix view is of course not as visual as the Patchage style, but I find it far more practical to actually use; it doesn’t require an enormous screen to handle complexity, and it should be far easier to implement since it completely lacks the drawing complexities of the “cable view”.
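
To illustrate how little drawing a matrix view actually needs: here is a rough sketch (not Zynthian code, just the jackclient-python package as an assumption) that dumps the current JACK audio graph as a source-by-destination grid, which is essentially what an Ardour-style matrix would render graphically:

```python
import jack

# Dump the current JACK audio graph as a connection matrix:
# rows are source (output) ports, columns are destination (input) ports.
client = jack.Client('matrix_dump', no_start_server=True)

sources = client.get_ports(is_audio=True, is_output=True)
sinks = client.get_ports(is_audio=True, is_input=True)
sink_names = [p.name for p in sinks]

for src in sources:
    connected = {p.name for p in client.get_all_connections(src)}
    row = ''.join('X' if name in connected else '.' for name in sink_names)
    print(f'{src.name:40s} {row}')
```

Rendering that grid as touch-toggleable cells in webconf would cover most of the routing use cases without any cable drawing at all.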

We already have a way of routing with the effects layers and adding effects to layers, don’t we? We just need an effects layer that can automatically connect to audio in.

Edit: we can change the audio routing out, so perhaps we just need to add audio routing in?
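
If so, the missing piece would roughly amount to something like the following sketch: when a layer is created, connect the physical capture ports to the first effect’s input ports. The effect port names here are purely hypothetical placeholders for illustration, not actual Zynthian identifiers:

```python
import jack

client = jack.Client('autoroute_in', no_start_server=True)
client.activate()

# Hypothetical input port names of the first effect in the chain;
# real names depend on the engine/plugin instance.
FIRST_FX_INPUTS = ['fx_layer_1:in_l', 'fx_layer_1:in_r']

capture = client.get_ports(is_physical=True, is_output=True, is_audio=True)
for cap, fx_in in zip(capture, FIRST_FX_INPUTS):
    try:
        client.connect(cap, fx_in)
    except jack.JackError:
        # Connection already exists or failed; ignore for this sketch.
        pass
```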