This has been the case for a long while but we should try to fix it. Maybe someone wants to raise a ticket?
This is a different approach. The ideal way to do this with an LV2 plugin is to create a graphics widget derived from zynthian_widget_base and use any output available from the LV2. Unfortunately, the plugin does not seem to issue any monitoring output. You could make a feature request upstream for them to add this output. One or more monitoring outputs that issue info about the chord detection would allow a widget to be built and displayed within the zynthian UI. Whilst there, you could ask why this is presented as a synth with audio inputs and outputs. What are they used for?
Hi @riban , I could log/request these things upstream but feel I would be filtering the message and adding noise. Would you mind contacting the project yourself?
The plugin takes MIDI input and converts it to text. As far as I can see it does nothing with audio, so it makes no sense for it to be in an audio chain or presented as a synth instrument. They changed it to the synth type recently in a bid to make it work with some DAWs, but that seems rather odd. If it had MIDI output for pass-through it would make more sense, as it would become a MIDI processor (which is what it is), but it has unused audio in and out and no MIDI output, so it can’t be inserted in a MIDI chain. I have not reported this upstream.
I mean, if it didn’t have audio in/out it would maybe silence a track when used in series, for people who don’t feel like creating a MIDI chord-detection send bus.
It’s actually not that important. I was just thinking of your typical desktop DAW workflow: insert a nice Kontakt piano first, which also passes the received MIDI data on, then the chord detector plugin, which receives MIDI and passes the audio through so that the track doesn’t get silenced.
I don’t know how VST works or whether it supports MIDI-only plugins. LV2 does, and we have lots of them, which we call “MIDI processors”. This plugin does nothing with audio and depends solely on MIDI input. It is plausible that to make it work in a VST host it needs audio in and out… I don’t know. It would seem a shame if VST has such a limitation.
I talk here of VST because it is common to build plugins for multiple formats, often using JUCE as the toolkit, and they are often primarily targeted at VST/AU. LV2 is often an afterthought.
Absolutely, this would be great! I’d also like to have a toggle action for the overlay, so that you could easily display/hide it with a button or key press. I can imagine numerous use cases for such functionality.
Can you please give a simple example of this to get us started? Or a hint to an already working one? I would very much like to learn how to make simple UI displays. More than once I have had need for such a thing.
Look at /zynthian/zynthian-ui/zyngui/zynthian_widget_tunaone.py, which is the GUI widget for the tuner plugin… but be aware that nothing will work until upstream has implemented monitors.
You have a bunch to learn from. The simpler examples, good to start with:
zynthian_widget_aidax.py
zynthian_widget_GxGraphicEQ.py
zynthian_widget_nam.py => PERFECT AS A “HELLO WORLD” TEMPLATE!
zynthian_widget_tunaone.py
zynthian_widget_tempo.py
zynthian_widget_spectr30.py
And some more complex examples, to go deeper:
zynthian_widget_inet_radio.py
zynthian_widget_organelle.py
zynthian_widget_sooperlooper.py
zynthian_widget_audioplayer.py
The widgets above are for specific plugins/engines. Some others are bound to specific sets of LV2 parameters, like:
zynthian_widget_audio_file.py
zynthian_widget_envelope.py
zynthian_widget_filter.py
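To get a first feel for the pattern those files share, here is a hedged, standalone sketch. The zynthian base class and its interface (a monitors dict filled by the framework, a periodic refresh_gui() call) are *assumed* and stubbed so the example runs outside zynthian; a real widget would subclass zynthian_widget_base and draw on a tkinter canvas instead of building a string.

```python
# Hypothetical "hello world" widget sketch, modelled loosely on the
# pattern of files like zynthian_widget_nam.py. The base class below is
# a stand-in for zynthian_widget_base (its real interface may differ).

class StubWidgetBase:
    """Stand-in for zynthian_widget_base with an assumed interface."""
    def __init__(self):
        self.monitors = {}      # engine monitor values, filled by the framework
        self.display_text = ""  # stand-in for a tkinter canvas text item

class ChordWidget(StubWidgetBase):
    """Shows the last chord reported by a (hypothetical) 'chord' monitor."""
    def refresh_gui(self):
        # In a real widget this would redraw a tkinter canvas; here we
        # just build the string that would be drawn.
        chord = self.monitors.get("chord", "-")
        self.display_text = f"Chord: {chord}"

w = ChordWidget()
w.monitors["chord"] = "Cmaj7"   # the framework would update this from the engine
w.refresh_gui()
print(w.display_text)           # Chord: Cmaj7
```

The whole job of a display-only widget boils down to that refresh step: read monitor values, redraw.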
If you want a suggestion, it would be nice to have a parametric EQ widget. The x42-eq is the best candidate I know, and it has the needed monitors in place. Indeed, you can take the plot maths from the native GUI source code. I checked some time ago and everything is there…
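As a sketch of the kind of plot maths involved, here is one EQ band's magnitude response computed from the standard RBJ “Audio EQ Cookbook” peaking-EQ formulas. This is an assumption for illustration, not taken from the x42-eq sources, which may differ in detail:

```python
# Magnitude response of one peaking-EQ band, using the standard RBJ
# cookbook biquad coefficients (an assumption; x42-eq may differ).
import cmath
import math

def peaking_eq_gain_db(f, f0, fs, q, gain_db):
    """Gain in dB at frequency f for a peaking EQ centred at f0."""
    a = 10 ** (gain_db / 40.0)
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    # Biquad coefficients (RBJ cookbook, peaking EQ)
    b = (1 + alpha * a, -2 * math.cos(w0), 1 - alpha * a)
    aa = (1 + alpha / a, -2 * math.cos(w0), 1 - alpha / a)
    # Evaluate H(z) on the unit circle at the query frequency
    z = cmath.exp(-1j * 2 * math.pi * f / fs)
    h = (b[0] + b[1] * z + b[2] * z * z) / (aa[0] + aa[1] * z + aa[2] * z * z)
    return 20 * math.log10(abs(h))

# At the centre frequency the band contributes exactly its set gain:
print(round(peaking_eq_gain_db(1000, 1000, 48000, 1.0, 6.0), 3))  # 6.0
```

A widget would evaluate this per band across the displayed frequency range, sum the dB contributions, and draw the resulting curve.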
I think I can figure out how the widget works, but how do I load and/or call it?
Can I create a widget from a ctrldev driver? There I would welcome some user interaction.
I guess trying to build an LV2 plugin would be much harder than the widget itself.
You see, I’m a complete noob when it comes to UI in zynthian. Drivers and widgets are fine, but I think I need a hello-world example for the thing that calls the widget…
PS. The plan I had was to make something like @cfausto suggests: a simple thing that intercepts the MIDI flow (a ctrldev?) and displays the chords that pass. But I’m beginning to suspect that that is a bit naive…
Perhaps I must first find a schematic design of how the software architecture of Zynthian works…
Widgets are UI elements. The work is always done by the engines.
Ctrldev drivers and widgets are unrelated concepts. Don’t be confused!
CTRLDEV DRIVERS
A “normal” ctrldev driver is an adapter that allows you to use a MIDI device for controlling zynthian. It effectively turns the MIDI device into a physical zynthian UI. This UI can be more or less complete depending on the MIDI device’s capabilities and the driver’s implementation.
The typical flow in a ctrldev driver is:
MIDI input from device => CUIA and other API calls => (optional) MIDI feedback to device
and optionally:
Zynthian API signals => MIDI feedback to device
Ctrldev drivers are bound to a specific MIDI device and normally don’t know about engines. Very rarely a driver is bound to specific engines (see the MOTOR driver for an example of this).
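The flow described above can be sketched roughly as follows. The class name, the midi_event signature, and the CC-to-CUIA mapping are all assumptions made up for illustration; they stub the real zynthian driver API so the example runs standalone.

```python
# Hypothetical ctrldev driver sketch: MIDI input from device => CUIA call.
# Names and signatures are assumptions, not the real zynthian API.

class ToyCtrldevDriver:
    """Maps a couple of CC messages from a MIDI device to CUIA calls."""

    # Assumed mapping for illustration: CC 20 -> ARROW_UP, CC 21 -> ARROW_DOWN
    CC_TO_CUIA = {20: "ARROW_UP", 21: "ARROW_DOWN"}

    def __init__(self, cuia_callback):
        self.cuia = cuia_callback  # stand-in for zynthian's CUIA dispatch

    def midi_event(self, status, data1, data2):
        # Control Change on any channel, with a non-zero value (button press)
        if status & 0xF0 == 0xB0 and data2 > 0:
            cuia = self.CC_TO_CUIA.get(data1)
            if cuia:
                self.cuia(cuia)
                # (optional) MIDI feedback to the device would go here,
                # e.g. sending a CC back to light an LED on the controller.

calls = []
drv = ToyCtrldevDriver(calls.append)
drv.midi_event(0xB0, 20, 127)  # CC 20 pressed on channel 1
print(calls)                   # ['ARROW_UP']
```

The optional reverse path (zynthian API signals => MIDI feedback) would be a second method that the framework calls when UI state changes, sending MIDI back out to the device.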
UI WIDGETS
A UI widget is a graphic panel embedded in the UI’s control view that:
Displays info from the engine’s monitors (the engine’s feedback) and parameters (the engine’s controllers)
Optionally, it can capture user actions in the panel and call core API functions and, rarely, CUIA.
Widgets are designed to interact with a specific engine/plugin or a subset of engine parameters.
As you can see, there is little relation between them, as they work at very different levels in the software stack.
I was thinking a chord transformer would be useful as well: capable of replicating one input to many channels, with small chord variations in each instance, to drive a complex melody from a single input. It would be super cool if the UI could act separately for each instance, while maintaining an input window independent from the chord analyser/transformer process, and adding a unique output ID that could be read and maybe even merged to form a MIDI chain of MIDI transformers/splitters/mergers, with round-robin and voice-limiting features. I think the poly MIDI 2.0 protocol is what Reason uses for this, but it’s not under GPL 3.0…
Anyway, applause for the effort. It’s a wonderful concept in itself.
I will look at it. I think there is a misunderstanding but there is also some useful stuff in the ticket. Leave it with me and I will get back to you after I have time to investigate.
[EDIT]
I have updated the ticket on dusk-audio to explain the conflation of 2 separate issues. The original issue of the “Bypass” parameter not working remains outstanding but was not properly described in the ticket, hence the ticket was closed.
I have added a ticket to the zynthian issue tracker to change the behaviour of jalv so it correctly detects the “enabled” property.
This hangs at the Adding Processor step with the following detailed UI log. I am not sure whether Zynthian changes are needed to support this version first, or whether there is an issue with the change.
Note: the same test gives the same result on a V4 kit and on a custom box.