Hello to all you wonderfully enthusiastic Zynthianeers! Have you ever wanted to get involved with the project or pay back our wonderful developers by helping, but are not gifted (or cursed) with the ability to code? Maybe your talents are more artistic or literary. You are always welcome to provide samples of your audio work, but how about writing some guides for some of the engines (processors, synths, etc.) that you use and love (or loathe)?
There is a section in the wiki dedicated to descriptions and user guides for individual engines. An engine is what we call a unit that does some processing. It may be an audio processor like Dragonfly Plate Reverb or a synth like Pianoteq. It would be great to see this section grow and flourish, and maybe even for some engines to have their own sponsor: a user who is so passionate about the engine that they want to act as its ambassador, sharing the love with the community and taking a sense of ownership and care for its documentation.
I would recommend that anyone who wants to write some guides first requests access to the wiki and then creates a page for the engine so that others can see it is being worked on. (Let’s avoid duplicate effort.) Ideally the guides have a similar structure. I suggest:
Screenshot (if appropriate, e.g. it has a GUI widget - like audio player, sooperlooper, etc.)
Short description (a few sentences describing what it is)
High-level user guide - modes of operation
Detailed user guide - in-depth description of the more complex or intricate aspects
Typical workflows - examples of different ways to use the processor
You can use descriptions (or snippets) from the upstream projects if their licensing permits, and you can link to their documentation where appropriate, which should reduce the amount we have to write. Indeed, if there is good upstream documentation we should use it, because it may be more likely to be maintained.
Of course these are just suggestions. Some may not be appropriate, and some may be added later or modified by others with different experience - particularly the workflows, where we have many different use cases for the same tools. (Our classical composers may use compression quite differently to our drum and bass producers!)
To add a new page, edit the list to add a new entry, save the list, then follow the new link you just made. It should open the new, empty page. You may want to look at other guides to get an idea of the typical layout. Currently only the Audio Player has such a guide, and that is more of an aspirational proposal than a user guide (but I wanted to put it somewhere safe…).
Remember that you don’t have to code to contribute, and that if you do code then you may also create documentation (actually, you should). We are all here to make sweet, sweet music, so keep those examples of success flowing.
I was just going to spend a few minutes doing a basic Pianoteq entry because I just walked @SlyGuy through that the other day and it’s not trivial to set up, but I’m confused about logging in - usually there’s a “Register” option alongside the login options, but there does not appear to be one.
I am new and trying to get acclimated to the available info (there’s a lot!) but would love to contribute in some useful way.
I would certainly benefit from visual representations or walkthroughs of various functions and typical workflows. I am finding some of that in the wiki menu.
I was introduced to the community through Floyd Steinberg’s videos.
I am most curious, at the moment, about the web interface and the second-screen UI/engine. I haven’t seen any of that in video or image form.
Is it possible to view a VST in its native GUI by one of these methods? I’m thinking primarily of synths like Surge and possibly Dexed.
Looking forward to much elucidation, and hopefully I will evolve to be a contributor at some point.
That should get me pointed in the right direction.
Am I correct to infer that VNC is the “best” way to view a plugin’s native UI, as opposed to the X server method?
Is this a “live” GUI, in that you can actually change instruments/params, etc.?
I got the impression that the X method had terrible latency and that VNC is acceptable?
VNC is how I access Zynthian for a lot of the development I do. When VNC is enabled it presents two VNC views:
The main UI mirrored from the Zynthian screen
A desktop that shows engines’ native GUIs (if available)
VNC is accessible directly with a VNC client or via a web interface from a browser. I use the latter as it is more convenient. (There is screen distortion, e.g. tearing, in the current version, which should be resolved in the next release. It is a pain that I am willing to accept for the convenience of sitting on my sofa whilst coding!)
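For anyone wanting to try the direct-client route, a rough sketch might look like the following. (The hostname and display numbers here are assumptions based on common VNC conventions - check your webconf for the actual addresses on your unit, or just use the browser-based web interface instead.)

```shell
# Hedged sketch: hostname and display numbers are assumptions based on
# common VNC defaults, not confirmed Zynthian values - check your webconf.

# VNC1 - the main Zynthian UI, mirrored from the built-in screen
vncviewer zynthian.local:1

# VNC2 - a desktop showing the engines' native GUIs (Surge, Patchage, etc.)
vncviewer zynthian.local:2
```

Any standards-compliant VNC client (TigerVNC, RealVNC, etc.) should work; the `host:display` form is the usual way these clients address a server.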
Please be aware that VNC adds significant load on the system for various reasons, including the engines’ native GUIs using extra resources. We strongly recommend disabling VNC during normal performance use.
I love the concept of the X-windowing system but the implementation (at least on Zynthian) has poor performance so I recommend avoiding it. (I used to use it all the time but now much prefer VNC.)
I use VNC1 to interact with Zynthian’s normal UI. I mostly use computer keyboard bindings to control it.
I mostly use VNC2 to access Patchage, which gives access to the low-level routing graph. You should not be playing with this, but during development it is a useful tool. (Although the version we have is flawed and needs a restart regularly - again, I think this is fixed in the next version.) I sometimes use VNC2 to look at engine native GUIs, but mostly for testing during development. This is where you could access the full features of a synth like Surge.
Due diligence demands that I say, vehemently, that I am not a coder - my terminal experience began with text-to-speech shenanigans and ends with limited Windows command prompts, sadly. I haven’t given up hope that I can expand my utility; the desire is strong.
That being said, would it be possible to implement something like the new Yamaha Seqtrack function? I believe it uses Bluetooth MIDI (I think), Wi-Fi or both, and gives access to parameter changes, transport and input via live wireless interaction with the device over an Android or iOS app.
I am interested in BT MIDI for controllers as well, so I would be curious about the Zynthian v5’s BT capabilities in that regard too.
I haven’t found anything regarding BT in the wiki or on Discourse yet.
BLE MIDI is coming in the next stable release which is probably a few months away. I don’t know anything about Yamaha Seqtrack.
[Edit] Following @UnkleSkunky’s reference to the Yamaha Seqtrack, I watched a bit of a YouTube video and see that it is basically a hardware version of what Zynthian can mostly do already. There is a wonderful development by @hyves42 that implements step sequencing, Zynthian control and other features on an external hardware controller, and it would be great if we could fold some of that back into the Zynthian UI as well as implement it on more hardware devices. This may provide the rest of the functionality you’d like.
Regarding BLE support - I wrote a Zynthian device driver and firmware for a smartwatch (riband) that allows pretty much full control of Zynthian. This same driver could be used for other MIDI-connected devices, including BLE MIDI devices. (I think we should aim for some standardisation and common code in these drivers where we can.) A BLE MIDI device should be able to interface fully with the next version of Zynthian, although each individual device might need a driver or some customisation to an existing driver.
Awesome! I spent some time on the requests etc post but didn’t see it.
Seqtrack is a limited sort of groove box, but it has some cool features that allow a dynamic wireless GUI (via an Android and iOS app) and a manual that, when you fiddle with a control, tells you about that control. I can’t say I’ve ever seen anything like it. Oddly enough, you can’t stem out track MIDI. They say it sends audio with note information, but I don’t really know what that means. I’ve not seen what that output looks like.