Network Audio

If I have several audio devices connected to an IP network, I would like to be able to route audio between them without needing to plug in audio cables or connect via a mixing desk. An example might be that I have two Zynthians and want to take the audio from one and route it to the other. Maybe I want to use a common reverb for all my Zynthians, or I may be using one as the main mixer without an outboard mixing console. It would be good if there were little (or zero) configuration required and if Zynthian detected other audio devices automatically.

There are a few ways to do this which include:

  • netJack1: This supports low-bitrate codecs but requires source and destination to be defined at start-up (no discovery option). I don’t think this is a good match for this requirement.
  • netJack2: This allows an arbitrary number of clients to connect to a main device with automatic detection (within the same LAN segment). Each client can have its quantity of network inputs and outputs (audio and MIDI) defined, as well as its client name (defaults to hostname). This looks appealing, but it loses its JACK ports when disconnected from the main device, which breaks routing within the current Zynthian implementation - this could be resolved in Zynthian. There must be exactly one main device on the network, which may be limiting, i.e. this is server / client, not peer-to-peer. A client is already easy to define by selecting the “Custom Device” soundcard and using the JACK parameters -d net -C2 -P2 -i1 -o1. To enable the server side we need to add a feature to load / unload the netmanager internal client (which is fairly simple) and then offer the ability to route from the JACK client ports called “from_slave_1” and “from_slave_2”, which is a fairly trivial tweak in the Zynthian Audio Router (see the sketch after this list). This looks quite a good match for this requirement, requiring only small changes to Zynthian.
  • JackTrip: This requires the destination to be defined at start-up (no discovery). I don’t think this is a good match for this requirement.
  • zita-j2n: This requires the destination to be defined at start-up (no discovery). I don’t think this is a good match for this requirement.
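
A minimal sketch of what the two netJack2 sides boil down to, assuming a stock jack2 install (the Zynthian service wrappers and exact device names will differ):

```
# Client side: start JACK with the netJack2 "net" backend instead of an ALSA soundcard.
# -C/-P set the number of audio capture/playback ports, -i/-o the number of MIDI ports.
jackd -d net -C2 -P2 -i1 -o1

# Hub side: run JACK on the local soundcard as usual, then load the netmanager
# internal client, which listens for netJack2 clients on the LAN.
jack_load netmanager

# Once a client connects, its ports (e.g. "from_slave_1" / "from_slave_2") show up
# on the hub and can be routed like any other JACK ports.
jack_lsp | grep slave
```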

There may be other protocols that we could look at, e.g. AES67, Dante, etc. These may improve interoperability. There are some open-source AES67 implementations like this one.

This may be a niche feature but I do see some wider uses, e.g. connecting to digital infrastructure (probably requires standard protocol like AES67) or Internet connectivity for rehearsal / performance (may require data reduction using current Internet technology).

3 Likes

Certainly this is an issue that has had me musing for some considerable time. My main use case would be transporting audio from an input card (or cards) on a stage microphone or guitar, to a remote zynth that could treat this source as an incoming layer.

Quite how far we take this is worth deciding; a useful set of limits is probably worth imposing.
I would tend to support transportation of control signals around such a network with audio rendering taking place at the output device if at all possible. Processing power will limit that so another element would be a render farm of output zynths that might combine a multitude of sources down to one (or more) output structures. There is a pure approach where all such work should be conducted entirely in the digital domain whilst the pragmatist in me might well favour multiple analogue outputs feeding something like the Behringer 1820, with a zynth managing it.

I’ve run some fairly complicated MIDI-over-Ethernet rigs and it’s impressive what can be achieved, but when it goes wrong it goes VERY WRONG! Die, MIDI howl-around, die!!! I’ve built reboot facilities into the pedal board for such an occasion, and I’d probably like to see some declared fallback position that would at least allow critical components like front-of-stage microphones to keep working so an apology can be delivered without frantic re-patching or menu flapping, or, possibly worse, the provision of a completely separate analogue chain just in case (I did that on the pedal board too…)

2 Likes

If you load the audioadapter module on the client then you get access to its soundcard, which allows the workflow that @wyleu mentions, i.e. routing audio from one Zynthian’s audio inputs to another’s audio outputs (or any other JACK destination).
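
For reference, here is roughly what that amounts to under the hood, a hedged sketch assuming the ALSA device is hw:0 (substitute your actual card):

```
# On the netJack2 client: expose the local soundcard alongside the network ports by
# loading the audioadapter internal client. "hw:0" is a placeholder device name.
jack_load -i "-d hw:0" audioadapter

# List the adapter's capture/playback ports so they can be routed to/from the net ports.
jack_lsp audioadapter
```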

This definitely needs a good, solid network connection. I just heard some interesting effects when on WiFi with the pitch shifting all over the place!

We would probably want better / different audio routing control if we go down this path. I think this may fit nicely with the zynmixer. I can imagine having a main Zynthian acting as my mixer and synth with other Zynthians feeding into it. Those may be played by me or other people. The Zynthian can become a digital stagebox. With a bit of thought you could have input and output on each Zynthian, allowing personal monitoring, etc. The latency can be very low on a good network. There is a configuration that ensures all processing is done within the same JACK frame so there is no added latency.

Also… it passes MIDI :smile:.

1 Like

Okay - I couldn’t wait for others to reply so I have added a netJack2 hub option to the admin menu on a new “netJack2” branch. I use the word “hub” to represent the one (and only) Zynthian that should act as the main / hub device to which all client Zynthians (and other netJack2-enabled devices) can connect, so only enable this on one device on your network.

Because the JACK2 service does not wait for the network to be up before starting, setting a Zynthian as a netJack2 client (by setting the soundcard JACK parameters to -d net -C2 -P2 -i1 -o1) may cause it to stutter during start-up. I haven’t tested the client extensively but it seems to work okay on my Pi3. What doesn’t work on the Pi3 is enabling the soundcard whilst in netJack2 client mode. This Zynthian only has the headphone soundcard enabled and I haven’t tested with other soundcards - that needs some further investigation. This would only be an issue in @wyleu’s scenario where you want to use the Zynthian as a netJack2 client and use its physical audio inputs / outputs.
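
One possible mitigation (not in the branch, just a sketch) is to make the JACK service wait for the network before starting; the unit name jack2.service is an assumption, so check what your image actually uses:

```
# Hypothetical systemd tweak to delay JACK start until the network is up.
# Check the real unit name first: systemctl list-units | grep -i jack
sudo systemctl edit jack2.service
# In the editor that opens, add:
#   [Unit]
#   Wants=network-online.target
#   After=network-online.target
```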

With this branch you can enable netJack2 hub on one Zynthian, configure another Zynthian as a netJack2 client and pass audio between them using each engine’s context menu for audio routing. Note: On the client Zynthian the netJack2 connections (to the hub) appear as its default audio ports so it does not need specific routing. Just use it as a normal Zynthian with its input and output appearing on ports on the hub Zynthian. Also note that sending audio from the hub to a client only passes the full stereo pair, i.e. you can’t currently route individual mono channels in that direction. This is similar to a stereo plugin but maybe should be investigated.
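
For anyone poking at it from the command line, manual routing on the hub looks something like this (the client name “zynthian-pedal” and exact port names are illustrative, check with jack_lsp):

```
# On the hub: list the ports the connected netJack2 client exposes...
jack_lsp | grep -i slave

# ...and patch the client's output into the hub's playback (names are examples only).
jack_connect "zynthian-pedal:from_slave_1" "system:playback_1"
jack_connect "zynthian-pedal:from_slave_2" "system:playback_2"
```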

I have observed that increasing the quantity of netJack2 channels does increase the load on JACK and there is a point at which it starts to xrun. I have not observed that with the stereo connections described in this post but more esoteric configurations may start to vex it. I have not tested with a Pi3 as the hub so am not sure how it will behave. (My Pi3 and Pi4 are at opposite ends of the house on different floors and there are only so many simultaneous remote session bounces that my brain can cope with!)

There are other applications and devices that could utilise netJack2, e.g. you could connect your Zynthian over the network to an instance of Ardour or run a rack of plugins on a powerful desktop with send and return in Zynthian audio chain or vice versa.

If you have network devices that run jack2 and want to join them to your Zynthian you can give this branch a bash and report back. If it proves useful / desirable / easy-to-support it might make it to the main branch (but hopefully only after we have an actual stable release - we can all dream :sleeping:).


[Edit] I have not added any code to handle MIDI routing yet. If you want to play with that you will need to roll up your sleeves and delve into the innards. I will add MIDI routing when I get a moment.

4 Likes

For zero latency*, add the parameter -m fast to the client JACK configuration. The hub will wait for the client’s data each cycle, which increases the risk of xruns on the hub if the client struggles or there is network latency. I recommend using netJack2 only on a dedicated, hard-wired Zynthian LAN. Although netJack2 supports jumbo frames, the Pi NIC driver does not, so there is no point changing the network frame size.

*Zero latency between client and hub. There remains the latency within JACK, ALSA driver, etc.

Oops! Old docs… The mode parameter has been replaced by a network latency setting which defines the quantity of cycles to delay, so -l 0 gives zero latency. The default is 5 cycles.
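
So the client-side JACK parameters become something like this (the -l value trades latency against xrun resilience):

```
# netJack2 client with zero network-cycle latency; the default is -l 5.
jackd -d net -C2 -P2 -i1 -o1 -l 0
```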

1 Like

Not had a lot of success with this.

After changing the config, the Pi wouldn’t start until I removed the Audio Injector card, and the custom setup won’t start.

Not sure I can configure an audio injector input and a network output…
Still attempting to configure a pure network output for zynthian-pedal.local

That’s a shame. It worked okay in my (limited) tests. I mostly tested from a laptop running JACK but did tests between my Pi3 and Pi4 Zynthians.

What hardware are you using and what is your test setup? If you can describe exactly what you did with what, I could take a look at what is going wrong. I can’t see why a Pi with JACK configured to start with the net driver would care about other soundcards - indeed I tested this on my laptop with built-in sound and a USB soundcard plugged in and it still worked. I take it that you did reboot after configuring, etc.

I’ll have another try this evening. It’s a Pi4 with an Audio Injector card (zynthian-pedal.local) linked over to a Pi4 with a HiFiBerry DAC/ADC (zynthian-steel).

I was working on something related with this some time ago:

… the code is not finished and would need some update for working with latest changes.

The idea is running engines on several nodes (a simple RBPi with a network connection). Audio & MIDI are sent using netJack and returned to the master node. When creating a layer, you choose the node from the available ones, etc. The UI runs on the master node, which controls all layers: local & remote.

I can’t say the exact status of the development, but I remember having tested it with a master + slave, creating remote layers and receiving the resulting audio on the master.

I would like to work on this again, but it’s not a priority and I have no time for it… anyway, it’s a really fun development, jejeje!!

Regards,

1 Like

It looks like JACK does not create the audio input or output ports until the netJack2 client has connected to a server, so a client running -d net will have trouble if it can’t connect to the hub, either due to a network issue or because the hub is not yet running.
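
A rough, entirely hypothetical workaround on the client would be to poll for the driver ports before relying on them, e.g.:

```
# Wait (up to ~30s) until the net driver has registered its system ports,
# i.e. until it has actually reached the hub.
for i in $(seq 1 30); do
    jack_lsp 2>/dev/null | grep -q '^system:' && break
    sleep 1
done
```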

If anyone’s interested, here’s a project I was working on a while back, “BORTPort”, that basically multiplexes multiple MIDI ports over USB so you can build MIDI controllers that talk over Ethernet.
It includes a Protocol Buffers spec so bindings for pretty much any language are possible. The current ‘driver’ relies on Qt (Qt has an amazingly intuitive network stack) but it wouldn’t be hard to rewrite it for native Linux APIs.

1 Like