MCU/Mackie Control Support for Zynthian

Over the last few days, I’ve been working out support for speaking and reading the MIDI-adjacent protocol variously called MCU / HUI / Mackie Control / Logic Control with my Nektar Panorama T6.

My eventual goal is to create a system that can use the MCU/HUI support found in a mid-to-high-end MIDI controller to produce a more deeply integrated control surface for zynthian.

So far, I’ve gotten the MCU system partially controlling the keyboard, and will eventually build a system that uses configuration files for different keyboard models and instruments to map controls.

Does anyone else have any interest in a system like this? I will probably finish it for my own purposes, but wanted to know if the community is interested in this topic more generally. The goal is to be able to produce a “headless” zynthian setup that uses either no screen or the built-in screen in your keyboard / midi controller.


Hi @jkafader! Welcome to our lovely community. I hope you will enjoy your time with us. What you describe has many similarities with various ideas discussed in the forum. Take a wander through the rich history of discussion here to gain an understanding of what has been considered and what has already been achieved.

We are heading towards implementing a more complete API which is likely to be of particular interest to you. (There is a forum post on that subject.) Currently we have MIDI, USB HID and OSC control as described by our CUIA (light API).

Some work that I am currently embarked on is scheduled to deliver a well documented method of interfacing with Zynthian. That should land within a few weeks (at least in the testing branch) so watch out for that.


Hi @riban ! Thank you for your very thoughtful reply to my post.

I also noted that in my excitement, I didn’t convey my thanks and feelings of overwhelming gratitude for those who have already contributed so much to this project. It is amazing! I’ve been looking for something like this for a long time, and am very happy to help as I can.

I had searched the boards and couldn’t find anything on Mackie Control / MCU, but I’m now delving into the related topics you posted on. For my own edification, and possibly that of others, here are some notes on the things you were referring to.

Currently we have MIDI, USB HID and OSC control as described by our CUIA (light API)

OSC control:
This appears to be another semi-standard control protocol, used by touchscreen apps such as TouchOSC for iPad/iPhone.

USB HID:
I think this refers to the Human Interface Device work; I haven’t been able to clearly find an entry point for this.

CUIA:
This refers to a zynthian-specific set of commands that can be delivered via either MIDI or OSC. This protocol appears to be unidirectional (“in towards zynthian” only) rather than bi-directional (“in towards zynthian” / “out towards device”).

Just to add some details to the thread:

I spent a good portion of my holiday break on this project. I currently have outlined the following things:

  • a bi-directional MIDI Sysex API that currently delivers details on the following:

    ‘HANDSHAKE’ # ensure API is connected – this should delay until a specific start point
    ‘QUERY_NUM_CHAINS’ # get number of Layers/Chains
    ‘QUERY_CHAIN_ENGINES’ # get the engine short name for each Layer/Chain
    ‘QUERY_CHAIN_CONTROLS’ # get the full list of CC#s, min, max, curval and name for all controls for layer
    ‘QUERY_CHAIN_CHANNELS’ # get the MIDI channel for each chain (so we can send CCs and set things properly)

  • changes to the autoconnector code to handle the jackd option -X seq rather than -X raw. The autoconnector doesn’t assign a good name to the devices returned in ALSA seq mode, so I defaulted it to the UUID assigned by jackd. I made this change in my local copy so that I could route one of the 4 MIDI ports to zynthian and another to my custom MCU script. For reference, -X raw locks MIDI devices so that other MIDI libraries can’t connect to the ports (or at least I couldn’t figure out how to; I was getting permission denied)

  • A client for the above Sysex API, the client side of the same functionality

  • an MCU controller for the Nektar Panorama T4/T6 that speaks the specific MCU messages that update the display and set the appropriate virtual CC values
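To make the SysEx API above concrete, here is a minimal sketch of how such query messages might be framed. The command names come from the list; everything else (the 0x7D "non-commercial" manufacturer ID and the opcode byte values) is an assumption for illustration, not the actual wire format.

```python
# Hypothetical framing for the bi-directional SysEx API described above.
# The opcode values and manufacturer ID are illustrative assumptions.
SYSEX_START, SYSEX_END = 0xF0, 0xF7
MANUFACTURER_ID = 0x7D  # "non-commercial/educational" MIDI ID, a placeholder

COMMANDS = {
    'HANDSHAKE': 0x01,
    'QUERY_NUM_CHAINS': 0x02,
    'QUERY_CHAIN_ENGINES': 0x03,
    'QUERY_CHAIN_CONTROLS': 0x04,
    'QUERY_CHAIN_CHANNELS': 0x05,
}

def build_query(command: str, payload: bytes = b'') -> bytes:
    """Encode a query as F0 <mfr> <cmd> <payload...> F7 (payload must be 7-bit clean)."""
    if any(b > 0x7F for b in payload):
        raise ValueError("SysEx payload bytes must be 0x00-0x7F")
    return bytes([SYSEX_START, MANUFACTURER_ID, COMMANDS[command]]) + payload + bytes([SYSEX_END])

def parse_message(msg: bytes):
    """Decode a message back into (command_name, payload)."""
    if msg[0] != SYSEX_START or msg[-1] != SYSEX_END or msg[1] != MANUFACTURER_ID:
        raise ValueError("not a message for this API")
    names = {v: k for k, v in COMMANDS.items()}
    return names[msg[2]], msg[3:-1]
```

The one hard constraint any real implementation shares is the last one in `build_query`: SysEx data bytes must stay below 0x80, so multi-byte values (control counts, CC numbers) have to be packed into 7-bit chunks.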

Currently, my MCU project does the following:

  • Maintains a stable MCU connection to the attached controller
  • Changes between Layers/Chains in the current Snapshot via the CUIA API
  • Queries the Sysex API for in-memory details (current value, min, max, CC#) of each control for the current Layer/Chain, and assigns an unused MIDI CC# to any control without one
  • Displays the appropriate controls for each Layer/Chain on the MIDI controller’s screen via MCU
  • Uses soft buttons to switch between set screens of controls for each Layer/Chain
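For anyone curious what "updating the display via MCU" looks like on the wire, here is a sketch of the commonly documented (reverse-engineered) Mackie Control LCD message. The header bytes follow the widely circulated MCU description; treat them as assumptions to verify against your particular controller, since (as noted later in the thread) each device has its foibles.

```python
# Sketch of the reverse-engineered Mackie Control LCD-write SysEx:
# F0 00 00 66 14 12 <offset> <ascii...> F7
# (00 00 66 = Mackie manufacturer ID, 14 = MCU device, 12 = LCD write)
MCU_LCD_HEADER = bytes([0xF0, 0x00, 0x00, 0x66, 0x14, 0x12])

def lcd_write(text: str, offset: int = 0) -> bytes:
    """Write ASCII text to the MCU's 2x56-character display, starting at cell offset 0-111."""
    if not 0 <= offset <= 111:
        raise ValueError("offset out of range for a 2x56 display")
    return MCU_LCD_HEADER + bytes([offset]) + text.encode('ascii') + bytes([0xF7])

# e.g. label the first strip on the top row:
# lcd_write("Cutoff ", offset=0)
```

Offsets 0-55 address the top row and 56-111 the bottom row, which is why control names get padded to fixed-width cells before being sent.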

Still to-do:

  • Map the current values returned by the Sysex API to the MIDI controller’s initial in-memory control positions
  • Wait to send CC values until the faders/knobs match the in-memory position in zynthian (“control snap”)
  • Correct the realtime MCU connection process to poll until both the MIDI Sysex API is ready and the MIDI controller is physically connected
  • Create a systemd service file for the MCU connection process

OSC is a protocol for control and monitoring of audio devices. It may provide similar features as MIDI (if implemented in the device) but is an extensible protocol that allows manufacturers freedom to implement what they want, how they want. It is essentially unidirectional but can be implemented in a way to provide bidirectional control and monitoring, indeed Zynthian provides control and monitoring of the mixer via OSC.

USB HID is a standard for connecting human interface devices to computers, e.g. keyboard, mouse, etc. Being a universal standard it allows such devices to be connected to most (all) operating systems because the OS will implement the drivers for this type of device. Zynthian allows mouse and keyboard integration, e.g. keybindings to control various aspects of the device. I tend to refer to HID rather than keyboard because the word “keyboard” has a different meaning in music. (A keyboard is a manual or pedal board that allows control of an instrument.) I also refer to QWERTY or computer keyboard. The former may offend some of our community that do not use QWERTY, e.g. French, German, etc. No offence intended, I just want to differentiate from USB MIDI keyboards. The default keybinding is detailed on the wiki and can be modified in webconf. This is mostly unidirectional but there are some elements of feedback. (Some may only be in development branches.)

CUIA is an early attempt at a kind of API. It is a set of actions that may be triggered by various inputs, OSC, MIDI, physical buttons, etc. This is also detailed in the wiki. It is unidirectional.

I think OSC may be the way to go with this. We can send OSC commands to trigger any CUIA and have full control of the audio mixer. We can register via OSC to receive tallies (feedback) of mixer state including audio level meters. In the future I expect this to expand to a more complete API for control and monitoring of the device.
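As a concrete example of triggering a CUIA over OSC, here is a stdlib-only sketch. The port number (1370) and the `/CUIA/<ACTION>` address pattern are assumptions to check against the wiki; the wire format itself (null-padded address, then a padded `,` type-tag string for a no-argument message) is standard OSC 1.0.

```python
# Stdlib-only sketch of sending a no-argument CUIA action to Zynthian over OSC/UDP.
# The port and address pattern are assumptions; verify them against the wiki.
import socket

def osc_message(address: str) -> bytes:
    """Encode a no-argument OSC message: padded address + padded ',' type-tag string."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a multiple of 4 bytes
        return b + b'\x00' * (4 - len(b) % 4)
    return pad(address.encode('ascii')) + pad(b',')

def send_cuia(action: str, host: str = 'zynthian.local', port: int = 1370):
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(osc_message(f'/CUIA/{action}'), (host, port))

# e.g. send_cuia('ALL_NOTES_OFF')
```

Because it is plain UDP, there is no delivery confirmation, which is exactly the "send and pray" trade-off discussed below.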


@riban Thank you again for the extremely timely and thorough description of those. I’m sure that will help future searchers understand the state of those in greater detail as well. I personally will look into the OSC protocol for Zynthian in greater detail – some of the stuff I was looking for will likely be there too.

Again, I just wanted to say that this is an incredible community and an incredible project. Thank you!

In terms of longer-term “real” API development, I’ve been wondering whether the intention is to head down a path like gRPC / protocol buffers, or whether there might be something more like custom serialization via e.g. sysex / OSC primitives. I was thinking through writing something like this via OSC, and it would seem that bi-directional communication via OSC would involve the client-side application also running a server, essentially having to deal with the details of asynchronous communication (which are a pain).
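To illustrate the point about the client also having to run a server: to receive OSC tallies over UDP, the client binds its own local port and runs a receive loop, typically in a background thread, which is where the asynchronous bookkeeping creeps in. A stdlib-only sketch (the tally handling is illustrative):

```python
# Sketch: an OSC client that wants feedback must itself listen on a UDP port.
import socket
import threading

def run_tally_listener(port: int, on_packet, stop: threading.Event):
    """Bind a local UDP port and hand each incoming OSC datagram to a callback."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(('', port))
    sock.settimeout(0.5)  # wake periodically so the stop flag is honoured
    while not stop.is_set():
        try:
            data, _addr = sock.recvfrom(4096)
        except socket.timeout:
            continue
        on_packet(data)  # e.g. decode the OSC message and update local fader state
    sock.close()

# Usage sketch:
# stop = threading.Event()
# threading.Thread(target=run_tally_listener, args=(9000, print, stop), daemon=True).start()
```

The callback fires on the listener thread, so any shared state it touches needs locking or a queue back to the main loop, which is the "pain" alluded to above.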

The upsides of gRPC would be a compact binary protocol with more straightforward client side code, the downside is that protocol buffers and the related configuration can be a crazy pain unto itself for maintainers of both the client and server side code. (This situation may have resolved itself since I last dealt with protocol buffers in python)


RPC tends to be slow. It does depend on implementation, but where we have seen this implemented, e.g. recent changes to Pianoteq, it has often used a slow (e.g. 50ms) processing loop which can add significant latency to messages and actions. It is a matter of choosing the right protocol based on requirements. I have done a lot of control and monitoring programming and development of various systems. There are often conflicting requirements: you want low-latency control and monitoring, sometimes to a large quantity of clients (not necessarily an issue here), with high reliability and handling of delayed and out-of-order messages, e.g. you may not want to set a value if the message arrives seconds later! The reliability comes from some form of handshaking, which adds to the complexity of messaging, especially in a large, distributed system. As I say, that is less of an issue with Zynthian and similar systems.

OSC is a high-level protocol that may be carried on any suitable (or unsuitable) transport layer. The only transport I have seen used is UDP/IP unicast. This works well to deliver messages promptly to a small quantity of clients but does not have handshaking. I have heard this coined “send and pray”. There are other delivery protocols that provide reliable delivery, e.g. TCP/IP, which has a massive overhead and undefined latency. There is a protocol that provides some of the TCP reliability over UDP, with guaranteed reliability and latency figures, that I was looking at for another project. (I can’t recall just now but might be DCM.) Anyway, the low-level transport is a consideration that is kinda separate from the high-level protocol. I quite like OSC because it is targeted at our use-case (control and monitoring of audio devices), is simple and extensible, has open source libraries and is already in use in Zynthian. It has relatively low latency and on a small, closed network is quite reliable. (Most/all unicast packets will reach their destination if the transport remains intact. Wifi may disappear randomly, but the OS generally buffers messages during temporary loss of connection.)

Check out this post for discussion on API (high-level stuff). I am currently mostly concerned with shaping the core architecture to allow such a high-level API to access core elements. I defer decision on low-level transport until later but for now (for test and current stable operation) use OSC over unicast UDP/IP. It is working well enough and that is what we want. (No need to over engineer a solution. Pragmatism is king. Let’s deliver something that works rather than not deliver something that works too well!!!)

We use tinyOSC which is a cool and simple implementation of OSC over UDP/IP. We could send tallies (feedback) via broadcasts or multicasts but there are issues there which I am happy to describe, but not now… other things to do!


Hi @jkafader!

It’s amazing what you are implementing. I’ve a Behringer Mötor 61 and I would like to adapt your code to work with it. In fact, Zynthian should implement an MCU API, as it can be considered a kind of DAW (u-DAW?).

Perhaps you could send a small video showing your Nektar-Zynthian integration? I’m pretty sure that some MC-enabled keyboard owners would be interested in your work. Perhaps Mr. @wyleu? :wink:

BTW, this is the manual for the Mötor 61, including the MC specs:


Hi @jofemodo!

I happen to have a Behringer Mötor 61 here that I intended to use for this project originally – however mine was delivered with a defective keyboard action (a few of the keys don’t have correct velocity measurement).

I’ll take a crack at adding support for the Mötor 61 specifically to my script. For the record, I think MCU is only semi-standard, and each controller definitely has a unique set of foibles.

For a progress report: I’ve been continuing to work on this, and the script for MCU control is now more or less finished. I’m currently trying to get it to very high stability, reorganizing the MCU client and the API client as state machines with respect to their MIDI connections, so that all of the connection logic (and connection degradation) can be handled in a central event loop.

I’ll be putting demo videos of this work together in the next couple of weeks, and will hopefully put some music together as well, so it’s not just talking about coding but actually doing some patch creation and showing that it’s easy and sounds good.

Thanks again for all of the amazing work on this project.



I think this is “by design”. My Mötor 61 doesn’t have the best key action either. Anyway, I found it quite useful as a control station and as a testing device for zynthian. But playing on it is not a big pleasure.

Yes. It seems we would need a controller driver for each device, but it could be done step by step once we have a good base for that.

I’ll be awaiting your progress, videos & noises :wink:

