Using MIDI controllers instead of Encoders

We fear change…

Not really, it all came about with the inevitable division between various commercial organisations.
A Bit got written…

https://wiki.zynthian.org/index.php/MIDI_Network

I can’t remember where I first encountered qmidinet, but it was implemented on the Zynthian way back, with RTP-MIDI following on behind. It lets you pick up ALSA or JACK MIDI ports on any host within the local subnet and present them as a MIDI stream. It simply extends your 16 channels of MIDI on a Zynthian to any connected machine.
Make a noise near you at pretty much the same time as you make it over there…

It means you don’t need audio on stage for MIDI instruments.

I’ve never used RTP-MIDI so don’t know much about it.

Long sleepless nights on google and a lot of studying.
In my case:
A Raspberry Pi running Bome Network (based on rtpmidi), with a Bome Box connected over it. Six MIDI controllers and eight synths are connected to the Bome Box via USB (plus one Blokas Midihub). A Translator file is uploaded to the Bome Box, which is controlled via OSC from an Open Stage Control server that also runs on the same RPi. The McLaren rtpmidi networking utility is also installed on the RPi, and the Zynthian is connected over it.

1 Like

And these network MIDI devices respond with acceptable latency? It’s hard to conceive, though also far from impossible, I know, provided the network is not congested.

It should work now, at least the default mapping. Webconf’s keybinding panel doesn’t work yet.

Regards

1 Like

If you’re referring to testing, I did some keybinds yesterday while finishing up my box, which is fully functional now. So… dunno what to tell ya, worked for me. :>

edit: my piano player is extremely excited. :>

Honestly, there is not much to tell. I didn’t measure it exactly, but apparently network MIDI can be faster than USB. I use it on my home network; everything is wired, no WiFi.
One of my combinations is Akai EWI USB → WIDI Uhost --(over BLE MIDI)--> WIDI Bud Pro → USB host → second USB host → Bome Box (Translator manipulation, change of MIDI channels) → second USB host → Blokas Midihub → XO-mini Dynasample synth, and it works. In the Bome Box it is possible to split (or rather copy) the signal, so the path can also be Bome Box → Bome Network → Raspberry Pi → saving to a MIDI file.
According to the creators of the Bome Box, Bome Network should access USB ports directly. There is currently a bug in the RPi version: direct access to input ports is not possible while the Bome Box Translator is running. However, the signal can still be sent to the output ports, so MIDI or sequencer playback on the hardware synths works.

The last time I used the network for controlling Zynthian was for testing this music pattern generator:

It worked like a breeze and was very fun.

What wonderful instruments… they were the ones that the Greats had and we “little ones” had to settle for Crumar keyboards… my god :roll_eyes:

Did anybody try the 57: SELECT?
I am using the drumpads of my Akai.
Other mappings work, but I don’t get the select to work.

If it’s a bug, I will open an issue.

Cheers,
Markus

Hi Markus!
Check the updated CUIA list:

https://wiki.zynthian.org/index.php/Accessing_Zynthian_from_your_computer

Some CUIA need parameters. SELECT, for instance, needs the list index to select. I don’t think it’s a good idea to map it to a MIDI note.

If you want to emulate the select switch action, you should use the ZYNSWITCH CUIA, passing 3 as the parameter. If you do so, you get full switch emulation, including short/bold/long actions. You can do the same with all the zynswitches (0-3).
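For the network case discussed earlier in this thread, a CUIA can also be fired over OSC. Here is a minimal stdlib-only sketch; the UDP port (1370) and the `/CUIA/<NAME>` path layout are assumptions taken from the wiki page linked above, so check them against your Zynthian version.

```python
# Minimal sketch: fire a CUIA at a Zynthian over OSC, stdlib only.
# Assumptions: OSC listener on UDP port 1370, CUIA exposed as /CUIA/<NAME>.
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes (OSC 1.0 string rule)."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *int_args: int) -> bytes:
    """Encode an OSC message whose arguments are all int32."""
    msg = osc_pad(address.encode())                      # address pattern
    msg += osc_pad(("," + "i" * len(int_args)).encode()) # type tag string
    for arg in int_args:
        msg += struct.pack(">i", arg)                    # big-endian int32
    return msg

def send_cuia(host: str, cuia: str, *args: int, port: int = 1370) -> None:
    """Send a CUIA over UDP (the port number is an assumption)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(osc_message("/CUIA/" + cuia, *args), (host, port))

# Emulate a press of the select switch (zynswitch 3):
# send_cuia("zynthian.local", "ZYNSWITCH", 3)
```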

Regards

1 Like

THANKS!
That’s great. I assumed the zynswitches were the new buttons beyond the 4 main encoders…
Now the BACK works better as well :slight_smile:

2 Likes

What Akai model are you using?

FYI, I’m working on a controller driver system for Zynthian. It’s almost finished and it will allow device-specific “drivers”, including customized maps, LED feedback, controller feedback, etc.
So you plug in your Launchpad Mini MK3 and it perfectly integrates with the zynpad, managing pads and scenes. You plug in an Akai MIDImix, and it perfectly maps the mixer, including mute/solo with LED feedback. You plug in a Behringer motorized keyboard, and the motorized faders get mapped to the mixer ones. Etc. The “driver” model is simple enough to allow skilled people to contribute drivers and create a good collection with the more popular devices well supported. Stay tuned!!
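To give an idea of what such a driver could look like (it is described later in the thread as “a simple python file that implements a class”), here is a hedged sketch. Every name in it (the class, `on_midi`, the pad note layout) is invented for illustration; the real Zynthian driver API may look quite different.

```python
# Hypothetical controller driver sketch: translate raw MIDI events from a
# known controller into engine actions, and build LED feedback messages.
NOTE_ON = 0x90

class LaunchpadMiniDriver:
    """Maps an 8x8 pad grid to zynpad-style (row, col) pad triggers."""

    def note_to_pad(self, note):
        # Assumed pad layout: note 11 is bottom-left, rows step by 10
        # (roughly the Launchpad "programmer mode" convention).
        row, col = divmod(note - 11, 10)
        if 0 <= row < 8 and 0 <= col < 8:
            return (row, col)
        return None  # not a grid pad

    def on_midi(self, status, data1, data2):
        """Translate a raw MIDI event into an (action, payload) tuple."""
        if status & 0xF0 == NOTE_ON and data2 > 0:
            pad = self.note_to_pad(data1)
            if pad:
                return ("toggle_pad", pad)
        return None

    def led_feedback(self, note, color):
        """Build the note-on message that lights a pad in a given color."""
        return bytes([NOTE_ON, note, color])

driver = LaunchpadMiniDriver()
driver.on_midi(0x90, 11, 127)  # bottom-left pad pressed
```

The appeal of the model is exactly this shape: plain methods in, plain MIDI bytes out, so contributing a driver for a new device means filling in two small translation functions.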

The best!

5 Likes

This might get @wyleu off my back about his “one pixel display”! :smile:

1 Like

Admit it you thrive on the attention…

1 Like

Wait, what ?!
That looks great!

Feedback is hard if you do it the MIDI way. For encoders with LED rings it’s “simple”: listen to the MIDI and send it back to the interface, but for the rest of the features it’s not!
And plug-and-play, as opposed to “MIDI-learn aaaaaaal the controls”, is NICE!

PS: just curious about one thing… is the “driver” really a driver, or is it something else? As I understand it, if a plugged-in device is recognized, it has a Linux driver somehow.

1 Like

It’s not a kernel driver but a simple python file that implements a class. Easy!

1 Like

I meant Alesis V25. Not many knobs here.
But your new toy might be perfect for my Arturia Keylab Mk II

1 Like

I’ve just found this. Looks like a great way to do a different implementation of the V5 control set.

I’ve got a bunch of mechanical keyswitches, and have used addressable RGB LEDs a lot. I also have several RGB MIDI keypads.

So my only real question is how would we get the button LED states out of the Pi and into a midi controller?

2 Likes

Using MIDI messages, obviously :grin:
You can implement it as you like, or perhaps adhere to the MCU protocol.
For instance, my Launchpad uses MIDI notes for most buttons: when a button is pushed, a note-on is sent from the controller; when it’s released, a note-off. The LED state for the button is sent from the “host” using a note-on (same note). In this case, the velocity encodes the color/brightness info.

CC messages can be used in a very similar way.

Alternately, you could use SysEx.

MCU uses a combination of everything. For instance, the fader state for each channel is sent with pitch-bend messages, which gives 14 bits of precision instead of the 7 you get with a single CC message.
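The 14-bit trick works because a pitch-bend message carries its value split across two 7-bit data bytes (LSB first), so a fader can use the full 0..16383 range on its channel. A small sketch of the encoding:

```python
# Encode/decode a 14-bit fader value as a MIDI pitch-bend message.
PITCH_BEND = 0xE0  # status nibble for pitch bend

def fader_to_pitch_bend(channel: int, value: int) -> bytes:
    """Pack a 14-bit value (0..16383) into a 3-byte pitch-bend message."""
    lsb = value & 0x7F          # low 7 bits
    msb = (value >> 7) & 0x7F   # high 7 bits
    return bytes([PITCH_BEND | (channel & 0x0F), lsb, msb])

def pitch_bend_to_fader(msg: bytes) -> int:
    """Recover the 14-bit value from a pitch-bend message."""
    return msg[1] | (msg[2] << 7)
```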

Regards

1 Like