I was hoping someone could give me an idea on where in the Zynthian source to add a plugin.
I’ve had some success before playing with device drivers (adding support for a MIDI controller or ctrldev driver), and maybe this is the easiest place to try to land this by just tying it to a specific controller I have.
So, I’ve been a little jealous of guitarists being able to strum chords, and was even pining over the Suzuki Omnichord. I realized that I have almost everything in my Zynthian - I just need a surface to strum on.
I bought a drawing tablet for $30 (Deco 640|Drivers Download | XPPen), and put together a quick prototype in Node.js. I have some things to tweak, but it works really well and could be super fun to play. On top of just strumming, I get pressure levels so I can control the MIDI note velocity. What’s also a bit neat is that I have both horizontal and vertical pen tilt, so I’m thinking of wiring that up to modulation and pitch control… it could really be expressive this way!
And of course, a simple MIDI keyboard controller is great for setting the notes to strum (I have some configuration to add notes in the scale above or below to maximize the “strings”).
The tablet works easily when I plug it into the Zynthian, and functions just like a mouse. On my laptop, I think I need the device drivers installed to get the pen pressure and tilt registering. So hopefully, one of the many Linux device drivers will work on the RPi.
But back to my question: where’s a good place to try to drop this in Zynthian? I was originally thinking a LV2 plugin, but as I read, it doesn’t sound like that plugin format supports USB devices. Also Python is easier for me than C.
Can I write just a normal MIDI plugin in Python? If so where in the source code would I look?
So, the reason I’m not leaning towards the control device drivers is because it relies on a MIDI device if I’m not mistaken.
When I was trying them before, I remember you have to specify the MIDI device ID; as different devices get plugged in, if they match, then the control device driver is connected.
I was assuming that the device ID is a MIDI device ID that is communicated over the MIDI protocol and any USB device IDs wouldn’t be the same.
I COULD create a control device driver for a specific MIDI keyboard controller I have, and listen for the tablet in that driver. But I’d imagine that would be locked to my specific controller and not really a general use thing. I’ll probably do this if I don’t get better suggestions here.
ANYWAY, I am assuming a few things here - so feel free to correct me if I’m wrong, but I don’t think the control device driver is necessarily the right approach.
If you want to use the tablet to control parameters such as pitch, filter, modulation, aftertouch, breath, etc. then it needs to slot in before the instruments you want to control. You could do this with a “MIDI” LV2 plugin that has MIDI input and output and inserts in the beginning part of the chain, but I think it would be better for this to appear as a controller device that you can send to any / all chains like a MIDI controller.
I would probably consider writing a low-level device driver that captures the tablet device and presents it as a MIDI controller, so that zynthian detects it and presents it as a normal MIDI input. The three axes (X, Y, pressure) could be mapped to 7-bit or 14-bit MIDI CC. (I would implement both options.)
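That mapping could be sketched like this, assuming normalized 0.0-1.0 axis values; the channel and CC numbers are arbitrary choices, and the 14-bit variant follows the common MSB-on-N / LSB-on-N+32 CC pairing convention:

```python
def to_7bit_cc(channel, cc, value):
    """Build a 7-bit CC message from a normalized 0.0-1.0 value."""
    v = max(0, min(127, int(round(value * 127))))
    return [0xB0 | (channel & 0x0F), cc & 0x7F, v]

def to_14bit_cc(channel, cc_msb, value):
    """Build a 14-bit CC pair: MSB on cc_msb, LSB on cc_msb + 32."""
    v = max(0, min(16383, int(round(value * 16383))))
    msb, lsb = (v >> 7) & 0x7F, v & 0x7F
    return [0xB0 | (channel & 0x0F), cc_msb & 0x7F, msb,
            0xB0 | (channel & 0x0F), (cc_msb + 32) & 0x7F, lsb]
```

Whether a receiving synth interprets the pair as one 14-bit controller depends on the synth, which is one reason to offer both modes.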
If you are plugging it into the zynthian (via USB?) then by default, the OS will detect it as a pointer device which you need to intercept, otherwise zynthian will just handle it as a standard HID pointer.
You may gain some inspiration from a project I am working on to convert MIDI output to DMX control, jackmidiola, although you may find this to be significantly different and of little help.
You are likely to need to roll up your sleeves and dive deep into low-level stuff. It sounds like a great project and I wish you luck with it. There may be some (much) benefit from @jofemodo’s recent addition to vangelis to provide a framework to write device drivers with scripting like mididings. It would hook into the right place, and you would write some code to detect the tablet and capture its data.
oooh, I really like this suggestion. It’s basically what I’m doing in Node.js already, and it’s something I can do off device on my laptop until I need to debug on the Zynthian. Blocking input will hopefully not be a problem, because on my Mac at least, if I connect to the device with USB-HID, it stops sending output to the OS. Hopefully it’ll be the same.
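For reference, the capture side could be sketched like this in Python. The axis ranges below are hypothetical placeholders (a real driver would read them from the device, e.g. with python-evdev’s absinfo), and the evdev loop is shown only in comments since it needs real hardware:

```python
# Hypothetical axis ranges -- read the real ones from the device
# (e.g. python-evdev: dev.absinfo(code)) before relying on this.
AXIS_RANGES = {
    "ABS_X": (0, 32767),
    "ABS_Y": (0, 32767),
    "ABS_PRESSURE": (0, 8191),
}

def normalize_axis(code, raw):
    """Clamp a raw axis value and scale it into 0.0-1.0."""
    lo, hi = AXIS_RANGES[code]
    raw = max(lo, min(hi, raw))
    return (raw - lo) / (hi - lo)

# With python-evdev, the capture loop would look roughly like:
#   from evdev import InputDevice, ecodes
#   dev = InputDevice("/dev/input/eventX")  # path is device-specific
#   dev.grab()  # exclusive access: stops the OS using it as a mouse
#   for event in dev.read_loop():
#       if event.type == ecodes.EV_ABS:
#           value = normalize_axis(ecodes.ABS[event.code], event.value)
```

Note that dev.grab() is the evdev equivalent of what the Mac HID connection does: it takes exclusive access so the pointer stops moving.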
I guess, I’ll convert to Python, get it running as a service on my laptop and see how far I can take it!
Hey, hopefully you can offer advice again.
I converted the service to Python, got everything working. Seems perfect….except….
I can’t hear anything.
When I look at the MIDI log for MIDI-Through Port 0, I see great things happening:
CH#10 NOTE_ON 40, Vel: 28
CH#10 NOTE_ON 36, Vel: 77
etc…
But when I set up an instrument in Zynthian, I don’t think I have that as a device option.
I’ve got:

Internal Devices
- pisound MIDI (yes, I have a custom build with a pisound)
- DIN-5 MIDI
- MIDI player
- Step-Sequencer
- CV/Gate

USB Devices
- Launchkey MK4 251 (yes, I have a key controller plugged in)
- Launchkey MK4 252
All captures are checked with an X to indicate on. I kind of assumed that maybe MIDI player would cover it. Now I don’t think so.
I’ve been using the web terminal to log in and start the service as I test from pulls on github to get it working. I’m wondering if the Python service needs to be started at boot for it to be a device that Zynthian recognizes?
This thing is SUPER fun and expressive to strum around with on my macbook, I hope I can get it on my Zynthian.
We do not use MIDI-Through. This can cause issues with uncontrolled MIDI messages passing through the system. Instead, each MIDI input device is presented as a JACK MIDI device. For most hardware devices with an ALSA MIDI hardware presence, jackd converts the ALSA port to a JACK port. For devices that are presented within userspace (no ALSA kernel module) we have to wrap this with a userspace process, e.g. ttymidi converts the serial port to a JACK MIDI port.
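A common pattern for such a userspace wrapper is to queue MIDI bytes from the capture thread and drain the queue in the JACK process callback. A minimal sketch, with the jack-client parts in comments since they need a running JACK server (the client and port names here are illustrative, not anything Zynthian-specific):

```python
import queue

# Thread-safe hand-off from the capture thread to the audio thread.
midi_events = queue.Queue()

def queue_note(channel, note, velocity, on=True):
    """Queue a note-on/off message as raw MIDI status bytes."""
    status = (0x90 if on else 0x80) | (channel & 0x0F)
    midi_events.put(bytes([status, note & 0x7F, velocity & 0x7F]))

# With the jack-client package, the JACK side would look roughly like:
#   import jack
#   client = jack.Client("strum")
#   out_port = client.midi_outports.register("out")
#   @client.set_process_callback
#   def process(frames):
#       out_port.clear_buffer()
#       while not midi_events.empty():
#           out_port.write_midi_event(0, midi_events.get_nowait())
#   client.activate()
```

This keeps the device I/O out of the real-time callback; the callback only copies ready bytes into the JACK buffer.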
Without knowing what you are doing or how you are doing it, I can’t advise on a solution beyond saying that zynthian looks for JACK MIDI ports only.
Thanks again! I got this working perfectly through Jack, I’ll share in another thread after some proper testing and making some documentation.
I think the remaining thing I want to make sure works is to allow this and another MIDI controller to operate simultaneously. I think what I’m looking for is to switch the multitimbral mode of the device (after looking through the Zynthian docs). Unfortunately, my JACK device isn’t showing up in the device list.
Fingers crossed that if I can get this booting on Raspberry Pi startup, or find a way to refresh the device list, I’ll just see it.
Thanks for the help! I’m really glad you gave me the suggestion of creating this like I did. It’s basically multi-platform and pretty fun to use with Ableton too!
Sure, so I should warn you that while I’m a developer, Python isn’t my language and I’ve never touched working with JACK before (other than using it once or twice). So this is largely vibe coded.
But here’s the code. The auto-connect logic seems to need to be fairly specific for Zynthian so it knows which named ports on the MIDI router it needs to connect to (even saying that, I’m only inferring from the code and don’t really know the system).
Don’t worry about coding skills. I too am not a native Python coder and had to learn this odd, scripting language, mostly for zynthian. We can spend too much effort worrying about what our code looks like (which is almost irrelevant) and too little effort worrying about what our code does (which is the prime directive!!!).
I took a quick look at the source code in your git repo and see that you have the option via config to use rtmidi or jack. I think (but didn’t look too closely) that rtmidi just attaches to the first ports it finds which is not appropriate for zynthian but probably irrelevant as I suspect you configure for jack.
To expose your jack midi ports, you need to specify is_physical=True in the port register call, e.g. self.midi_out_port = self.jack_client.midi_outports.register('midi_out', is_physical=True). You may also want to give the port a more descriptive name, which is the default name that appears in the zynthian MIDI port list, e.g. self.midi_out_port = self.jack_client.midi_outports.register('strum', is_physical=True). Ports can be renamed in the zynthian UI but it may help to start with a useful name. (This shouldn’t really be necessary, nor is it really good practice, but it is how zynthian currently handles MIDI port names for clients it does not recognise. I may change that in the future.) You can then see the ports in zynthian’s MIDI menus:
[Edit] BTW the use of Python for MIDI is discouraged because it has poor real-time behaviour, leading to latency and jitter, but to get this working it is a good exercise and it may be sufficient. It only needs to be good enough.