MIDI player / sequencer / looper program?


Thanks for the info.
I planned to use the GUI version on my Linux desktop PC to get seq64 working and figure out the MIDI controls, in order to use those settings on the Zynthian. Currently it crashes with some GTK error when I use the File > Open menu. I have to investigate that further.
I’ll try your approach and work directly on the Zynthian with the seq64 GUI.

Earlier I tried to get the Zynthian graphical desktop on my Linux desktop using the RDP and VNC protocols, but failed. In one test I only got the ZynAddSubFX window remotely.

Yesterday I installed the NoMachine remote desktop (free version) on both my Linux desktop PC and on the Zynthian (the ARM7 package), and painlessly got the Zynthian UI screen on my Kubuntu desktop at the first connection; I could control the UI using the mouse.
NoMachine has its own integrated X Window server, but the release notes indicate that it is quite a heavy load on a Raspberry Pi and only just usable on a Pi 3. For now, this is good enough.

Now I have to find out how to get rid of the Zynthian UI and get a Linux desktop; then I can start playing with the seq64 GUI version on the Zynthian… using a normal display, mouse and keyboard.

By the way, I’m using the old trusty ‘mc’ file manager (2-pane full-screen console mode) to easily get around the Zynthian disk in a remote SSH console. Lots of interesting stuff to discover.


Yesterday I got it running with visible windows on my Linux desktop PC (hurrah!)

  • built the GUI version (" ./bootstrap -er -rm " and the rest of the build steps),
  • started the NoMachine remote terminal on my desktop PC,
  • connected it to the Zynthian (where the NoMachine free edition for ARM7 had been installed earlier),
  • et voilà, the seq64 window appears in the NoMachine window on my desktop PC.
    The File menu opens normally, I can see my 2 connected USB MIDI devices in the Options menus for MIDI Clock and MIDI Input, and the JACK Transport page is visible too.
    In another SSH console the seq64 MIDI ports are visible using “jack_lsp”.
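For reference, a rough command-line sketch of the build steps above. Only the " ./bootstrap -er -rm " part is from the post; the repository URL and the make/install steps are my assumptions from the usual autotools flow, so check the Sequencer64 build notes before relying on them:

```shell
# Fetch and build Sequencer64 with the GUI enabled
# (repo URL is an assumption; use the source tree you already have)
git clone https://github.com/ahlstromcj/sequencer64.git
cd sequencer64

# "-er -rm" are the bootstrap flags quoted in the post above;
# the remaining steps are assumed to be the standard autotools ones
./bootstrap -er -rm
make
sudo make install

# In another SSH console, verify the seq64 JACK/MIDI ports show up
jack_lsp
```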

Now I’ll have to find out how to work with seq64, and get my Korg nanoKONTROL MIDI controller working for seq64 …

[4-6-2018] Actually, the NoMachine remote desktop is not required to get the windows of graphical programs like qjackctl and ZynAddSubFX on your host screen, but it was the first time I saw a copy of the Zynthian screen from the RPi on my desktop PC.
[6-6-2018] See “Starting the Zynthian UI”: you must stop the Zynthian service and start the Zynthian script in a terminal session started with “ssh -Y …”.
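As a hedged sketch of what that might look like in practice (the user, hostname, service name and script path are my assumptions, not from the post; verify them against the “Starting the Zynthian UI” wiki page):

```shell
# Open a session with trusted X11 forwarding to the Zynthian
# (-Y forwards X windows back to your desktop's display)
ssh -Y root@zynthian.local

# On the Zynthian: stop the UI service so it releases the display
# (service name "zynthian" is an assumption)
systemctl stop zynthian

# Start the UI script by hand so its windows appear on your desktop
# (script location/name is an assumption -- locate the real one first)
/zynthian/zynthian-ui/zynthian.sh
```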


How about Pure Data? I found this one: https://github.com/karl-potisepp/pd-midilooper


I did have a look at PD before. It’s flexible enough for my needs, and there are patches that provide a bit of sequencing. But I don’t like the graphical programming, the time I’d need to learn this new programming ‘language’ (I’d rather spend it learning to play piano and music), and I’d still have to link to synth engines to get ‘normal’ drum and instrument sounds.


Hi @FransK!

Csound and SuperCollider are on the way … perhaps they are better suited to you :wink:

Is there somebody who wants to take on the task? It’s a really simple task because the Pure Data engine is very generic and could be used as a base for building other similar engines …

Take a look at the code:



I’ll have a look at it.

A month or two ago, I had been looking at extending FoxDot (a Python-based tool for real-time musical programming in a Python-like syntax, using SuperCollider as the backend to do the actual work). It lacks support for MIDI control messages, a way to switch pattern sets at the end of a bar, getting the song position back from SC (to show it to the user), loading MIDI files, and output to MIDI instead of SC synths.
These might be added, or FoxDot can serve as a good example of how to use SC as a backend (the source code is reasonably easy to follow).
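To illustrate the kind of MIDI control-message support mentioned above, here is a minimal sketch (plain Python, no FoxDot involved; the function name is my own) of how a raw Control Change message is built at the byte level, per the MIDI 1.0 channel-voice message layout:

```python
def control_change(channel: int, control: int, value: int) -> bytes:
    """Build a raw 3-byte MIDI Control Change message.

    channel is 0-15; control and value are 0-127.
    """
    if not (0 <= channel <= 15 and 0 <= control <= 127 and 0 <= value <= 127):
        raise ValueError("argument out of MIDI range")
    # Status byte 0xB0 marks a Control Change; the low nibble is the channel.
    return bytes([0xB0 | channel, control, value])

# Example: set volume (CC 7) to 100 on MIDI channel 1 (index 0)
msg = control_change(0, 7, 100)
print(msg.hex())  # b00764
```

These three bytes are what would ultimately be written to a MIDI output port (e.g. via an ALSA or JACK MIDI binding) to drive an external controller or synth.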


Had a look at the SC documentation again, especially “Using Patterns” in the Getting Started section and (some of) the many parts of the Pattern Guide Cookbook.
Indeed, this might be better suited to my needs than the CLI version of Sequencer64, which needs the seq64 MIDI songs to be prepared in the GUI version of the program.

On earlier attempts I could not get any pleasing sounds out of SC, but sending MIDI notes to my keyboard/piano or a Zynthian synth solves that. I could even grab some of the SC patches that FoxDot uses.


I want to build Raspberry Pis to use as a MIDI auto-accompaniment player in the Yamaha style (http://www.1manband.nl/ombl/index.htm), with a sampler like Zynthian. How can I get this?


I have no plans for integrating the 1manband software in Zynthian, but you are invited to do such an integration :wink:



I have plans for integrating the 1manband software in Zynthian.


As far as I can see it doesn’t have a display mechanism, which should help avoid issues.
I like the use of the top key as an Attention (or mode) key to operate control functions.

Your main concern would be getting the MIDI data from the keyboard into OMB and getting the MIDI data out of OMB into the Zynthian, so you would be involved with JACK, which is slightly different from normal Zynthian operation, which turns MIDI messages into audio.
I don’t know how hard OMB works the CPU, but perhaps the quickest approach would be to put it on a separate Raspberry Pi with a MIDI in, run OMB there, and then use the QMIDINet interface on that Pi to talk over an Ethernet connection (the Pi does automatic Ethernet cable crossover, so you would not need a router) to another Pi running Zynthian.
That way you don’t have to play with the complexities of dodging around the ALSA/JACK Zynthian MIDI router world. It would get you going pretty quickly.

It may seem odd to do it this way, but it would probably be the quickest way to get things going.
I don’t know if the cost of a second Pi would be prohibitive, I hope not. I can see the attractiveness of direct integration, but at the moment I suspect the MIDI router is fairly dynamic, so until the mechanism for processing MIDI becomes stable it’s probably best not to get involved unless you REALLY want to get your hands dirty.
I’m not speaking for the project here, just observing what I think is going on.
Quite how jofe is integrating MIDI is best left to him at the moment. It would seem that MIDI processing apps (which OMB seems to be) would need their own conceptual Zynthian component, a bit like audio aux busses, and I don’t know if that is on his mental wish/todo list.

Hope this helps, and if anyone thinks I’m talking rubbish then please say so, with as many rude abusive sound effects as feel necessary . . .