I think for my immediate next steps on Zynthian, I should probably prioritize learning how to use that new session recording feature which was mentioned earlier, so that I can make private / unlisted YouTube videos with a voiceover describing exactly what I see happening and what I’d expect or prefer to see happening instead.
You can get any sort order you want from sorted; just provide a key function. Under the hood it’s Timsort, which is probably one of the most optimized sorting algorithms on the planet. Indeed, Java nicked it for their implementation.
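For example, a quick sketch (the port names in the list are just made up):

```python
# sorted() accepts a key function that extracts a comparison value per item;
# the underlying algorithm is Timsort, and the sort is stable.
ports = ["MIDI 2", "midi 10", "MIDI 1"]

# Sort by length first, then by the string itself, just to show arbitrary keys work:
print(sorted(ports, key=lambda s: (len(s), s)))
# ['MIDI 1', 'MIDI 2', 'midi 10']
```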
There is a very good article out there somewhere by Tim Peters in which he uses sorting as a textbook example of why you should use sorted over other mechanisms.
Another bit of Tim Peters’ contribution…
The Zen of Python.
Beautiful is better than ugly.
Explicit is better than implicit.
Simple is better than complex.
Complex is better than complicated.
Flat is better than nested.
Sparse is better than dense.
Readability counts.
Special cases aren’t special enough to break the rules.
Although practicality beats purity.
Errors should never pass silently.
Unless explicitly silenced.
In the face of ambiguity, refuse the temptation to guess.
There should be one-- and preferably only one --obvious way to do it.
Although that way may not be obvious at first unless you’re Dutch.
Now is better than never.
Although never is often better than right now.
If the implementation is hard to explain, it’s a bad idea.
If the implementation is easy to explain, it may be a good idea.
Namespaces are one honking great idea – let’s do more of those!
I remember Perl, shudders…
Even though I use Python where it seems the appropriate tool for the job, I’m not a fan of using whitespace to express a superstitious fear of brackets, especially when the same language uses a ton of brackets to do other things.
But the reason I’m asking to use more than just plain sorted is because plain sorted gets an incorrect result of ABCabc where a correct result would be AaBbCc: so it’s a matter of getting the correct result, not of efficiency. But if you really want to see it implemented as a key function instead of double sorting then I can go looking for the correct key function.
I’ve done a little bit of work elsewhere on sorting long lists, and different sort keys on different fields can be very useful. Easy sorting by date/time, characters (both case sensitive and case insensitive), lengths, file formats, metadata, and engines, in both directions, all might be relevant under different scenarios. Making the selection of these available in an easily understood fashion that adheres to the basic GUI functionality can make for a flexible and effective interface, because most search functions require rather a lot of screen real estate and become ridiculously complicated very quickly.
Yeah I actually asked about this on the Python Discord today and based on some of the discussions on there, it looks like the best course might just be to use casefold as the key function and call that good enough.
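For reference, that looks roughly like this (reusing the ABCabc example from above):

```python
letters = list("ABCabc")
print(sorted(letters))                    # ['A', 'B', 'C', 'a', 'b', 'c']
print(sorted(letters, key=str.casefold))  # ['A', 'a', 'B', 'b', 'C', 'c']
```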
Actually I think sorting the numbers is probably more important than lower/upper case. If you have a multiport device like the MidiKlik with more than 9 ports then you get a list like this:
- MidiKlik 4x MIDI 1
- MidiKlik 4x MIDI 10
- MidiKlik 4x MIDI 11
- MidiKlik 4x MIDI 12
- MidiKlik 4x MIDI 2
- MidiKlik 4x MIDI 3
…
This is very wrong! There is a solution on Stack Overflow which uses regex and looks good.
Yeah that is doing the thing I had suggested about treating groups of digits as numbers.
I think that’s good, but to combine it with the case insensitivity, where it resorts to just returning text, it should actually return a tuple like (str.casefold(text), text). What that does is sort case-insensitively first, but then fall back to sorting by case if there’s a tie, making the result deterministic.
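Something along these lines, as a sketch of the combined key (the port names are just examples, and the regex split is the usual natural-sort trick rather than the exact Stack Overflow code):

```python
import re

def natural_casefold_key(text):
    # Split on runs of digits; with the capture group the result alternates
    # text, digits, text, ... so compared elements always line up by type.
    parts = re.split(r'(\d+)', text.casefold())
    key = [int(p) if p.isdigit() else p for p in parts]
    # Keep the original text as a tiebreaker so names differing only in case
    # still sort deterministically.
    return (key, text)

ports = ["MidiKlik 4x MIDI 10", "MidiKlik 4x MIDI 2", "MidiKlik 4x MIDI 1"]
print(sorted(ports, key=natural_casefold_key))
# ['MidiKlik 4x MIDI 1', 'MidiKlik 4x MIDI 2', 'MidiKlik 4x MIDI 10']
```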
No need! The text is changed to lowercase. The sort is done with the regex to handle numbers, and the returned sorted values carry the index of the port. We then use the index of the port to recreate the user-friendly name with its case restored. It works!
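If I follow, that’s roughly this shape (just a sketch; I haven’t seen the actual code, so the helper name and sample data here are made up):

```python
import re

def _natural_lower_key(name):
    # Lowercase, then split digit runs out so they compare as numbers.
    return [int(p) if p.isdigit() else p for p in re.split(r'(\d+)', name.lower())]

# friendly_names: original, case-preserved port names, indexed by port number.
friendly_names = ["MidiKlik 4x MIDI 10", "midiklik 4x MIDI 2", "MidiKlik 4x MIDI 1"]

# Sort the *indices* by the lowercased natural key, then rebuild the list of
# user-friendly names in that order with their original case intact.
order = sorted(range(len(friendly_names)), key=lambda i: _natural_lower_key(friendly_names[i]))
print([friendly_names[i] for i in order])
# ['MidiKlik 4x MIDI 1', 'midiklik 4x MIDI 2', 'MidiKlik 4x MIDI 10']
```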
I guess a situation where names differ only in case is pretty rare. Good to know we got something reasonably good for this.
This one got rejected but I’m hoping that the underlying issue could still potentially be addressed some other way.
Hi @BenMcLean ,
I have been trying to figure out a solution to your problem for a long time, and I think I finally figured it out. Breaking down your gear from this post, I would suggest this solution:
Controller side:
Alesis Vortex Wireless 2 - over USB key
Synthesizer side:
Yamaha Reface CP - over DIN MIDI or USB
Yamaha Reface CS - over DIN MIDI or USB
Yamaha Reface DX - over DIN MIDI or USB
Yamaha Reface YC - over DIN MIDI or USB
Zynthian - over DIN MIDI (yes, as a synthesizer, not as a MIDI processor)
And a Black Magic Box that connects the controller with the synthesizers: a MIDI processor and hub.
The following products are available that work standalone without a computer.
- BomeBox - this needs a USB hub to connect the Yamaha Reface units and the Alesis Vortex Wireless 2, and the Zynthian can be plugged in by DIN MIDI. With the built-in MIDI Translator Pro engine you can send SysEx to the Yamahas.
- ESI M4U eX - I’m not sure how it handles SysEx. But it is possible to connect all your equipment. I have no experience with how routing is set up here.
- iConnectivity mioXM - this has RTP-MIDI over the network.
- Or build something of your own. Here I would combine a Raspberry Pi 4 and a Midihub from Blokas and run Patchbox OS on it.
I have tried options 1 and 4.
Thanks for the info! Definitely some interesting possibilities in there. However, since I already spent the money to get a Zynthian v5 kit and have it assembled, I’d be more inclined to either wait for the Zynthian OS to catch up to this use case or to try a different operating system on the Zynthian v5 hardware I already have than to purchase a new box.
I’ve mentioned before that I already have a Raspberry Pi Zero setup to just act as a MIDI host/router, which I explain in more detail in this post: Before I throw my Zynthian out of the window - #27 by BenMcLean
What I was really hoping for the Zynthian to eventually do for me – beyond the signal routing that the Raspberry Pi Zero can do – is to save and load presets for the Reface keyboards since, while they all accept MIDI presets, most of them don’t have any onboard memory and the DX’s onboard memory is very limited.
Before I knew the Zynthian v5 kit existed, I was doing a ton of research online to try to figure out how I could connect a touchscreen to a Raspberry Pi Zero to make this simple MIDI host/router into a preset selection machine as well. But when I saw that the Zynthian v5 kit existed, I realized that this was probably the best way to get a small device with a touchscreen that I could potentially hack if I didn’t like the official OS for it. So if waiting for the Zynthian v5 to improve (and participating in testing and maybe even some development myself) doesn’t work out, I’d be more inclined to just swap out the Zynthian OS for a different OS that does what I need rather than to invest in more hardware which might also not do what I need. But even that would be a last resort if the Zynthian OS doesn’t eventually work out.
The hardest thing about really getting started on development is getting the dev environment initially set up, including figuring out a way to do initial testing on your changes and iterate on them quickly. For me, that usually takes way, way more time and effort than any actual coding. In my experience, coding is the easy part, while all the other non-coding aspects of even getting started are the hard part.
I already know Git pretty well and I know how to make a newprogram.py file and start writing Python code with Visual Studio Code to make little standalone Python scripts, but I haven’t worked with a larger-scale Python project like Zynthian OS before: especially not one which involves so many different repositories. Most stuff I’ve worked on is either Java or C# and only one repository.
Would it be possible for someone to maybe do a live screen share with me to help me out on how to get started with Zynthian development? God only knows how long it would take to get through it without at least some live hand-holding because the back-and-forth over forums takes ages.
I know how to fork the repos on Github, how to clone them to my local storage and how to open their root folders in Visual Studio Code … then what? What would be the procedure to actually build it and test it? That’s what I’m hoping someone could walk me through personally.
I guess another question is: Can I use my Windows laptop for this or would it be easier to use a dedicated Linux computer? Or is Visual Studio Code actually the wrong IDE to be using and I should instead be using PyCharm or IDLE or is this only for emacs zealots or what?
I’m too busy right now to give a full answer, but I use VS Code on a Chromebook to develop, connecting it to the Zynthian via SSH. I run the Python debugger this way. You can do the same with Windows, I think. Maybe we could talk about it at Zynth club.
This is a very simple solution. Do I understand correctly that the Alesis Vortex Wireless is connected via USB key and all the Yamaha Refaces as well?
I would extend it in software with mididings. If the Alesis Vortex Wireless 2 can send Program Changes, it can be configured to switch to the synthesizers of your choice and also send the necessary SysEx messages. Somewhere I have notes on how to install all the dependencies; it’s a bit of a pain.
And for automatically connecting MIDI ports I would install amidireminder.
Well, in the end, the Zynthian can also be connected, for example by the ESI USB MIDI cable converter.
I’m guessing you have your Chromebook hacked to run your own Linux? Because my Chromebook still has the x86 Android Google gave it and I don’t think that runs VS Code?
Don’t know what Zynth Club is. Maybe because it has the same first rule as Fight Club?
That is correct. I don’t currently use MIDI DIN cables for anything: it’s all USB.
Definitely some interesting software for me to investigate! I wish I knew about these a year ago! Thanks!
The Vortex can be configured to send Program Change signals but not custom sysex messages – or at least not by the official software for preset creation provided by Alesis. I would really like to get into hacking the Vortex presets with my own code because the official Alesis preset creation software is extremely awkward to use (bad GUI) but I’m not sure how to do it.
Here is one source of examples for mididings:
From the generators you can use SysEx(sysex).
You can send a Program Change from the Vortex; in mididings, set the scene to the corresponding Program Change number and generate SysEx messages and other things.
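As a rough sketch of how mididings scenes plus the SysEx generator could be wired up (the port names, scene numbers, and SysEx bytes are placeholders, and the exact Scene/SceneSwitch/init_patch signatures should be checked against the mididings documentation and examples):

```python
# Minimal mididings sketch: a Program Change from the Vortex switches scenes,
# and each scene routes to one Reface and pushes a SysEx message once on entry.
from mididings import *

config(
    backend='alsa',
    client_name='vortex_router',
    out_ports=['reface_cp', 'reface_dx'],   # placeholder port names
)

run(
    control=Filter(PROGRAM) >> SceneSwitch(),   # scene number taken from the Program Change
    scenes={
        1: Scene("Reface CP",
                 Output('reface_cp', 1),
                 # init_patch runs once when the scene is entered; SysEx bytes are placeholders
                 init_patch=SysEx([0xF0, 0x43, 0x00, 0xF7]) >> Port('reface_cp')),
        2: Scene("Reface DX",
                 Output('reface_dx', 1),
                 init_patch=SysEx([0xF0, 0x43, 0x00, 0xF7]) >> Port('reface_dx')),
    },
)
```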
Nope! I run the Linux subsystem that (later) Chromebooks have. Mine is ARM based but that doesn’t matter.
I couldn’t possibly comment but you may find clues in the forum.