I want to create a simple stage UI for Zynthian/MOD. After many tests I will now try to use a simple HDMI display (with touch). But I currently don’t know what the easiest/best way to create a simple UI might be. The best option for me would be something in Perl/bash/Python which can use the framebuffer directly (without X11).
Get the osc interface done… finally
Then we can build a tornado-based web UI that connects via WebSockets.
Using an iPad or smartphone.
No need for an additional screen
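A tornado WebSocket endpoint for such a web UI could look like the sketch below. This is a hypothetical skeleton only (the handler name, message format and port are invented, and it assumes the tornado package is installed), not actual zynthian code:

```python
import json

import tornado.ioloop
import tornado.web
import tornado.websocket


class ZynWebSocket(tornado.websocket.WebSocketHandler):
    """Hypothetical handler: pushes engine state to any connected browser."""

    def open(self):
        # Greet the new client; a real handler would push the current state.
        self.write_message(json.dumps({"event": "connected"}))

    def on_message(self, message):
        # A real handler would forward UI actions to the sound engine here.
        action = json.loads(message).get("action")
        self.write_message(json.dumps({"ack": action}))


def make_app():
    return tornado.web.Application([(r"/ws", ZynWebSocket)])

# To serve it (blocks the process, so commented out):
# make_app().listen(8888)
# tornado.ioloop.IOLoop.current().start()
```

An iPad or smartphone browser would then open a page that talks to `/ws` over a WebSocket and stays in sync with the engine.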
If there is an accessible OSC API https://github.com/jean-emmanuel/open-stage-control is a great tool for building quick simple interfaces to interact with some synths/effects/etc. Open Stage Control may be more heavyweight than what you were looking for, but it’s a trade-off between time to make the interface and hardware needed to use the interface.
Hmmm… yes - will try this, but I think this will not be the solution I am searching for…
On stage I don’t want to rely on a smartphone with Bluetooth or WLAN to drive my Zynthian. I think a touchscreen with a simple interface would be best… that’s what I am looking for.
I “finally” read something about OSC.
a) It is and always will be stateless.
This means: forget that I said WebSocket… REST would be enough.
b) It doesn’t have a “query system”. Whatever they mean by that exactly, I don’t see a return mechanism.
Hence no option for us to query the list of all snapshots etc.
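To illustrate the fire-and-forget nature: an OSC packet is just an address pattern, a type tag string and the arguments, with no reply slot in the protocol itself. A minimal encoder in plain Python (the `/zynthian/snapshot/load` address and the port are invented for illustration, not a real zynthian API):

```python
import struct


def _osc_string(s):
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b = s.encode() + b"\x00"
    return b + b"\x00" * (-len(b) % 4)


def osc_message(address, *int_args):
    """Build a minimal OSC packet with int32 arguments only.

    The packet carries no request ID and expects no response, which is
    why a pure OSC API cannot be queried for e.g. a list of snapshots.
    """
    typetag = "," + "i" * len(int_args)
    return (_osc_string(address)
            + _osc_string(typetag)
            + b"".join(struct.pack(">i", a) for a in int_args))


packet = osc_message("/zynthian/snapshot/load", 3)

# Sending is a single UDP datagram (host/port are hypothetical):
# import socket
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("zynthian.local", 1370))
```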
Regarding touchscreen vs. iPad: valid point. But either way, a web-enabled application should be the UI.
If you want to hook up a second screen, we could run a local browser on it. Could the Raspi handle this?
Now I see a REST interface which handles input the OSC way, plus a query part.
We would create our own tornado/Python web page which gives us everything zynthian-ui can do, and more.
And in addition a browser, which can run this new web page or the webconf and render to the HDMI display.
Is that possible?
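The query part could be sketched with Python's standard library alone; the real implementation would live in tornado, and the `/snapshots` route and snapshot names below are invented for illustration:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical snapshot store; the real list would come from zynthian-ui.
SNAPSHOTS = ["default", "stage-set-1", "stage-set-2"]


class RestHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # GET /snapshots answers the query OSC alone cannot: "what exists?"
        if self.path == "/snapshots":
            body = json.dumps(SNAPSHOTS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):  # keep the console quiet
        pass


def start_server():
    """Serve on an ephemeral localhost port in a background thread."""
    server = HTTPServer(("127.0.0.1", 0), RestHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A kiosk browser on the HDMI display would then simply point at this local server, the same way a phone or tablet on the network would.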
And we need a wiki page where we can start the specification for the REST/OSC interface.
Hmmm… here are my experiences with web browsers in addition to MOD-UI on a Raspi. I tried to avoid talking about this, but perhaps my mistakes and problems can help others…
Self-created and optimized SD card image with lots of LV2 plugins and mod-host/MOD-UI.
The Raspi3 with the PiSound worked (headless).
Adding the HDMI screen and an autostarted browser (chromium-browser) seemed to work, but sound creation stopped (perhaps trouble with jack2?). Same with midori as the browser. CPU isolation (2 rt/2 nrt) didn’t help either.
Took a 2nd Raspi and created a special SD image which only boots a chromium-browser (or midori) in kiosk mode. I linked the Ethernet interfaces of the two Raspis together (back-to-back, using a 10.0.0.0/8 network as a secondary address). Now it works, but very slowly - not really usable, although sound generation is stable. With chromium-browser, creating pedalboards in MOD-UI causes problems after some time when disconnecting plugin links, and MOD-UI stops working. With midori this does not happen, but midori is very slow. I also tested this with a HiFiBerry instead of the PiSound - same results.
So:
Browser on a Raspi AND sound generation does not work (for me).
Browser on a 2nd Raspi works but is slow and has problems with MOD-UI.
My next try is to use a UI which uses the framebuffer directly. After some searching I found PGU, which looks promising after some tests. Perhaps someday a combination of jack, mod-host, (mod-ui), PedalPi and PGU will be a solution for a stage-ready instrument (mod-ui only started to create and configure setups).
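For anyone who wants to poke the framebuffer directly before reaching for PGU, here is a minimal sketch that packs pixels for the common 16-bit RGB565 mode. The device path, resolution and colour depth are assumptions - check `fbset` on your own display first:

```python
import struct


def rgb565(r, g, b):
    """Pack 8-bit RGB into a 16-bit RGB565 pixel (a common fb format on the Raspi)."""
    return ((r & 0xF8) << 8) | ((g & 0xFC) << 3) | (b >> 3)


def fill_rect(width, height, color):
    """Return a raw little-endian RGB565 buffer, ready to write to the framebuffer."""
    return struct.pack("<H", color) * (width * height)


# Writing the buffer needs a console framebuffer (not run here);
# 800x480 is just an assumed touchscreen resolution:
# with open("/dev/fb0", "wb") as fb:
#     fb.write(fill_rect(800, 480, rgb565(0, 0, 255)))  # fill the screen blue
```

PGU/pygame do the same thing at a higher level (widgets, events, fonts) via the fbcon SDL driver, so this is only useful as a sanity check that the framebuffer itself works without X11.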