Following that procedure sort of worked (the buttons appear and are clickable, but they just don't trigger any action).
However, according to this post from Riban, I was expecting this to work by enabling the touch interface in the UI options, without having to follow Pavel's procedure.
Update and try it. It is much better and you should be able to do everything in zynthian purely with the touchscreen if you enable touch navigation in webconf: INTERFACE->UI Options.
So is my assumption wrong that the new touch UI is now in oram?
Could it be that checking "enable touch navigation" in the UI options didn't correctly add the required setting to the configuration file?
I think the line export ZYNTHIAN_TOUCH_KEYPAD="V5" is entered in the configuration file when enabling "enable touch navigation".
Initially I could not get the touch input to work, but then I noticed that my display was configured differently and inverted/rotated… After correcting that, it worked…
I do use Pavel's repository… for the UI.
Did not check whether it works with the normal/default repository…
If you want, I can send you my zynthian_envars.sh, but I just manually added 'export ZYNTHIAN_TOUCH_KEYPAD="V5"'.
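In case it helps, the addition looks like this in zynthian_envars.sh (only the export line matters; the comment is just an annotation):

```sh
# zynthian_envars.sh (excerpt) – line added manually, as described above
export ZYNTHIAN_TOUCH_KEYPAD="V5"
```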
Hope you get it working! It is GREAT… ow… do not want to tease you…
You will get it to work! I did also… lots of fumbling in the dark…
I've been lurking around for some time, and I'm very happy with all the huge improvements that came with oram (thanks to @jofemodo, @riban and the community).
This touch UI is one of the latest and most appealing of those improvements, as it allows fully testing out zynthian without the kit or custom hardware.
But yes please, having your zynthian_envars.sh would be very much appreciated.
There are a couple of different threads of design here.
One of them is to present the 20 V5 buttons on the screen alongside the normal zynthian GUI and to implement the V5 workflow. This is not an officially supported configuration and requires users to follow its designer's instructions to make it work. It is a community-supported configuration.
The other is full integration of the touch UI. This is an official zynthian configuration that adds full navigation using just touch, without the need for any buttons or encoders. By enabling touch navigation, you can use zynthian oram purely from the touchscreen. Navigation is performed with the topbar and some other gestures, and a row of onscreen buttons is shown at the bottom of the screen only when required. (The onscreen buttons are only needed in two views.) Selecting the "Touch" option in the user documentation on the wiki shows this workflow.
In many ways, the touch navigation configuration removes the need for the V5 button emulation. You can do everything in a considered workflow manner so the touch UI becomes a first class user experience.
Hi. As @riban already explained, you (or this thread, actually) are mixing two different things together. What he speaks about is the standard capability to control Zynthian just by touching its standard UI, without any physical buttons or encoders. There is also an option which makes four additional touch buttons appear on some screens.
My experimental implementation shrinks the whole standard UI (which can still be used with all its standard capabilities) and adds 20 touch buttons around it, emulating the 20 physical buttons of the V5. To actually make the 20 touch buttons work, you also have to enable their V5 actions as I explained in the second post.
All of this should be achievable much more easily, of course, but the zynthian guys first need to decide how (and whether at all) this should be integrated into the UI properly. The UI should probably be better adapted to such a concept if this approach were to be supported more generally in the future.
No, there is one option, to enable touch navigation. This changes how the screen reacts to touch, giving more navigation options, and only where needed adds a button bar at the bottom of the screen. The button bar is only shown in two screens, not all the time.
Enabling @wanthalf's alternative 20 on-screen button interface (V5 emulation) requires you to follow their instructions at the start of this thread. Since this thread started, there has been significant enhancement to the standard touch UI (with the touch navigation option enabled), so you may find the wonderful option offered by @wanthalf to be unnecessary. Basically, I wonder whether the core code enhancement supersedes the need for the V5 emulation?
I think it is a matter of preference. Personally, I do love the additional buttons that Wanthalf's adaptation makes possible.
I really love how flexible the Zynthian application is!!
Maybe not necessary, but still useful? (Or do you dare to tell @jofemodo his lovely buttons are "unnecessary"? ;)) Personally, I also like buttons, and physical ones are much better than touch buttons, of course!
Actually, I was quite surprised how well Zynthian could be controlled just by the touchscreen even before you did all the latest improvements (which I unfortunately haven't been able to test yet)!
Maybe, the larger displays could be used for even more than just the static buttons I have tried to add? The current UI is designed for rather small displays up to 5 inches. Perfect for a display embedded in a small device. But it looks a bit clumsy when shown on larger displays - even just the 7" I have. (However, I don’t mean one should design a full-fledged desktop environment just because of that, of course! :D)
Indeed! If people find stuff useful then it is… erm… useful to some people! I guess I am questioning what the official core code does. The UI is very carefully considered and much (robust) debate occurs behind the scenes. We avoid adding functionality unless it has significant benefit that outweighs any increase in code complexity and support effort. I love the idea of the V5 buttons on-screen and think what you have done is fantastic but, if we are designing for larger (than 5") screens, then we want to take a measured approach to identify the problems we are solving or the features we are adding. The main concern I have with on-screen touch buttons is the lack of physical registration, i.e. you have to be looking at the thing to control it (which is of course true of any touch UI), whereas the V5 physical buttons allow the user to work blind by locating the registration nipple and then using muscle memory to locate the other keys.
Touch UI has been a second (or third) class citizen from the start. I think the touchscreen on the V3/V4 was more a case that “we can” than “we should” and has not had the attention it needs until now. I feel that oram will provide a fully functional touch-only UI for the first time. (I would say that - I have coded a lot of it!) Do try it and feed back.
Just for curiosity (unless it's a secret): how many people are actually involved in the (core) development of Zynthian? At first I perceived it as a pure community project discussed just here, but you often speak about things "behind the scenes". And after a while I can see that the most important things are not discussed here. In this discourse it is not easy to identify the "core team members" (but it would be useful for newcomers). Fernando is obvious, you too (after a while). Me and Chris and most of the others are obviously "the community" :D… Are there also other core team developers involved that are not active in the discourse here?
@jofemodo is God. Below him are some significant contributors, who include @mheidt (who wrote a lot of webconf) and @riban (that's me!), who has written many modules and worked on the code a lot. I would probably describe them as the core development team, although the term is used informally and changes depending on what is being discussed, e.g. @oscaracena is a significant contributor in the ctrldev area. (I am not purposefully excluding anyone here. It would take too long to list every contributor and place them in such a hierarchy - there isn't really one!)
This is a project run by @jofemodo, with an associated business supplying hardware; everyone else forms the community, who contribute in many and varied ways. Maybe I simply promote myself to a sub-deity level to make myself feel more important than I am… but I do a bit around here.
It’s all rubbish. I’ve got them all chained up in a basement doing my bidding.
The development core is @jofemodo & @riban. @jofemodo, as was said, is God. It's basically the Python model, and they decide cos someone has to approve the PRs.
The best way of interacting is via the Reports page on the webconf home page as this provides both Bug Reports and Feature requests. Please get them in, because it’s a virtuous process that benefits everybody.
Jofe's approach is a rather fine, pure interpretation of Open Source. It's all there. His value-add is ultimately the community, and we aim to develop that element to the best of our limits. We welcome everyone, simply for their interest in zynthian. We are interested in what such people are interested in. So we will discuss competitors and simply talk of Music. There are moderators, but in comparison to a Facebook page it's the very occasional weed and gentle tidy of threads that can and do meander.
From a highly technical group of tinkerers the user base takes on different characteristics every time a version changes. The overlap from technically literate ( yes we have many of those ) to musically adept is glorious. We have watched coders learn scales and musicians learn code.
It all contributes to what can be done with a Raspberry Pi, Jack and a bit of code.
The strain at the moment is probably most directed at documentation, and I would actively encourage you to grab a chunk of documentation that describes an area you haven't used, perhaps, and see what you make of it, then report your reaction via the report button, probably as feature requests rather than bugs. V4 & V5 are different beasts, and in a perfect world we would document first and code second.
There is also an informal bunch of ne'er-do-wells from all round the planet that meet up on a video chat which kicks ideas around, with some of them obviously growing wings whilst others are quickly killed and lost somewhere in @MrBroccoli's shed. It's informal indeed; now we have @riban nicely ensconced, nicking marbles, we could try one next week?
Frankly, It’s all quite nauseating in its camaraderie really.
IMO the V5 buttons are necessary, and even their implementation as onscreen buttons, as recommended by @wanthalf, is a big improvement over what is currently available in oram. Buttons provide quick access to important screens that otherwise need several touches with the current oram implementation. Also, without encoders and arrow buttons it is very difficult to navigate by touching options in menus.
Another way of easily improving the user interface is by buying a USB 4x4 keyboard.
They are a tenner on AliExpress and work great. Encoders are still required, but they can be plugged directly into the RPi pins. I tested one last week and they are great.
I know that the V5 has a 4x5 keyboard, but that configuration is rare on AliExpress. 4x6 is more available, but it is almost twice the price of a 4x4, which I believe is enough.
Here is the current keyboard layout that I tested with the 4x4 keyboard; it works great.
I also ordered plastic keyboard stickers to be printed like this, to see how they will look on this black keyboard.
The Alt button is not really the Alt button that is on the V5. It is used to trigger all the alternative actions (bottom labels), as well as to control MIDI recording instead of audio recording. It needs to be pressed together with another key, because this is a keyboard and there are no long presses that you can use as with the V5's push buttons. I found no difference in doing so; if anything it is even more intuitive. The top label is the normal action, while the bottom label is triggered when the Alt key is pressed at the same time. Finally, the arrows can be used with the Alt key to give the same F1-F4 functions. Hence no need for a 4x5 keyboard.
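To illustrate the idea, here is a rough Python sketch (not Zynthian's actual key-binding code; the key names and action names are just placeholders): each key carries a normal action and an alternative action, and holding Alt selects between them.

```python
# Rough illustration of the alt-modifier idea described above.
# Not Zynthian's key-binding code: key and action names are placeholders.

KEYMAP = {
    # key: (normal action / "top label", alternative action / "bottom label")
    "KEY_1":    ("show_mixer", "show_zynpad"),
    "KEY_UP":   ("nav_up", "f1"),
    "KEY_DOWN": ("nav_down", "f2"),
}

def handle_key(key, alt_held):
    """Return the action for a key press, honouring the Alt modifier."""
    actions = KEYMAP.get(key)
    if actions is None:
        return None
    normal, alternative = actions
    return alternative if alt_held else normal

# The same physical key yields two different actions:
print(handle_key("KEY_UP", alt_held=False))  # -> nav_up
print(handle_key("KEY_UP", alt_held=True))   # -> f1
```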
That gave me an idea for another piece of hardware that would be nice to develop: a USB device with a 4x5 keyboard and 4 encoders on top, and an ADAC sound card inside. Such a device could be plugged into any Raspberry Pi running Zynthian software over USB, should not cost more than $100, and would be all that most of us would need. I would like to be able to design this, but I have never done any ADAC-over-USB design.
The latest touch UI gives fast, usually single-press access to many of the most frequently accessed performance screens, e.g. mixer, zynpad, ZS3, control. I'm not disregarding the advantage of dedicated buttons, but I am interested in specific feedback on how the latest touch UI works. Specific criticism with example workflows will help to shape it. I am currently quite pleased with how quickly you can access everything with touch.
Hm, clearly I don't know how to use it. For example, when I am on the mixer screen I need to bold-press encoder 4 to open the menu and then use the encoder or touch to select the sequencer or snapshot option in the menu. I need to investigate how to do it with a single press.
I don't know how to configure HID events, so I recommended a programmable keyboard. Without HID events where we can sense long presses, we need an Alt, Control or Shift key to be able to trigger alternative key functions. I will investigate this further.