I’m thinking of building a second Zynthian, but I’m having real trouble getting hold of the MCP23008. One thought would be to drop the touch interface from the LCD. That would free up a few GPIOs, so I could drop the MCP23008 altogether. It would make the hardware simpler, but it would permanently disable touch. The encoder-based GUI is a dream to use, but I’m reluctant to drop touch if it might be used later. Are there any plans to introduce touch?
Nice to hear that you want to build a second Zynthian! I would love to see the first one, please. I remember that you were using an HDMI screen … Could you post some photos/video in the forum?
Currently I’m not using the “touch” interface, but I would like to add a 2D controller feature à la “kaos pad”. I don’t know when, but I will do it because it’s fun enough to spend a couple of days fighting with it.
Of course, you can disable the touch without problem, as it’s not a core feature. The core UI is and will be encoder-driven. Anyway, the MCP23008 is easy to get. Try RS-online. If not, I can send you a few. It’s a useful part in many projects. You can use it with an Arduino or any uC/uP supporting I2C …
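For example, from the Pi itself you can talk to it with Python and smbus in a few lines. This is only a generic sketch (assuming the default 0x20 address with A0–A2 tied to GND, on I2C bus 1), not the Zynthian wiring:

```python
# Minimal sketch: reading the 8 MCP23008 inputs over I2C from a Raspberry Pi.
import smbus

MCP23008_ADDR = 0x20   # default address with A0-A2 tied to GND
IODIR = 0x00           # I/O direction register (1 = input)
GPPU  = 0x06           # internal pull-up register
GPIO  = 0x09           # port register

bus = smbus.SMBus(1)                              # I2C bus 1 on current Pis
bus.write_byte_data(MCP23008_ADDR, IODIR, 0xFF)   # all 8 pins as inputs
bus.write_byte_data(MCP23008_ADDR, GPPU, 0xFF)    # enable internal pull-ups

state = bus.read_byte_data(MCP23008_ADDR, GPIO)   # read all 8 pins at once
print("GPIO state: {:08b}".format(state))
```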
I’ll post some pictures soon. The first Zynthian is a pile of cables & PCBs that I’m squeezing into a box. It’s using a Waveshare TFT that I had lying around. My ultimate aim is to use a larger screen with HDMI, but I’m getting used to the regular GUI before I try scaling it up to a larger resolution.
I’ve got some MCP23008 on order. The DIL ones are hard to find, but SMT is easier to source & much harder to solder…
I was wondering if there has been progress on this. The X-Y pad idea is great, but I also think having touch on the main UI could be a good feature.
I’m still building my Zynthian, and I have a screen + HiFiBerry hooked up, but no encoders yet. Hence, with touch support, it would be nice to be able to play around at this stage.
Is there a design, UI sketch or otherwise on how the touch interface should respond in “normal” mode?
I’m happy to contribute code as well. I suppose one would have to come up with a scheme that emulates what the 4 encoders would do, while also being somewhat intuitive to use from what’s shown on the screen. Any thoughts?
The touchscreen is enabled by default in the “Gorgona” SD image. If you are using the “official” Adafruit PiTFT 2.8″ touchscreen (or a compatible one), you should be able to select options from the menus and go forward. But you can’t go back and you can’t move the controllers. Some additional work is needed to get a fully functional touch interface.
If you are using a different touchscreen, then you have to load the driver and configure the X11 system by hand.
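In case it helps, on a Raspbian-based system that usually means dropping an “InputClass” section into /etc/X11/xorg.conf.d/. The MatchProduct string and the calibration values below are only an example for an ADS7846-based resistive panel; yours will differ:

```
Section "InputClass"
        Identifier      "calibration"
        MatchProduct    "ADS7846 Touchscreen"
        Option          "Calibration"   "200 3900 200 3900"
        Option          "SwapAxes"      "1"
EndSection
```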
Regarding the X-Y-pad controller mode, it’s still on the TODO list, but it’s not a very difficult task. It should be done very soon.
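Just to sketch the idea (this is not the actual Zynthian code; the names and the scaling are made up), a Tkinter canvas can already map the pointer position to two controller values:

```python
# Rough X-Y pad sketch: map the pointer position on a canvas to two
# controller values in the 0-127 range.
import tkinter as tk

WIDTH, HEIGHT = 320, 240

def on_drag(event):
    # Clamp the pointer to the canvas and scale to 0-127
    x = min(max(event.x, 0), WIDTH)
    y = min(max(event.y, 0), HEIGHT)
    ctrl_x = int(127 * x / WIDTH)            # e.g. filter cutoff
    ctrl_y = int(127 * (1 - y / HEIGHT))     # e.g. resonance, Y axis inverted
    print("X value:", ctrl_x, "Y value:", ctrl_y)   # here you would send MIDI CCs

root = tk.Tk()
pad = tk.Canvas(root, width=WIDTH, height=HEIGHT, bg="black")
pad.pack()
pad.bind("<Button-1>", on_drag)    # touch taps/drags arrive as mouse events
pad.bind("<B1-Motion>", on_drag)
root.mainloop()
```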
A touch surface is quite similar to a mouse, almost the same. In the X11 system both are “pointing devices”. The difference is what you do with the “events”.
Currently only “select from a list” events are implemented in the Zynthian UI.
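As a minimal illustration of that event handling (a standalone Tkinter sketch, not the real Zynthian source), a tap on a Listbox can be turned into a list selection like this:

```python
# Turn a touch tap (delivered as a mouse click) into a list selection.
import tkinter as tk

def on_tap(event):
    index = listbox.nearest(event.y)         # list entry under the finger
    listbox.selection_clear(0, tk.END)
    listbox.selection_set(index)
    print("Selected:", listbox.get(index))   # here the UI would "enter" the item

root = tk.Tk()
listbox = tk.Listbox(root, font=("Helvetica", 14))
for name in ("ZynAddSubFX", "FluidSynth", "setBfree"):
    listbox.insert(tk.END, name)
listbox.pack(fill=tk.BOTH, expand=True)
listbox.bind("<Button-1>", on_tap)           # same binding for mouse or touchscreen
root.mainloop()
```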
I am planning on using an HDMI 7-inch display. The touch interface is sent to the Pi via the USB connector that powers the display.
As your software did not use the touch screen, I was not going to plug the USB connector into the Pi, but into a USB power supply just to power the display.
When connected, the touch screen works by default and presents itself as a mouse, so I need to understand how this works and then (I guess) modify system parameters to make it work.
You have to configure the touchscreen to work with X11. That’s all. But as I told you, touch/mouse events are not used by the Zynthian UI. Only “select from a list” is implemented. To be functional, you should implement two more events (there is a rough sketch below the list):
Back
Move rotary
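Something like this, as a standalone Tkinter sketch (the bindings and handler names are only illustrative, not the real Zynthian handlers): “back” bound to a right-click or a dedicated screen area, and “move rotary” to a vertical drag:

```python
# Sketch of the two missing events: "back" and "move rotary".
import tkinter as tk

value = 64      # current controller value (0-127)
last_y = 0

def on_back(event):
    print("back: return to the previous screen")   # would call the UI's back handler

def on_press(event):
    global last_y
    last_y = event.y

def on_rotary_drag(event):
    # Dragging up increases the value, dragging down decreases it,
    # emulating a rotary encoder turn.
    global value, last_y
    value = min(127, max(0, value + (last_y - event.y)))
    last_y = event.y
    print("controller value:", value)

root = tk.Tk()
area = tk.Canvas(root, width=200, height=200, bg="grey20")
area.pack()
area.bind("<Button-3>", on_back)          # right-click (or a screen corner) as "back"
area.bind("<Button-1>", on_press)
area.bind("<B1-Motion>", on_rotary_drag)
root.mainloop()
```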
This is the basic functionality, but there is more. The Zynthian UI supports “normal”, “bold” (B) and “long” (L) clicks on every switch (1-4):
Next Channel => 1
Channel list => 1B
Back => 2
Engine => 2B
Admin => 2L
Load/Save Snapshot => 3/3B
Select/Next Control Screen => 4
Mode Select Control Screen => 4B
Power Off => 4L
If you have a big screen, the easiest way would be to add “buttons” with these functions.
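For instance, a row of Tkinter soft buttons along the bottom of the screen (the labels and the switch() callback here are placeholders for the real switch handlers):

```python
# On-screen "soft buttons" emulating the four Zynthian switches.
import tkinter as tk

def switch(n):
    # Placeholder: in the real UI this would call the same handler
    # that the physical switch triggers (e.g. 2 = Back, 4 = Control).
    print("switch", n, "pressed")

root = tk.Tk()
bar = tk.Frame(root)
bar.pack(side=tk.BOTTOM, fill=tk.X)
for i, text in enumerate(("1: Channel", "2: Back", "3: Snapshot", "4: Control"), start=1):
    tk.Button(bar, text=text, command=lambda n=i: switch(n)).pack(
        side=tk.LEFT, expand=True, fill=tk.X)
root.mainloop()
```

Bold and long clicks could be emulated by measuring how long a button is held, or simply with extra buttons.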
Regarding the controllers, you should enable a “capture area” for each one and implement the capture logic. Currently the controllers are displayed as an “arc” with a “value”, but there is code to display “bars” too. It can be changed easily. Perhaps you could use the “bar” area to capture the “mouse”.
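As a rough illustration of that idea (again a standalone Tkinter sketch with made-up names, not the Zynthian code), the bar can simply follow the finger/pointer position:

```python
# Use a controller "bar" as a touch capture area: the value follows the
# vertical position of the finger/pointer inside the bar.
import tkinter as tk

BAR_W, BAR_H = 60, 200

def set_from_pointer(event):
    # Top of the bar = 127, bottom = 0.
    y = min(max(event.y, 0), BAR_H)
    value = int(127 * (1 - y / BAR_H))
    bar.coords(fill, 0, y, BAR_W, BAR_H)    # redraw the filled part of the bar
    bar.itemconfig(label, text=str(value))

root = tk.Tk()
bar = tk.Canvas(root, width=BAR_W, height=BAR_H, bg="grey20")
bar.pack()
fill = bar.create_rectangle(0, BAR_H, BAR_W, BAR_H, fill="orange", width=0)
label = bar.create_text(BAR_W // 2, 12, text="64", fill="white")
bar.bind("<Button-1>", set_from_pointer)
bar.bind("<B1-Motion>", set_from_pointer)
root.mainloop()
```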