Zynthian powered live experimental generative music

Hi! So I’m currently running the first actual test of my long-time generative project, which I decided to now actually get off the ground. You can check it out here: https://www.youtube.com/watch?v=CyBrDaXJ4Ks
I’m going to keep the stream running until something craps out ;D I’m not 100% happy with everything, especially the occasional crackles I’m getting, but more or less it’s a successful first effort. The audio side is 100% Zynthian (well, my hacked together Zynthian) fed by MIDI from a machine vision board. Hopefully it’ll sound different at different times of the day. It’s not very cheerful. Comments welcome. This isn’t my day job, so no worries if you don’t like it :smiley:

6 Likes

Now this is the sort of thing I like to see and hear!

It’s become my room ambience for the foreseeable…

What have you used as layers in the zynth…?

Do you see Xruns when it crackles…?

Thanks! You know, it makes me surprisingly happy to hear of even one person who enjoyed it :smiley: I didn’t get the chance to watch the screen for Xruns, unfortunately. It’s all in MOD UI. The pedalboard is packed with instances of Dexed, plus chorus, granular, delay and phaser effects, and of course a reverb, some EQ, compression and a limiter. Most have something being adjusted constantly via MIDI. Every sound starts as a Dexed timpani preset.

I’m aiming for a 24-hour “piece”, but didn’t quite get there this time. Next time I’m going to adjust the visuals and some settings on the note-generation side. I’m also going to move the sound to a headless Raspberry Pi 4, which I hope will solve the crackling.

This is really cool. The video is odd though - lots of banding. Is that a desired effect or something wrong?

I may use this as my background music whilst at work tomorrow.

Thanks! Yeah, that’s on purpose. I may have overdone the compression on the video, though.

Nice! Can you tell us more about how you generate MIDI out from the video capture?

Thanks! It’s a project I’ve been iterating on, on and off across different platforms, for the past seven years, but I’ve never put anything up for listening before. It’s not very complicated, really. It’s actually a separate device from the feed you see, but positioned as close to it as possible. It tracks the median shade of the image, splits the pixels into a few datasets using a histogram and tracks their median values, and also tracks contrast and the rate of change in the video. Then I fiddle around with those numbers in ways that make musical sense to me.
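(Not the author's actual code - just a minimal sketch of the kind of per-frame analysis described above, assuming grayscale frames as NumPy arrays. The band split, the contrast measure (standard deviation), and the shade-to-note mapping are all illustrative choices, not the real project's.)

```python
import numpy as np

def analyze_frame(frame, prev_frame=None, n_bands=3):
    """Extract the statistics the post describes from one grayscale frame
    (uint8 ndarray): overall median shade, per-band medians from a histogram
    split, contrast, and rate of change vs. the previous frame."""
    flat = frame.ravel()
    overall_median = float(np.median(flat))

    # Split the pixel population into a few bands using histogram bin edges,
    # then track the median shade inside each band.
    edges = np.histogram_bin_edges(flat, bins=n_bands, range=(0, 255))
    band_medians = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (flat >= lo) & (flat <= hi if hi == edges[-1] else flat < hi)
        band = flat[mask]
        # Empty band: fall back to the band's center so output stays stable.
        band_medians.append(float(np.median(band)) if band.size else (lo + hi) / 2)

    # "Contrast" here is just the standard deviation of pixel values.
    contrast = float(flat.std())

    # Rate of change: mean absolute per-pixel difference vs. the last frame.
    if prev_frame is None:
        rate = 0.0
    else:
        rate = float(np.mean(np.abs(frame.astype(int) - prev_frame.astype(int))))

    return overall_median, band_medians, contrast, rate

def shade_to_note(shade, low=36, high=84):
    """Hypothetical musical mapping: darker shade -> lower MIDI note."""
    return int(round(low + (shade / 255.0) * (high - low)))
```

From there, each statistic can be scaled into whatever MIDI range a given parameter wants - the mapping step is where the "musical sense" lives, so it is entirely a matter of taste.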

1 Like

Ok. Well done!
:clap:

1 Like