CPU meter


Since I got my Zynthian I have experienced some issues, one of them being the inability to clearly see the CPU usage.

I know I can replace the stereo meter with the CPU meter, but it’s not very practical: there is no percentage to show the usage clearly (MOD-UI does this).
By the way, it would be nice to see the stereo and CPU meters at the same time.

In the webconf we have a CPU temperature meter and a RAM meter, but no CPU meter.

It would be very useful, when using a lot of synths and effects, to know clearly how much processor each layer uses when we add or remove it, and to avoid xruns.



The CPU meter was displayed where the audio display is now, and I suspect the overall feeling is a preference for this, especially as the dynamics are so nicely defined.

It was useful to have the CPU meters but, from a user’s point of view, just seeing xruns is enough.
As you can imagine, screen space on the GUI is limited and, as you say, you can replace the audio display.
The difficulty with the webconf is that it is a much ‘slower’ interface. The temperature (a request of mine from eons ago) is something that really doesn’t change much, but producing a similarly responsive display for CPU or audio across a web connection isn’t easily done without lag or inaccuracy (if I’m wrong I’m sure someone will correct me).

The command line would probably allow you to see the measurements you want, as it is not encumbered by GUI responsibilities…

Sorry if this doesn’t help too much. Others might have a more positive outlook…

I understand everything you are saying, and I am always impressed by the responsiveness of this community, thank you very much for that.

But I think xruns are not enough; we need to know precisely how the processor reacts to different types of layers. Zynthian is not a big computer and can’t perform like one, and the current CPU meter is not very easy to read.
Like I say, MOD-UI does this.
The CPU percentage could sit next to the stereo meter. And yes, as you say, if I have to choose, I prefer to see how the sound reacts.

I have floated the idea of benchmarking each engine to give users an indication of what can be used together. There are complications to such a scheme but I believe it has merit, so I will try to implement something soon. FYI setBfree always uses all its required resources but most other engines use resources only when playing.


I have knocked together a Python script that instantiates a synth, connects MIDI and audio, selects presets, plays notes and monitors JACK CPU usage. It creates a CSV with: engine id, test name, minimum CPU usage for the test, index of the preset that gave the minimum, maximum CPU usage for the test, index of the preset that gave the maximum, and average CPU usage across all presets tested. The tests are:

  • Quiescent CPU usage, i.e. synth loaded but not playing any notes
  • Mono, i.e. playing a single note (C4) for 5s, taking readings every 100ms until 1s after release, and taking the maximum value
  • Poly, i.e. playing a 5-note chord (Cmajor7) for 5s, taking readings every 100ms until 1s after release, and taking the maximum value
  • Crescendo, i.e. playing all 88 notes of a standard keyboard (A0-C8) in fast succession without release, holding for 5s, taking readings every 100ms until 1s after release, and taking the maximum value

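The per-test summary described above (minimum, maximum and average across presets, written out as one CSV row) could be sketched roughly like this. Note this is a hypothetical reconstruction, not the actual script: `summarise`, the engine id and the example readings are all invented for illustration.

```python
import csv
import statistics

def summarise(engine_id, test_name, per_preset_cpu):
    """Build one CSV row from the peak CPU reading of each preset:
    engine id, test name, min, preset index of min, max, preset
    index of max, and the average across all presets tested."""
    lo = min(per_preset_cpu)
    hi = max(per_preset_cpu)
    return [
        engine_id,
        test_name,
        lo, per_preset_cpu.index(lo),    # minimum and the preset that gave it
        hi, per_preset_cpu.index(hi),    # maximum and the preset that gave it
        statistics.mean(per_preset_cpu), # average over all presets tested
    ]

# Example: invented readings for 5 presets of one synth in the "poly" test
row = summarise("http://example.org/lv2/somesynth", "poly", [4.2, 7.9, 5.1, 6.0, 4.8])

with open("lv2_benchmark.csv", "a", newline="") as f:
    csv.writer(f).writerow(row)
```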
I have started it running on a RPi4 with Zynthian stopped but all other supporting services running, e.g. jack, ttymidi, etc. I estimate it will take approx. 3 hours to give results for all (83) Zynthian LV2 synths testing the first 5 presets of each synth.

This script currently only checks LV2 synths. If this proves useful it could be extended to cover other LV2 plugins and the remaining engines.

Note: The results are the current CPU load estimated by JACK. This is a running average of the time it takes to execute a full process cycle for all clients as a percentage of the real time available per cycle determined by the buffer size and sample rate. This is the start of benchmarking which may or may not be useful… let’s see.
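To illustrate what that percentage means, here is the arithmetic behind a single process cycle. The numbers are assumed purely for illustration; in practice the figure comes from JACK itself (e.g. `jack_cpu_load()` in the C API), which also applies the running average mentioned above.

```python
def dsp_load_percent(process_time_s, buffer_size, sample_rate):
    """Time spent executing one full process cycle for all clients,
    expressed as a percentage of the real time available per cycle,
    which is determined by the buffer size and sample rate."""
    period_s = buffer_size / sample_rate  # real time available per cycle
    return 100.0 * process_time_s / period_s

# With 256 frames at 44100 Hz a cycle must complete within ~5.8 ms,
# so a graph taking 2 ms of processing runs at roughly 34% DSP load.
load = dsp_load_percent(0.002, 256, 44100)
```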


Wow, impressive! Let’s see!

Okay - first stab at this… Attached is the raw data in CSV so you can look at it in a spreadsheet or whatever: lv2_benchmark.csv (29.2 KB). The “quiescent” test is the reading before loading the engine and “idle” is the reading after loading the engine but before playing any notes. Remember this is only the LV2 synths, but some interesting points jump out from the data:

  • About a dozen synths burn CPU when idle - these may limit the quantity of concurrent engines you can have loaded
  • reMID and padthv1 can be driven to extreme CPU usage although padthv1 tends to behave if you don’t have 88 fingers (or a hold pedal)
  • Quite a lot of engines have fairly low CPU usage even at full throttle - a good selection to choose from for simultaneous playing
  • Some synths that one might expect to be CPU hungry are not, e.g. tal-noisemaker creates complex sounds but sits below 5% all the time
  • Pianoteq can be quite tame - it is okay when idle if you don’t enable effects and should be okay to use even with other carefully selected engines as long as you limit yourself to just the two hands
  • yc20 seems pretty consistent in that it uses about 34% all the time whatever you are doing with it.
  • synthv1 really depends on the patch which means some form of benchmark indication may be useful during snapshot creation to indicate how the specific patches behave together
  • @wyleu will be pleased to see that Helm (his current favourite) plays nicely sitting well below 10%. This is an example of a complex synth that is well designed.
  • It has often been said that zynaddsubfx is good value and this dataset seems to support that. Again a complex synth that nestles nicely well below 10%
  • dexed needs to be mentioned, being a favourite of so many of you. Just look at those figures - all below 3%. I will defo want to check out its stats for more patches.
  • Obxd behaves itself poking a cheeky head just over 10%. I was expecting worse figures but let’s see how she handles a deeper investigation…
  • surge of course shows its high CPU usage though the few tested presets don’t break the bank. I know from experience that surge is very variable depending on patch so I want to test that fully. Note that using program change you can access lots of presets that are not available via the UI
  • All the drums seem tame as you would expect

Remember that only the first 5 presets were tested (as selected with MIDI program change). I will run a test for a select few engines overnight testing more presets.

This testing is kinda in isolation so may not fully emulate how Zynthian manages engines. We could fold this benchmarking into Zynthian if thought beneficial, but for now let’s look at the data and compare it with real life experience to get a feel for how valid the tests are.

Let me know if you think this is useful and any features we could add or metrics we should be looking at. @Tabula do you think this kind of data may help address the issue you initially raised here and if so do you have any ideas of how the data could be presented to assist in that task?

I have restarted the benchmarking but this time for a limited quantity of synths but for 128 patches on each. It should be finished when I wake up in the morning…


What a good idea! Really informative. I’m impressed.

The good: most of the plugins aren’t that hungry, even in “crescendo” mode.
The bad: some of them can rise to high loads when stressed, while being quiet in “normal” use.

Question 1:
What’s the difference between quiescent and idle?

Question 2:
Will the next step be benchmarking audio effects? :wink::grin:


Quiescent is when no plugin is loaded and should be taken as the background CPU usage of all the other JACK clients (including the test client) and JACK itself. In theory we should subtract this from the other test results to give the actual usage of the plugin.
Idle is when the plugin is loaded but not doing anything, i.e. no notes are being played. This is particularly informative if the engine effectively runs its processing all the time, like some emulators do, e.g. setBfree and yc20 (which uses Faust to emulate a hardware device so is basically running full processing all the time).
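A minimal sketch of the baseline subtraction described above (the function name and numbers are illustrative, not from the benchmark script):

```python
def plugin_only_load(measured_percent, quiescent_percent):
    """Estimate a plugin's own CPU contribution by subtracting the
    quiescent baseline (JACK plus the other clients). Clamped at zero
    because JACK's running average can make the difference slightly
    negative for very light plugins."""
    return max(0.0, measured_percent - quiescent_percent)

# e.g. 12.5% measured during a test against a 3.1% quiescent baseline
load = plugin_only_load(12.5, 3.1)  # approximately 9.4
```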

I think the next step will be to perform some manual correlation between the test results and real life experience with the engines and their presets. I do want to add the other engines but that will require a bit of time and effort, e.g. audio effects may need to be stimulated with some input signal, so I need to figure out how best to do that - probably use a low resource generator, maybe a noise generator. It needs a few extra steps in the script to start the generator, start the plugin, route the plugin, etc. Also, I am not sure how many presets there are for audio plugins, so we may need to start twiddling knobs (automatically). The more variables in the tests, the larger the data set and the longer each test takes. The current synth tests take 25s per preset.

My overnight tests have completed and here are the results: lv2_benchmark.csv (3.1 KB). My first impression of the results:

  • OBXD seems quite tame with most presets staying well below 10%. It looks like it should be able to be used with another engine
  • Calf Monosynth is low usage
  • Dexed is low usage
  • Helm is quite tame
  • Tal Noisemaker is low usage
  • zynaddsubfx is fairly low usage
  • amsynth is a bit higher than I might expect but still fairly tame
  • synthv1 is very variable and can be a hungry beast - choose your patches carefully
  • surge can take over all resources leaving your Zynthian sounding like a crackling, cackling witch but there are many presets that are useable - choose wisely

My overall assessment is that we may be able to give users a guide to each engine’s CPU usage range and its idle usage, which may help them understand how better to use the device, but it may prove more challenging to give an accurate indication / warning of impending doom. I think such an indication would help. If I know that I can use Helm with other synths simultaneously, or synthv1 only by itself but with synthv1 loaded and idle within a snapshot without significant impact on output, then I am better placed to design my performance snapshots. I do aspire to something more user friendly, i.e. not needing an engineering degree to interpret test result data.


I understand Beethoven worked in a pretty similar fashion ! :smiley:

Ok, I understand. Thank you for the explanation

@riban first, wow! thanks for your hard work, amazing job, it is very helpful!

Now I know I can use multiple Dexed instances; it uses very low CPU.

I still think it would be interesting to see the processor usage as a percentage next to the stereo level in the Zynthian software, maybe in the future!
Again, I think your work is amazing; all Zynthian users should see it to make better use of Zynthian’s synths!

Maybe we can squeeze a vertical traffic light indication in the corner. Try submitting a change request in the issue tracker.

I have created a spreadsheet in Google Docs with some benchmark results. This is colour coded to show low (<5%), medium (5-10%) and high (>10%) usage. There is a table each for RPi3 and RPi4 and one comparing the two. As expected, the RPi3 struggles more than the RPi4. I need to tweak the benchmark script to add some different timing - I can see what looks like an erroneously high quiescent CPU reading on RPi3 with Monosynth loaded.
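The colour coding could be automated with a trivial classifier using the same bands (a hypothetical sketch, not part of the benchmark script; the example engines and figures are invented):

```python
def usage_band(cpu_percent):
    """Map a benchmark reading to the spreadsheet's colour bands:
    low (<5%), medium (5-10%), high (>10%)."""
    if cpu_percent < 5.0:
        return "low"
    if cpu_percent <= 10.0:
        return "medium"
    return "high"

# e.g. tag each (engine, max CPU) pair pulled from the CSV
bands = {engine: usage_band(cpu)
         for engine, cpu in [("dexed", 2.8), ("obxd", 10.3), ("yc20", 34.0)]}
```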

This is interesting. I can see that RPi3 can comfortably handle: Monosynth, Dexed, Noisemaker and struggles a bit with zynaddsubfx when pushed. It can handle Obxd for mono and simple poly chords but long sustained notes can cause it to quiver. Surge wasn’t installed on that machine correctly so it didn’t run the test but I expect it will be grim viewing. I am now running a test for setBfree and will add the results but it looks like RPi3 sits between 30-50% and RPi4 between 16-17% which is a lot less and more consistent. These results help us understand how the Zynthian might behave under certain conditions which in turn I hope will guide us on how best to improve the user experience.

I wonder whether the benchmarking discussion should be in a different topic to avoid obscuring @Tabula’s original (and persisting) request?

If anyone can suggest a better place for the test results, a better / different way to present them, a way to automate the process of visualising the data, etc. please share. I fear the manual process I used today is not sustainable. It would be good to be able to run the benchmark regularly, updating results based on current software and hardware versions. We should probably also record the hardware and software configuration of each test, e.g. different soundcards may result in different results. These should ideally be run on a standard Zynthian (which I do not have - maybe I should buy one… if only they were in stock :wink:).


Hi Riban
thank you very much for these benchmarks.

There are some surprises in there: e.g. Monosynth uses less CPU on an RPi3 than on an RPi4 in the mono and poly stress tests!!!

=> Maybe this discussion could be moved to a dedicated thread like “Benchmarking Zynthian”.

Yes, for sure :wink:: from the UI or from the webconf, select “benchmark your Zynthian” and automatically push the results to some kind of web service that will display them nicely. Okay, I know, it is not Christmas time yet :rofl:

Yes, absolutely. This could be helpful for choosing the right hardware, both for all the tinkerers interested in building their own self-made Zynthian and, why not, for the next official Zynthian version.

Welcome to the club!!! Anyway, I will have to wait a bit so that I can be my own Santa for Christmas.