A new guitar-to-MIDI plugin for Raspberry Pi 5

Hi all,

I have created a new low-latency guitar-to-MIDI plugin. It works very well with Zynthian. Please try it out and provide feedback. It can be found at https://github.com/anirbanray1981/PiPitch (an LV2 plugin to convert guitar notes to MIDI for real-time performance on Raspberry Pi 5).

Thanks,

Anirban


Hi @anirbanray1981! A warm welcome to the community and what a cool way to introduce yourself! Unfortunately, my attempt to build the plugin fails with a cmake error:

cmake -B build -DBUILD_LV2=ON -DCMAKE_BUILD_TYPE=Release
-- RTNeural -- Using Eigen backend (Default)
CMake Error at CMakeLists.txt:46 (find_library):
  Could not find LV2_ONNXRUNTIME_LIB using the following names: onnxruntime

I tried installing the onnx dev and runtime libs but with no success. I don’t find help in the (rather extensive and impressive) README.

Thanks for pointing this out, @riban. I have added the ONNX runtime under the NeuralNote ThirdParty directory.

Please try the following commands:

git pull --recurse-submodules
cmake -B build -DBUILD_LV2=ON -DCMAKE_BUILD_TYPE=Release
cmake --build build -j$(nproc)

The plugin can be found at build/pipitch.lv2. I have updated the README. Please let me know if you face any other issues.

I built the plugin but it won’t launch within jalv. I just get an error:

error: Failed to instantiate plugin

This warning is issued several times during build:

cc1plus: warning: switch ‘-mcpu=cortex-a72’ conflicts with ‘-march=armv8.2-a+dotprod+fp16’ switch

This is on a RPi5.

Can you please try the pipitch_tune app and see if you are able to transcribe audio? The plugin instantiation failure might be due to an improper runtime library setup.

Can you share the output of the command?

ldd /zynthian/zynthian-plugins/lv2/pipitch.lv2/pipitch_impl_armv82.so

I will fix the compiler flag warning shortly. It can be ignored for now.

pipitch_tune is detecting onset and pitch.

ldd /zynthian/zynthian-plugins/lv2/pipitch.lv2/pipitch_impl_armv82.so
        linux-vdso.so.1 (0x00007ffee7048000)
        libonnxruntime.so.1 => /root/PiPitch/NeuralNote/ThirdParty/onnxruntime/lib-linux-aarch64/libonnxruntime.so.1 (0x00007ffee5dc0000)
        libstdc++.so.6 => /lib/aarch64-linux-gnu/libstdc++.so.6 (0x00007ffee5b80000)
        libm.so.6 => /lib/aarch64-linux-gnu/libm.so.6 (0x00007ffee5ad0000)
        libgcc_s.so.1 => /lib/aarch64-linux-gnu/libgcc_s.so.1 (0x00007ffee5a90000)
        libc.so.6 => /lib/aarch64-linux-gnu/libc.so.6 (0x00007ffee58d0000)
        libdl.so.2 => /lib/aarch64-linux-gnu/libdl.so.2 (0x00007ffee58a0000)
        librt.so.1 => /lib/aarch64-linux-gnu/librt.so.1 (0x00007ffee5870000)
        libpthread.so.0 => /lib/aarch64-linux-gnu/libpthread.so.0 (0x00007ffee5840000)
        /lib/ld-linux-aarch64.so.1 (0x00007ffee7010000)

This is what I see on my setup:

(venv) root@zynthian:~#  LV2_PATH=/zynthian/zynthian-plugins/lv2 jalv https://github.com/anirbanray1981/PiPitch

Plugin:       https://github.com/anirbanray1981/PiPitch
JACK Name:    PiPitch
Sample rate:  48000 Hz
Block length: 64 frames
MIDI buffers: 32768 bytes
Comm buffers: 524288 bytes
Update rate:  30.0 Hz
Scale factor: 1.0
2026-04-01 22:47:36.283792547 [W:onnxruntime:SwiftF0, device_discovery.cc:325 DiscoverDevicesForPlatform] GPU device discovery failed: device_discovery.cc:92 ReadFileContents Failed to open file: “/sys/class/drm/card0/device/vendor”
NeuralNote: SwiftF0 model loaded from /zynthian/zynthian-plugins/lv2/pipitch.lv2/swift_f0_model.onnx
NeuralNote: loaded 5 ranges from /zynthian/zynthian-plugins/lv2/pipitch.lv2/pipitch_ranges.conf
PiPitch: 48000 Hz  [impl: pipitch_impl_armv82]  [ranges: 5, single worker thread]

#CTR> 2#amp_floor=0.300000
#CTR> 7#bend=0.000000
#CTR> 3#frame_threshold=0.400000
#CTR> 1#gate_floor=0.003000
#CTR> 8#max_poly=3.000000
#CTR> 4#mode=4.000000
#CTR> 5#onset_blank_ms=25.000000
#CTR> 6#provisional=0.000000
#CTR> 0#threshold=0.600000



CLI thread created successfully

How does it look on yours?

LV2_PATH=/zynthian/zynthian-plugins/lv2 jalv https://github.com/anirbanray1981/PiPitch
Plugin:       https://github.com/anirbanray1981/PiPitch
JACK Name:    PiPitch
Sample rate:  48000 Hz
Block length: 256 frames
MIDI buffers: 32768 bytes
Comm buffers: 524288 bytes
Update rate:  30.0 Hz
Scale factor: 1.0
error: Failed to instantiate plugin

Thanks for sharing the error. I fixed some build issues. Please pull from git, delete the old build directory, and rebuild.

That did it! I have compiled and tested that the plugin loads and does audio to MIDI conversion. I need to crash so will have a more in-depth play tomorrow.

I notice that audio input routing works but MIDI output routing does not (if the plugin is in an audio+MIDI chain) so we need to fix this bug in zynthian. If we want to replace aubionotes (which attempts a similar job) then we will want to integrate into the core autoconnect code. I will have to assess how good this is and whether it is worth integrating… but for now, thanks for the plugin.


Yes, for now I was hooking up the MIDI output from this plugin to either the MIDI router or ZynAddSubFX’s MIDI input directly using jack_lsp.

Hi @anirbanray1981! This is a cool plugin. It does seem to work better than aubionotes, particularly reaching lower registers. (Aubio cuts out around the low G in standard guitar tuning.) The onset / detection / trigger latency is noticeable in the lower registers but becomes much better from about halfway up the range of a standard-tuned guitar. The input level needs to be carefully adjusted. The plugin may benefit from a monitoring output that shows the detected input level and maybe some other parameters.

I need to read the whole README when I have more time. I played with the controls (monkey testing) today and could not improve detection. There is a “Pitchbend” parameter that I have yet to get to do anything.

Picking has to be very deliberate, percussive and fairly slow, but I can see this providing some users with an alternative input method. (FYI, I have a split-pickup MIDI guitar so am less likely to use this plugin.)

If we decide to replace aubionotes, then this would be run as a MIDI input rather than a chain plugin, although we could offer both.

There are audio outputs. What are these used for? (In zynthian we can bypass the effect which stops MIDI output and passes audio along the chain which some may find useful.)


While patiently observing the development here, I wonder, from reading the README, whether this plugin offers polyphonic audio-to-MIDI conversion: it says it takes monophonic sources but has polyphonic modes.

It does do polyphony. It uses some clever neural logic to detect the incoming signals and trigger separate MIDI notes. It isn’t perfect…


Looking forward to this. Maybe I can soon delete my ebay search task for Eventide H90 which I wouldn’t dare to buy anyway.


Have you tried one of the harmonizer audio plugins in zynthian which work purely in audio, not MIDI?

Not yet, and I really should. I will also check soon whether we have the MOD Devices Gsynth plugin available, which seems to output pitch CV data. But the reason I’m hoping for this is the H90’s excellent polysynth algorithm, which would be best imitated by outputting proper polyphonic audio-to-MIDI data to a synth chain, I guess.

By design, to pass MIDI from an audio-to-MIDI processor on to a MIDI chain, the a2m processor must be the last processor in the chain, i.e. post-fader (because the audio mixer is itself a processor). So to get an a2m processor to work you must:

  • Add a MIDI + Audio chain.
  • Add the audio to MIDI processor to this chain.
  • Move the a2m processor to the end of the chain.
  • Click the “MIDI Output” box in chain manager and send MIDI to the target chains.

I can see the logic of why we coded it this way (if you want to send from a processor to another chain, it should be at the end of that chain), but in service I am less sure it is the right workflow. I may review this.

Anyway, there is a way to add this plugin into a chain and for it to fully work without any faffing around with manual routing (outside the UI).

[Edit] I saw there was a bug in the existing code so I fixed that and removed the limitation, so that now an audio-to-MIDI processor may be anywhere in the audio chain. I have amended the process listed above.

[Edit] I have simplified the process:

  • Add audio to MIDI processor to audio chain.
  • In chain manager, click the “MIDI Input” box for a MIDI or Instrument chain.
  • Select the MIDI input from the Audio to MIDI section of the list.

    Any audio processors with a MIDI output should appear in this list.

Thank you. Will this be included in the next build?

Regarding the other questions:

There are audio outputs. What are these used for?

The audio output from this plugin is off (silent). I can take it out.

The Goertzel Mono and Poly modes are the preferred ones for mono and poly use, but they only apply to the RPi 5. The bend only works in Swift Mono mode. The other modes can be ignored; they were developed organically. I can remove them from the plugin.ttl.