When we build cross-platform music apps and plugins, they mostly target desktop, sometimes iOS, and much less often Android. Now that Android audio latency has improved a lot as of 2026, we can tackle the next problem: Android has no established audio plugin format. Apple has a good ecosystem here, so why not design one for Android?
You may wonder: why can't we simply bring VST3, CLAP, or LV2 to Android? Because it is not that simple. We have a lot of lessons learned (or still being learned) from Apple's AudioUnit v3, along with Apple's efforts on Logic Pro.
Throughout this session we will explain, drawing on past accomplishments, what makes audio plugin functionality tricky to achieve on Android, and how to deal with it. There are many issues: publishing audio plugin products from diverse plugin vendors without being tied to a specific DAW, passing audio and event data between a DAW and a plugin, showing a plugin GUI within a DAW, and so on. We also discuss what is missing in the Android platform itself to achieve full realtime capability within our apps, not just within their own frameworks.
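To give one concrete flavor of the DAW-to-plugin problem above: on Android, a plugin would live in a separate app, so audio and event data would have to cross a process boundary, typically via Binder IPC with shared memory for the audio buffers. The sketch below is purely illustrative; the interface and method names are hypothetical, not taken from any existing format, and an in-process fake stands in for what would really be a Binder proxy:

```java
// A minimal, hypothetical sketch of the host<->plugin protocol a format
// would have to define on Android. In reality the plugin lives in another
// app and these calls cross Binder IPC, with audio in shared memory;
// here an in-process fake stands in so the control flow is visible.
interface AudioPluginService {
    void prepare(int sampleRate, int maxFramesPerCallback);
    void activate();
    // Process frames in place in the (notionally shared) buffer.
    void process(float[] sharedAudioBuffer, int frameCount);
    void deactivate();
}

public class HostSketch {
    // A trivial stand-in "plugin": applies a fixed gain of 0.5.
    static class GainPlugin implements AudioPluginService {
        private boolean active;
        public void prepare(int sampleRate, int maxFramesPerCallback) { }
        public void activate() { active = true; }
        public void process(float[] buf, int frameCount) {
            if (!active) return;
            for (int i = 0; i < frameCount; i++) buf[i] *= 0.5f;
        }
        public void deactivate() { active = false; }
    }

    public static void main(String[] args) {
        AudioPluginService plugin = new GainPlugin(); // would be a Binder proxy
        plugin.prepare(48000, 256);
        plugin.activate();
        float[] buffer = { 1.0f, -1.0f, 0.5f, 0.0f }; // notionally shared memory
        plugin.process(buffer, buffer.length);
        plugin.deactivate();
        System.out.println(buffer[0] + " " + buffer[1]); // prints "0.5 -0.5"
    }
}
```

The catch hidden in `process()` is exactly what the session discusses: a synchronous Binder transaction can block and allocate, so it is not realtime-safe, and plain Android IPC gives us no realtime-safe signaling path out of the box.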
There are many trends in audio plugin development, such as MIDI 2.0 integration (as in the upcoming next-generation JUCE AudioProcessor), CLAP-first development, and AI capabilities such as MCP integration. We discuss what kinds of features a plugin format should and should NOT tackle, taking CLAP as a reference in particular. You will also learn why JUCE cannot serve as a "format" here.
Finally, designing a plugin format is just a milestone, not the goal. We also have to build a plugin "ecosystem", which is often understood as a chicken-and-egg problem. We will discuss this in light of some existing efforts.