Building a Suzuki Omnichord Emulator (part 1)
I've been craving an Omnichord for years. I fell in love with it about fifteen years ago, back when it wasn't impossible to buy one at a reasonable price. I had several opportunities to get one, but I never did, and I really regret it. Ever since it became a "mainstream" instrument, probably thanks to Damon Albarn (there's a great video on YouTube of him explaining how he composed the Gorillaz song 'Clint Eastwood'), the prices have skyrocketed.
Not to mention the official reissue from Suzuki themselves for the modest sum of 800 Euros. A truly "popular" price, wouldn't you say?
I don't want to spend that much money on a hobby that's never earned me anything; in fact, I've always lost money on it!
As a programmer, the most logical solution is to build a software emulator. Sure, some emulators already exist, but none of them seem to cover all the instrument's capabilities. The most complete one seems to be C.ARP for iOS, but from the videos I've seen, it looks more like a modern reinterpretation than a true-to-life emulator.
So, about a month ago, I decided to start working on it. After all, why not?
I always need a side project, and this one seems particularly fun.
The Tech Stack
The first thing I had to do was choose a tech stack. Since the Omnichord is a "touch" instrument, the most logical approach is to leverage the touch features of smartphones and tablets. I don't own any Apple products and have no plans to, so my development will be focused on Android.
At my day job, I've been using Flutter, Google's framework based on Dart for building native apps for Android, iOS, Linux, and more, and I'm really happy with it.
Years ago, I tried to develop for Android natively with Kotlin, and it was a frustrating experience. To put it in programmer terms, the Android API is a complete mess. Kotlin is a clunky language full of syntactic sugar; it feels unnecessarily complex and I'm not a big fan, especially coming from a C background.
Dart, on the other hand, is much more my style. It's a modern language, similar to JavaScript in some ways, but you can use it almost like C++, were it not for its garbage-collected memory management.
Still, with Dart (just like with JavaScript) you have to be careful with resources: event listeners, for example, need to be removed when they're no longer in use, or they'll keep their targets alive. But let's not get too sidetracked.
I chose Flutter because I think it has brought a lot of order to native Android development. For me, it's a real pleasure to use its widgets and all the utilities it provides. Plus, it's a very free framework in the sense that it doesn't force you into a specific project architecture; you're free to do what you want.
That kind of freedom reminds me of the feeling you get when you start a new C program with just int main() { return 0; }
The Latency Problem
The first hurdle in building a professional audio app is minimizing latency: the time between when the screen is touched and when the sound is played. As you can imagine, if that delay is long, the app is unusable.
I'm not a huge fan of project dependencies, but to find out what was available, I tried all the audio packages on pub.dev (the official Dart/Flutter package repository). They all had the same problem: incredibly high latency. While Flutter does compile to native code on the target platform, when it comes to audio there's always a layer of translation from Flutter to the target environment, and from what I've seen it introduces a significant amount of latency!
I was about to give up on Flutter and switch to native Android development as a last resort when I found an interesting alternative: the Google Oboe library. It allows for the lowest possible latency on Android without having to drive yourself crazy with Android's audio APIs, which, as I mentioned, are a mess. Basically, it's a translation layer that's been optimized for different versions of Android, a true godsend!
The library is written in C++, and there aren't any packages that simplify its use in Flutter. Or rather, there are some, but they're mostly just examples of how to connect Flutter to Oboe, and many of them are based on older versions of Flutter and Oboe. So I decided to do it myself and get my hands a little dirty. What's the fun in a project if it's too easy?
The Solution: dart:ffi and a C++ Wrapper
So I imported the library into my project as a git submodule and started writing a connector to Oboe using dart:ffi. FFI stands for foreign function interface, and dart:ffi is how Flutter interfaces with a C-based backend.
Since Oboe is written in C++, you can't connect it directly to dart:ffi.
There's a little trick: you write a wrapper containing functions with extern "C", which are C-compatible and call the Oboe C++ classes.
These are the functions that will then be called via dart:ffi.
So it takes several steps: Flutter -> dart:ffi -> C++ Wrapper -> Oboe.
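To make the pattern concrete, here's a minimal sketch of the extern "C" trick described above. All the names here (Engine, engine_create, and so on) are my own illustrative placeholders, not the project's actual code; in the real project the class would wrap Oboe, which I've left out to keep the sketch self-contained.

```cpp
#include <cstdint>

// Stand-in for a C++ class that, in the real project, would wrap Oboe.
class Engine {
public:
    void setFrequency(double hz) { frequency_ = hz; }
    double frequency() const { return frequency_; }
private:
    double frequency_ = 0.0;
};

// C-compatible entry points: extern "C" disables C++ name mangling,
// so dart:ffi can look these functions up by name in the shared library.
// Only C types (pointers, ints, doubles) cross the boundary.
extern "C" {
    Engine* engine_create() { return new Engine(); }
    void engine_set_frequency(Engine* e, double hz) { e->setFrequency(hz); }
    double engine_get_frequency(Engine* e) { return e->frequency(); }
    void engine_destroy(Engine* e) { delete e; }
}
```

On the Dart side, each of these functions is then looked up and bound with dart:ffi, which only sees opaque pointers and plain numbers.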
How does this connector work? First, there's the Flutter main widget, which for simplicity is currently a StatefulWidget.
In the widget's initState, I call a function defined with dart:ffi, which in turn calls a function written with extern "C". This function finally calls and instantiates the Oboe library.
The Oboe instantiation process is not much different from what's proposed in its Getting Started documentation.
I also start the AudioStreamDataCallback from Oboe, which is basically the function Oboe calls in a loop to request the samples it will play. A sample is just a floating-point number.
In addition to Oboe, for latency testing, I also instantiated an LFO sine wave oscillator (which I wrote myself). This will allow Oboe to play a test tone. The LFO is initially set to 0Hz, so it's silent.
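A sine oscillator like the one described is only a few lines of C++. This is a sketch under my own assumptions (class and member names are mine, not the project's); note how a frequency of 0Hz means the phase never advances and every sample is sin(0) = 0, i.e. silence.

```cpp
#include <cmath>

class SineOsc {
public:
    explicit SineOsc(double sampleRate) : sampleRate_(sampleRate) {}

    void setFrequency(double hz) { frequency_ = hz; }

    // Return the current sample, then advance the phase by one sample's worth.
    float nextSample() {
        float s = static_cast<float>(std::sin(phase_));
        phase_ += kTwoPi * frequency_ / sampleRate_;
        if (phase_ >= kTwoPi) phase_ -= kTwoPi;  // keep the phase bounded
        return s;
    }

private:
    static constexpr double kTwoPi = 6.283185307179586;
    double sampleRate_;
    double frequency_ = 0.0;  // 0 Hz: phase never moves, output stays at 0 (silence)
    double phase_ = 0.0;
};
```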
With that done, I added a button to the Main widget. When pressed, it calls another function defined with dart:ffi, which calls another extern "C" function, which in turn calls the LFO instance and sets its rate to an audible frequency, for example, the classic 440Hz.
At this point, having instructed AudioStreamDataCallback to read the sample from the LFO (and to advance it), the 440Hz tone is played.
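The render path can be sketched as follows. In real Oboe code this logic lives in an override of AudioStreamDataCallback::onAudioReady, which Oboe calls with a buffer to fill; here it's a standalone function over a toy oscillator struct so the sketch stays self-contained (the names are mine, not the project's or Oboe's, apart from the callback mentioned in the lead-in).

```cpp
#include <cmath>
#include <cstdint>

// Toy stand-in for the LFO state the callback reads from.
struct Oscillator {
    double frequency  = 0.0;      // set to 440.0 when the button is pressed
    double phase      = 0.0;
    double sampleRate = 48000.0;
};

// Fill numFrames mono float samples, advancing the oscillator as we go.
// This is the "read the sample and advance it" step done once per frame.
void renderAudio(Oscillator& osc, float* out, int32_t numFrames) {
    const double kTwoPi = 6.283185307179586;
    for (int32_t i = 0; i < numFrames; ++i) {
        out[i] = static_cast<float>(std::sin(osc.phase));
        osc.phase += kTwoPi * osc.frequency / osc.sampleRate;
        if (osc.phase >= kTwoPi) osc.phase -= kTwoPi;
    }
}
```

While the frequency is 0Hz the buffer fills with zeros; as soon as it's bumped to 440Hz, the same loop produces the audible test tone.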
I was able to test all of this directly on my very old Samsung Galaxy J6 (my personal phone since 2018). The latency is extremely low; the sound plays as soon as I press the button on the screen.
Now that this problem is solved, I can finally focus on the sounds.
That will be the topic of the second part of this series.