After some site upgrades, it's become much easier for me to write posts. The site has better post management (for example, I can now store drafts on-site) and a better editor that uses markdown, rather than what I had before (a WYSIWYG editor embedded in flask admin). I've learned a lot building this site, and it continues to adapt surprisingly well to my needs.
Anyway, recently I did two shows. The first was a new spin on material from the March show, and the second was all new. But both debuted an improved live setup that I have dedicated (and hope to continue dedicating) a lot of time to developing. I want to once again share details about how that is all going and what's new. I've called the system "emsys", but I don't know if that will stick long term!
While I consider "emsys" more of a culmination of all the equipment I'm using put together (it is, after all, em's—that is to say, my—system, in its totality; and all the pieces matter), the most "custom" part of it, and the part I'm going to talk about most here, is the software I have developed for my Raspberry Pi 4 using mostly Pure Data and Python. So how did I end up here after my gig in March?
Desync: The heir to the MIDI throne
I started by reviewing two issues. The first was MIDI clock desync, which occurs for all manner of reasons, including but not limited to MIDI bandwidth, tempo fluctuations, and the fact that time passes (yeah, it's that bad). MIDI clock desync had occurred at the end of the March set, so something needed to be done.
The second issue was more of a limitation: "pattern" desync between the Monomachine & Machinedrum (+MCL). To clarify, this is when the two synths are playing patterns of different loop lengths and cannot come back into sync unless the transport is restarted (pressing stop and then play). This is because the Monomachine has no way of interrupting a currently playing pattern; it can only queue. Naturally, you could avoid this by never letting the Monomachine's loop length be indivisible by MCL's - but that's a significant limitation for me as someone who loves polyrhythm.
Fortunately, the solution I eventually came up with resolved both of these issues at once, but I'll take you through my journey getting there.
So, let's talk about MIDI clocks... Originally I had planned to continue using the Machinedrum's internal clock to run everything as it had done in March, only with my growing Pure Data patch in the mix. I implemented a system in the Pd patch that would keep track of MIDI latency from the clock source, and the time between beats, so that it could predictively stop and start transport - essentially resyncing the synths on the fly in a way that sounds fluid. (This is what I've previously called "forced interrupts" or "FIs" in other posts, but I now call it "transport interrupts" / "tins").
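Roughly, the shape of that idea was something like this - a Python rendering for illustration only (the real version lived in the Pd patch, and the smoothing constant and latency figure are invented):

```python
import time

class BeatPredictor:
    """Track beat interval from an external clock and predict the next beat."""
    def __init__(self):
        self.last_beat = None
        self.interval = 0.5   # seconds per beat; refined as beats arrive

    def on_beat(self):
        now = time.monotonic()
        if self.last_beat is not None:
            # Exponential smoothing to ride out clock jitter
            self.interval = 0.9 * self.interval + 0.1 * (now - self.last_beat)
        self.last_beat = now

    def next_beat_time(self, midi_latency=0.003):
        """When to act so a stop/start lands on the beat despite MIDI latency."""
        if self.last_beat is None:
            return None
        return self.last_beat + self.interval - midi_latency
```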
I made it work, but it was a total hacky mess and not easy to expand on. I got that feeling; the one where you're blindly developing a system, and you look back at it, and it starts to feel a bit ridiculous. That's when I knew I needed to change course for the sake of establishing good foundations for long-term expansion.
So, I started thinking of ways I could use an internal clock running from my Pi instead. That way, Pd would know when to restart transport, since there would be no latency concerns (both Pd and the clock would be running on the same system). I had actually tried this before, but I couldn't work out how to make it stable and easy to manipulate externally. This was mainly thanks to two things: (1) the USB protocol introducing jitter to the timing of MIDI signals, and (2) being unable to find a piece of software that could run a clock and let you adjust its tempo externally (like from a Pd patch).
For the first issue, I did some research and ended up acquiring the pisound HAT, which gives the Pi native DIN MIDI ports through which I could send clock data without USB jitter. (After testing, it turned out to be not quite as precise as the Machinedrum's clock, but good enough for 99% of purposes.)
Since I was now using Pisound, I changed the Pi's OS to Patchbox, the OS created for Pisound, which is based on Debian 12. It offers an easy way to install a realtime kernel to further minimise latency, as well as support for Pisound's signature button, which I later configured to perform basic actions like restarting the Pd patch, rebooting, and shutting down the system.
For the second issue (finding MIDI clock software with external controls), I tried lots of different solutions. I first tried wielding JACK's transport with jack_midi_clock, then seq24, and seq64 / seq66 - but all were more complicated than what I needed, and none seemed to allow externally controlled BPM. In the end, I found a project on GitHub which very simply generated a MIDI clock from Python with mido & rtmidi. I knew Python! It's what I made this site in (mostly). So I examined how it worked, then frankensteined it for my purposes: stripped away the UI, gave it virtual ALSA MIDI I/O connections, and added a way to switch it on/off & adjust its tempo through those virtual connections (which another Python script maintains). I also adjusted the sleep interval between ticks to reduce CPU load to around 18-20%, which ought to leave enough room for whatever I have Pd get up to in the future (I hope).
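To illustrate the core of it, here's a minimal sketch of a tempo-adjustable MIDI clock in that style, using mido with virtual ports. The port names and the CC-to-tempo mapping are placeholders for this post, not the actual scheme my script uses:

```python
import time
import mido

PPQN = 24  # MIDI clock sends 24 ticks per quarter note

out = mido.open_output('clock-out', virtual=True)   # virtual ALSA output
ctl = mido.open_input('clock-ctl', virtual=True)    # virtual control input

bpm = 120.0
running = False
next_tick = time.monotonic()

while True:
    # Poll the control port without blocking
    for msg in ctl.iter_pending():
        if msg.type == 'start':
            running = True
            next_tick = time.monotonic()
            out.send(mido.Message('start'))
        elif msg.type == 'stop':
            running = False
            out.send(mido.Message('stop'))
        elif msg.type == 'control_change' and msg.control == 20:
            bpm = 60.0 + msg.value          # made-up mapping: CC 20 -> 60-187 BPM

    if running and time.monotonic() >= next_tick:
        out.send(mido.Message('clock'))
        next_tick += 60.0 / (bpm * PPQN)    # schedule the next tick

    time.sleep(0.0005)  # a short sleep instead of busy-waiting keeps CPU load down
```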
The Python script that maintains virtual MIDI connections, as I mentioned, is another aspect of this system. It continuously matches ports from `aconnect -l` and establishes needed connections when they appear. This means there is no problem if any particular device fails or is disconnected, as the script will automatically hook it back in when available. These needed connections change over time to meet the demands of my growing Pure Data patch, and all of these scripts were given systemctl services so that they run on boot too.
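A stripped-down sketch of that watcher logic, with placeholder client names standing in for my real devices:

```python
import subprocess
import time

# Pairs of (sender, receiver) ALSA client names to keep connected.
# These names are placeholders, not my actual device names.
WANTED = [
    ('clock-out', 'pisound'),
    ('Pure Data', 'pisound'),
]

while True:
    listing = subprocess.run(['aconnect', '-l'],
                             capture_output=True, text=True).stdout
    for src, dst in WANTED:
        if src in listing and dst in listing:
            # aconnect exits non-zero if already connected, which is fine here
            subprocess.run(['aconnect', f'{src}:0', f'{dst}:0'],
                           stderr=subprocess.DEVNULL)
    time.sleep(2)  # re-check every couple of seconds
```

And for the boot part, each script just gets a small systemd unit along these lines (the paths here are illustrative):

```
[Unit]
Description=emsys MIDI clock
After=sound.target

[Service]
ExecStart=/usr/bin/python3 /home/patch/emsys/clock.py
Restart=always

[Install]
WantedBy=multi-user.target
```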
Segment data & msets: The nuts & bolts
The patch itself is currently designed to take care of the scheduling of material through what's essentially a more sophisticated version of the "song mode" (p59) present on the Elektrons. Individual "segments" group together several pieces of information about any one point in a set: most importantly, Elektron bank IDs (p45) (e.g. A09, C11, H02...) for both the MD (via MCL) & the MnM, plus tempo information (bpm, ramp time), loop length, etc.
All of these segments are stored in `.mset` files, which can be switched out. There is also a function to calculate the estimated duration of the mset you're working on, to help with pacing.
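As a rough picture of what a segment holds (the field names here are my shorthand for this post, not emsys's actual schema), along with the kind of duration estimate I mean:

```python
from dataclasses import dataclass

@dataclass
class Segment:
    md_bank: str     # Machinedrum (via MCL) bank ID, e.g. "A09"
    mnm_bank: str    # Monomachine bank ID, e.g. "C11"
    bpm: float
    ramp_beats: int  # how long to take ramping to the new tempo
    loop_bars: int   # loop length in bars

def estimated_duration_s(segments, beats_per_bar=4):
    """Very rough pacing estimate, assuming each segment plays once through."""
    return sum(s.loop_bars * beats_per_bar * 60.0 / s.bpm for s in segments)
```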
When designing sets, the patch keeps track of which Elektron banks aren't in use by a segment. This coincides with an "insert segment" function, which creates a new segment that points to the first available unused banks. In reality the Elektrons may have content at those banks, but my system regards them as blank, free to be overwritten. In other words, when we delete a segment, its banks are treated as blank even if the content hasn't been removed from the synths - sort of like how a hard drive "deletes" files by labelling them as disposable rather than physically removing them.
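Building on the Segment sketch above, the allocation idea looks something like this (assuming the A01-H16 bank layout implied by IDs like "A09" and "H02"):

```python
def all_banks():
    # Banks A01..H16, matching IDs like "A09" or "H02"
    return [f"{row}{num:02d}" for row in "ABCDEFGH" for num in range(1, 17)]

def first_free_bank(segments, attr):
    """First bank not referenced by any segment - 'blank' as far as emsys cares."""
    used = {getattr(s, attr) for s in segments}
    for bank in all_banks():
        if bank not in used:
            return bank
    raise RuntimeError("no free banks left")

def insert_segment(segments, bpm, ramp_beats, loop_bars):
    seg = Segment(
        md_bank=first_free_bank(segments, 'md_bank'),
        mnm_bank=first_free_bank(segments, 'mnm_bank'),
        bpm=bpm, ramp_beats=ramp_beats, loop_bars=loop_bars,
    )
    segments.append(seg)
    return seg
```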
This way of doing things drastically simplifies the cumbersome process of copying and pasting around patterns to different banks on the Elektrons in order to consolidate material. Better yet, it abstracts away the idea of there being any sense to the order of banks altogether, and ensures full use of the MnM & MD's total available pattern & kit space.
The goal with all this is to have as little overhead as possible while designing and performing material. In the studio I can develop material without counting bank numbers, and while performing, I can just focus on manipulating the material currently playing, and move on when I want to (with the hold function).
In future I may group segments into what I may call mtracks and mbridges, which would allow managing material in a more modular way - scheduling a series of segments of material, and bridging from one to any other.
As for how the system moves from one segment to another, there is a big cluster of checks - developed with 5% intuition and 95% trial and error - that decide when to do and when not to do things. The incoming MIDI clock is broken down into a series of [counter] objects that keep track of beat divisions, and messages are sent at different times according to those divisions to schedule or queue changes. For new segment data, MIDI program messages (which trigger bank changes), etc., this happens a few beats before the next phrase - this is the "load bang". For transport interrupts, the stop signal is sent a 1/32nd note before the start of the next phrase, and is disallowed when a program message is queued for MCL, as that causes MCL to load MD data wrong (one of many checks in place to keep this all working as best I can).
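To make that concrete, here's the shape of that timing logic translated into Python (the real version is a web of Pd [counter] objects; the 4-beat load lead is illustrative, and I'm assuming 24 PPQN and 4/4 throughout):

```python
PPQN = 24                   # MIDI clock ticks per quarter note
THIRTYSECOND = PPQN // 8    # a 1/32nd note is 3 clock ticks

def tick_events(tick, loop_bars, pc_queued, tin_scheduled):
    """Decide which scheduled actions fire on this clock tick."""
    phrase = loop_bars * 4 * PPQN   # phrase length in ticks, assuming 4/4
    pos = tick % phrase
    events = []
    # "Load bang": send queued program changes a few beats before the boundary
    if pc_queued and pos == phrase - 4 * PPQN:
        events.append('send_program_changes')
    # Transport interrupt: stop a 1/32nd before the boundary, but never while
    # a program change is pending for MCL (that would misload MD data)
    if tin_scheduled and not pc_queued and pos == phrase - THIRTYSECOND:
        events.append('send_stop')
    # Restart right on the phrase boundary so everything lands back in phase
    if tin_scheduled and pos == 0:
        events.append('send_start')
    return events
```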
All the more intricate, musical, sound design & performance information is managed by MCL and the Elektron synths directly. How they work as systems is well documented elsewhere, such as in the Machinedrum, Monomachine & MCL manuals, and you'll see there is plenty that can be done (and is done) to twist ideas upside-down, or add and remove things on the fly. I could go into the what and how, but that gets into my creative process, which isn't the focus for now.
I will say I have some ideas that build on those systems that I plan to try implementing in emsys. For example, having certain pertinent parameters slowly drift around randomly, particularly MnM parameters, since it needs more love... MIDI bandwidth permitting.
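If I go down that road, the drift could be as simple as a bounded random walk on a CC, rate-limited to stay gentle on MIDI bandwidth. The port name, CC number, and rate below are all made up for the sketch:

```python
import random
import time
import mido

out = mido.open_output('drift-out', virtual=True)  # placeholder port name
value = 64
while True:
    # Random walk, clamped to the 0-127 CC range
    value = max(0, min(127, value + random.choice((-1, 0, 1))))
    out.send(mido.Message('control_change', control=71, value=value))
    time.sleep(0.25)  # 4 messages/sec barely dents MIDI bandwidth
```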
The face of emsys: a reverse engineered MiniLab 3
To manage these features and parameters, we need good UI. At first I tried having Pd UI elements on a touch screen connected to the Pi. I took this to its conclusion by designing an interface and testing it all out, but in the end, interacting with UI elements on a touch screen was too fiddly, and the screen had exposed internals that would've needed housing.
I shifted gears. I thought back to university, when I was designing df-s2 - my first attempt at a live music program, back then in Max/MSP. For that I used a Novation SL Zero to control the patch. The SL Zero has two LCD screens on it which can be written to with sysex messages, so I used those messages in my patch to write UI feedback to the screens.
Fast forward to this year: I bought an Arturia MiniLab 3 for my studio. I was already using this controller in the March set to control MCL perf macros (p77), so in trying to expand its usage beyond a glorified slab of knobs, I did some research about what else I could do with it.
What I found was that a musician, Janiczek, had very graciously posted a GitHub Gist on the MiniLab 3, which detailed reverse-engineered sysex messages for taking control of its small OLED screen. The sysex data was gleaned from Arturia's Bitwig controller script.
Compared to the SL Zero's LCD screens, one advantage right off the bat with an OLED screen is that it updates information much faster and without leaving behind a trail of artifacts. This excited me, so I got to work and quickly wired up some UI feedback from my Pd patch via sysex messages.
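For a feel of what that looks like, here's the general shape of pushing two rows of text over sysex. The Arturia manufacturer ID is real, but the command bytes below are placeholders - the actual reverse-engineered values are in Janiczek's gist:

```python
import mido

ARTURIA_ID = [0x00, 0x20, 0x6B]  # Arturia's MIDI manufacturer ID

def draw_text(port, row1, row2):
    payload = ARTURIA_ID + [0x7F, 0x42]         # placeholder command prefix
    payload += [ord(c) for c in row1] + [0x00]  # placeholder row separator
    payload += [ord(c) for c in row2]
    port.send(mido.Message('sysex', data=payload))
```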
For emsys 0.1, which was used in both of the October sets, I used the ML3 to monitor the current segment ID, current phrase, beat, tempo, whether I had scheduled transport interrupts, whether I was holding on the current part of the set, among other things. I could also edit all this information with the ML3's knobs and sliders, and copy, paste, delete and insert segment data to structure my sets.
I have improved this dramatically since October with emsys 0.2. There is now a modular screen & line manager system, so I am able to quickly put together menus, temporary alert messages, parameter information, etc. with a syntax I came up with for structuring information on the ML3's two available rows of text.
What I mean by "modular" in this case is that all screen information, for all screens, is stored and updated at once, but we simply select which to view at any given time, and we can add additional screens, or provide additional information, by simply adding another screen manager module.
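In code terms, the idea is roughly this - a loose sketch, not emsys's actual classes, where `draw` is any callback that renders two rows of text (e.g. `lambda r1, r2: draw_text(port, r1, r2)` from the earlier sketch):

```python
class ScreenManager:
    def __init__(self, draw):
        self.draw = draw       # callback that renders two rows of text
        self.screens = {}      # screen name -> [row1, row2]
        self.active = None

    def register(self, name):
        self.screens[name] = ['', '']
        self.active = self.active or name

    def update(self, name, row, text):
        # Every screen's state is kept current, even when it isn't showing
        self.screens[name][row] = text
        if name == self.active:
            self.draw(*self.screens[name])

    def show(self, name):
        self.active = name
        self.draw(*self.screens[name])
```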
I even made a startup animation - a blinking face. It feels like my own completely one-of-a-kind device, and that feels really special.
Managing the controls was fairly straightforward through CC messages. The Shift button, rightmost pad, and - funnily enough - the sustain pedal act as function keys to modify the behaviour of other knobs and buttons. The pads do most of the performance functions, like triggering tins, holds, managing transport, etc., whereas the knobs (currently) are used for editing segments. The only exception is the leftmost knob, which controls the tempo of my clock, and queues / switches segments when pushed in & turned.
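The dispatch for that is mostly modifier-state bookkeeping. Something like the following, where every CC number is invented for the sketch and the print statements stand in for the real actions:

```python
SHIFT_CC, SUSTAIN_CC = 27, 64      # placeholder "function key" controls
TEMPO_CC, TEMPO_PUSH_CC = 40, 41   # placeholder leftmost knob turn / push

mods = set()

def set_clock_bpm(value): print('bpm ->', 60 + value)    # stand-in actions
def queue_segment(value): print('queue segment', value)

def on_cc(control, value):
    # Function keys only toggle modifier state...
    if control in (SHIFT_CC, SUSTAIN_CC, TEMPO_PUSH_CC):
        if value > 0:
            mods.add(control)
        else:
            mods.discard(control)
    # ...which changes what the same knob means when it's turned
    elif control == TEMPO_CC:
        if TEMPO_PUSH_CC in mods:
            queue_segment(value)   # pushed in & turned: queue / switch segments
        else:
            set_clock_bpm(value)   # plain turn: clock tempo
```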
There are undoubtedly things I have missed in my post... But that's okay. See you next time.