This is a production log for the track Sunset over Susa. I’m labeling this track as “experimental” because I was trying a lot of different techniques and new tools on it. This was my first project using the Cubasis DAW on my new iPad Air 2. Along the way, I got to use several other apps I’d been unable to work into my process previously.
The original idea for the composition came from playing with the Fugue Machine app.
It is fascinating how complicated themes can come from so few notes in this app. The secret is in the different parts and how they can interact with each other to produce unexpected patterns, chords, and rhythms. You select a “diatonic” scale and enter a “melody” in the piano roll. Then up to four lines can be run at different speeds, directions, octaves, etc. For this track, I’m using three parts: a fast “percussion” part, a slower melody, and an even slower bass line, all using the phrygian dominant scale, giving it a very “eastern” vibe.
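Not Fugue Machine’s actual engine, of course, but the multi-playhead idea can be sketched in a few lines of Python. The function name, parameters, and note choices below are all mine, just to show how one short sequence read at several rates yields patterns none of the parts contain on their own:

```python
# A toy model of Fugue Machine's multi-playhead idea: one note sequence,
# several "playheads" reading it at different rates, directions, and
# octave offsets. Names and numbers here are my own, not the app's.

SCALE = [64, 65, 68, 69, 71, 72, 74]   # E phrygian dominant from E4 (MIDI)
MELODY = [0, 2, 1, 4, 3, 2]            # scale-degree indices into SCALE

def playhead(rate, direction, octave, ticks):
    """Return (tick, midi_note) events for one playhead over the melody."""
    events = []
    for t in range(ticks):
        pos = int(t * rate) % len(MELODY)   # where this playhead is now
        if direction < 0:                   # reversed playheads read backwards
            pos = len(MELODY) - 1 - pos
        events.append((t, SCALE[MELODY[pos]] + 12 * octave))
    return events

# Three parts, roughly as in the track: fast "percussion", melody, slow bass.
fast = playhead(rate=2.0, direction=1, octave=1, ticks=16)
melody = playhead(rate=1.0, direction=1, octave=0, ticks=16)
bass = playhead(rate=0.5, direction=1, octave=-1, ticks=16)
```

Because the playheads cycle the same few notes at different rates, the combined texture only repeats after all the loop lengths line up again, which is roughly where the unexpected chords and rhythms come from.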
Another really cool thing with Fugue Machine is the performance controls. You can choose the starting step or root of the pattern anywhere from an octave below to an octave above the base root of the scale. This doesn’t change the key of the piece, as the pattern is moved “diatonically” to keep the notes in the selected scale. (It is more complicated to write than to hear.) The other main performance control is that a portion of the overall pattern can be selected for the loop. For this piece, I had a one-bar loop for the opening, expanded to two bars in the middle, then went back to one bar at the end. Since both of these controls are “live”, they introduce a human element to an otherwise very digital process. My timing in performing the various moves was not exact, so if you listen closely, maybe you can hear where the step moves or the loop moves weren’t right on the bar and a beat or two gets missed. I didn’t “fix this in the mix”.
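As best I can tell, that root control is a diatonic transposition: every note moves by the same number of scale *steps* rather than semitones, so nothing ever leaves the scale. A quick sketch of the idea, using E phrygian dominant as the example (the function name is mine, and this is only my reading of the behavior, not the app’s code):

```python
# Sketch of "diatonic" transposition as I understand Fugue Machine's
# root control: shift each note by N scale steps (not semitones), so
# every note stays inside the selected scale.

SCALE = [64, 65, 68, 69, 71, 72, 74]   # E phrygian dominant from E4 (MIDI)

def diatonic_shift(degrees, steps):
    """Shift a list of scale-degree indices by `steps`, wrapping octaves."""
    out = []
    for d in degrees:
        octave, idx = divmod(d + steps, len(SCALE))
        out.append(SCALE[idx] + 12 * octave)
    return out

pattern = [0, 2, 4]                    # E, G#, B
print(diatonic_shift(pattern, 0))      # [64, 68, 71]
print(diatonic_shift(pattern, 1))      # [65, 69, 72] -- F, A, C: same scale,
                                       # but different interval shapes
```

Note how the shifted pattern keeps the key but changes the chord quality, which is exactly the “more complicated to write than to hear” effect.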
Playing with Fugue Machine is fun, but the built-in sounds aren’t all that great, so how to make use of it? Enter Cubasis.
The first step was to set up a MIDI connection from Fugue Machine to Cubasis. Each part in FM was output on a different MIDI channel and routed to a separate track in Cubasis. Arm all three tracks in Cubasis, hit record, switch to FM, play it, switch back to Cubasis, and hit stop. Repeat until you get a take you like. 😉
I then used the piano roll editor in Cubasis to trim the clips and quantize the notes onto the grid. Next I used the internal instruments in Cubasis to start working up the rest of the composition, choosing a marimba patch in Microsonic and bass and brass sounds in Micrologue. I got these sounding fairly decent and then turned to fleshing out the rest of the virtual ensemble.
Drums: I selected several drum and percussion loops from the Cubasis library and used the new time-stretch feature to match them to the 120 bpm project tempo instead of their native tempos of 100–130 bpm. I couldn’t detect any degradation in sound at this level of stretch. As noted above, there were a couple of beats missed in the arrangement, so a few loops had to be split and stretched separately to fit, but this was quite easy and worked well.
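The stretch itself is just a ratio of tempos, which is why this range is so gentle on the sound. A quick sanity check on the arithmetic (nothing Cubasis-specific here, and the function name is mine):

```python
# Sanity check on the time-stretch ratios: to fit a loop recorded at
# native_bpm into a project at project_bpm, its duration is multiplied
# by native_bpm / project_bpm (< 1 shortens the loop, > 1 lengthens it).

def stretch_ratio(native_bpm, project_bpm=120.0):
    """Length multiplier: new_duration = old_duration * ratio."""
    return native_bpm / project_bpm

# A one-bar 4/4 loop at 100 bpm lasts 2.4 s; at 120 bpm a bar is 2.0 s.
print(stretch_ratio(100))   # 0.833... -> loop shortened by ~17%
print(stretch_ratio(130))   # 1.083... -> loop lengthened by ~8%
```

With every ratio within about ±17% of unity, it’s not surprising the stretching was inaudible; artifacts usually only creep in at much more extreme ratios.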
Ukulele: For the ukulele part, I first used the chord pads on a MIDI track in Cubasis just to find some chords to fit the MIDI tracks. Some strange stuff, considering the phrygian dominant scale: lots of augmented and diminished chords, but also more normal minor, major, and dominant 7th chords on some bars.
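That chord zoo isn’t random: stacking thirds on each degree of phrygian dominant produces exactly that mix of qualities. A quick semitone-math check (nothing here is specific to Cubasis or its chord pads; the helper name is mine):

```python
# Why those chord qualities fall out of phrygian dominant: build a triad
# on each scale degree by stacking scale thirds, then name its quality
# from the semitone intervals above the root.

SCALE = [0, 1, 4, 5, 7, 8, 10]          # C phrygian dominant: C Db E F G Ab Bb
NAMES = ["C", "Db", "E", "F", "G", "Ab", "Bb"]

def triad_quality(degree):
    """Quality of the triad built on a given scale degree (0-6)."""
    root = SCALE[degree]
    third = SCALE[(degree + 2) % 7] + (12 if degree + 2 >= 7 else 0)
    fifth = SCALE[(degree + 4) % 7] + (12 if degree + 4 >= 7 else 0)
    return {(4, 7): "major", (3, 7): "minor",
            (3, 6): "diminished", (4, 8): "augmented"}[(third - root,
                                                        fifth - root)]

for d in range(7):
    print(NAMES[d], triad_quality(d))
# -> major, major, diminished, minor, diminished, augmented, minor
# (and adding the scale seventh to degree 0 gives C E G Bb: a dominant 7th)
```

So the scale hands you two diminished triads, an augmented one, and a tonic dominant 7th for free, which matches what the chord pads were suggesting.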
The ukulele is the Imua iET tenor, which is tuned to Bb reentrant. Getting the sound into Cubasis was a little convoluted since I didn’t yet have a Lightning interface for the iPad Air 2. Instead, I recorded into my iPhone using the Apogee Jam interface and the MultiTrack DAW app: hit record on the iPhone, hit play on the iPad, and capture short snippets. Next I copied these clips up to Dropbox from the iPhone. Over on the iPad, I imported the files from Dropbox into AudioShare and copied them to the “general pasteboard”. Switching to Cubasis, I imported from the pasteboard, renamed the “sample”, and double tapped to get it on an audio track, then moved, trimmed, and arranged. Also, with the Bb uke, I sometimes had to transpose the clip in Cubasis up two semitones; other times I transposed myself on the uke before recording. Easy peasy – not! 😀 For the “stairstep” section, I recorded only C5 and Edim chords and transposed the clips further and further up. The chord sequence is C5 C#5 Edim F5 Gdim G#dim A#dim C5 (one of those should actually be G#aug, but I just let it go – can you tell which beat is “off”? 🙂 )
Bass: I wasn’t completely satisfied with the sound of the Micrologue bass patch so I replaced it with an Inter-App Audio link to the Korg iM1 app.
This was a fairly stock fretless bass patch that I just tweaked a little, mainly to give it a bit more release so it fills in holes in the MIDI track a little better. It did have to be EQ’ed and compressed pretty severely to keep it from eating the mix, though.
“Marimba”: I also wasn’t completely satisfied with the melodic percussion sound coming from the Microsonic patch. Wait, isn’t there an app designed for melodic percussion? Why yes there is: Mersenne.
Also an IAA synth, this plugged right into Cubasis, fed by the MIDI track recorded from Fugue Machine.
Once I had both the IAA tracks like I wanted them, I froze them so as not to lose them and to save CPU cycles (although that wasn’t an issue with this project at all).
EWI: That left just one of the three Fugue Machine tracks untouched. I had been using a brass Micrologue patch, but I decided to replace it with an “organic” track played on the EWI. The first challenge here was getting the notes out of the piano roll view into standard notation so I could play the part. This was another convoluted workflow, as I couldn’t find a way to export a MIDI clip from Cubasis. (Later I discovered there may be an option under mixdown to do this.) What I came up with was this: set up a MIDI out connection from Cubasis into NanoStudio, played from Cubasis and recorded in NanoStudio, saved to a .mid file, and emailed it. In email, I did “Open In” on the attachment to get it into Progression.
For recording the EWI part, I thought about trying to do it directly in Cubasis, but again, I didn’t have a direct audio input anyway and didn’t feel like repeating the uke workflow. So I turned to more familiar tools. I used the VL70-m hardware synth (with Patchman Turbo chip – patch NZ Flute) feeding Propellerhead Reason on the desktop – so much easier to do retakes and comps in Reason than on the iPad. I didn’t try to retain every note from Fugue Machine and certainly not the strict timing. And I couldn’t resist throwing in a few improvised bits. (If you listen very closely, the improvised parts are panned slightly right of the main melody.) No effects were done in Reason. The comped take was exported to Dropbox, imported to Cubasis, and mixed in.
And that’s it for the tracks. On to mixing and mastering. For the mix, I used only the effects provided within Cubasis but I made fairly liberal use of them including the channel strip, studio EQ, chorus, amp sim, and overdrive for insert effects as well as reverb and delay sends. For the global mix bus inserts, I just applied a bit of EQ and “low and slow” compressor “glue” (and brickwall limiter as a precaution) but left the main “mastering” work for the Final Touch app.
Treatment in Final Touch was very minimal. It didn’t seem to need any more EQ or bus compression, so it was mainly adding some stereo field “fairy dust” and the final level raise and limiting – which I tried not to make too heavy-handed.
And there you have it. Another “masterpiece” in the can. 😉
But back to what started this journey – Cubasis. I’ve barely scratched the surface. I didn’t use any IAA effects, AU instruments or effects, Audiobus, or the new Spin FX, and I’m sure there’s more. All the ways you can get both audio and MIDI into and out of it make Cubasis the most complete DAW I’ve used on iOS. It opens up a lot of possibilities with apps that haven’t found a place in my workflow. My crayon box just got a whole lot bigger. 😀