Production Log: Swiftlet in the Sun

Composition for this one began with just noodling on my Ono baritone ukulele. The chord progressions were worked out and entered into the Guitar Toolkit app for safekeeping. I went through a number of attempts at formulating a style and arrangement. I tried some ideas in ChordBot, iDS-10, and DM2 but didn’t really settle on anything except a tempo (138 bpm, as that was as fast as I could get the uke to work) and a vague direction (“drum n bass”, although it hasn’t really turned out that way). Finally, I threw it at Band in a Box on the PC, and things started falling into place. I got the arrangement worked out and a collection of suitable styles selected. With the arrangement done, it was exported as a standard MIDI file with the bass, piano, guitar, and strings tracks (though the patches for the latter three didn’t necessarily match those names at all), along with drums on separate tracks.

For production, I first imported the MIDI file into Propellerhead Reason, entered a “section map” to correspond to the BIAB style changes, and selected a starting bass patch, leaving the other parts with their default ID8 patches. It was enough for a basic guide track. But then I decided, “Nah. I want to do this on the iPad so I can use some of my new toys from Black Friday.” So I started a blank project in Cubasis and recorded the acoustic ukulele parts first, using the guide track in Reason, before abandoning Reason and reimporting the MIDI file into Cubasis and going from there.

Ukulele recording – the Ono baritone uke was recorded with the iRig Acoustic mic clipped to the sound hole. This was the main reason for using the guide track in Reason. The iRig Acoustic goes in through the headphone jack, so it is technically possible to monitor over headphones while simultaneously recording. However, if you do that, the output will very slightly feed back, or “cross talk”, into the input and register faintly in the recording. Now, this isn’t a huge problem if you’re working with an essentially finished piece, as then it is just a form of mic bleed that audio engineers routinely deal with when recording with microphones. But if you’re working with a rough guide track or click track, you don’t want that junk in the final product, so it is important that no audio come out of the iPad while recording with the iRig Acoustic. That is a negative of the device, but the positive is that the sound is pretty darn good. The raw input sounds surprisingly natural and doesn’t need a lot of treatment. For the uke backing track, I took it a section at a time, playing/undoing/replaying until I got a take I liked for each section.

The next phase was a bit of a blur experimenting with different apps for various tracks, shuffling MIDI parts between tracks, and just playing with the different routing options in Cubasis. All that got settled here was to use iLectric for the main keyboard part and SeekBeats for the main drum work. Bass, strings, and guitar were left as stock Cubasis instruments for now to focus on the drums.

Drum “recording” – I talked about integrating drums into Cubasis in a previous post. The bottom line is that the most flexible setup is to configure the drum app as an IAA generator audio track and feed it from one or more MIDI tracks. Other tricks here are that MIDI clips can be overlaid in the same track, and multiple tracks can feed the same MIDI channel. Just for organizational purposes, I made five MIDI tracks grouped as kick+snare, open+closed hi-hat, ride, crash, and high+low toms. Here they are:

These could have all been a single MIDI track, but I did it this way in case I needed to pull them back apart later. I worked up the drum design this way with all the MIDI tracks feeding channel 10 and coming back into Cubasis on a single audio track from the Master Out from SeekBeats. This allowed for a rough pan and level balancing to be done in SeekBeats. But in preparation for detailed mixing later, I decided to “freeze” the drums as a set of audio tracks, divided by groups that would likely need different mix treatment. I didn’t actually use the Freeze function, though that may have worked. Instead I set up a series of audio tracks with separate outputs from SeekBeats. In this way, I was able to capture kick, snare, and crash tracks in one pass. Then I combined the hats and ride in one pass and then the two toms in another pass. Here are the captured audio tracks ready for mixing:

Note that the IAA routing has been removed at this point. The MIDI tracks are all muted and the audio tracks are no longer receiving input. This also freed up SeekBeats to be used again for a percussion section in the middle. This wasn’t split out like the main drum hits though – just kept as a “mixed in the app” section.
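The five-way grouping above maps neatly onto the General MIDI percussion key map (everything on channel 10). Just as an illustration of the split, not the actual Cubasis workflow, which is all done in the GUI, here is a small Python sketch that sorts GM drum note numbers into those track groups; the note sets follow the GM percussion map:

```python
# Sketch: sort GM channel-10 drum notes into the five track groups
# used in the session. Note numbers follow the General MIDI percussion map.
GROUPS = {
    "kick+snare": {35, 36, 38, 40},          # acoustic/electric kick, snares
    "hats":       {42, 44, 46},              # closed, pedal, open hi-hat
    "ride":       {51, 53, 59},              # ride 1, ride bell, ride 2
    "crash":      {49, 55, 57},              # crash 1, splash, crash 2
    "toms":       {41, 43, 45, 47, 48, 50},  # low through high toms
}

def group_for(note: int) -> str:
    """Return the track group a GM drum note number belongs to."""
    for name, notes in GROUPS.items():
        if note in notes:
            return name
    return "other"

# A hypothetical bar: kick (36), closed hat (42), snare (38), open hat (46).
pattern = [36, 42, 38, 46]
split = {}
for n in pattern:
    split.setdefault(group_for(n), []).append(n)
# split -> {"kick+snare": [36, 38], "hats": [42, 46]}
```

The same logic would let you pull the groups back apart (or recombine them) later, which was the point of keeping the five MIDI tracks separate.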

Bass recording – with drums and uke basically sorted, the MIDI bass track was sticking out like a sore thumb. These were mostly dance/techno styles in BIAB, so it was very busy and very repetitive. I decided I could do a better job myself. I turned to the iFretless Bass app, as it just sounds really good and has one of the more “playable” interfaces on iOS. See this previous post for more details on the iFretless setup. Just like the uke recording, I had to re-record each section until I was happy with it. You can see the spliced track in the screenshot above.

Other MIDI tracks – I divided the other MIDI clips from Band in a Box into several IAA instrument tracks: iLectric for a warm electric piano, Addictive Pro for strings, and Mersenne for a “not totally unlike a clavinet” plonky sound. In the final mix, it’s hard to pick out these individual layers as they mostly blend into a textural backdrop.

Electric ukulele – I also wasn’t happy with the “guitar” bits coming from BIAB, so I scrapped them and recorded my own with my Blue Star Guitar Konablaster baritone uke going direct into the iPad through the Apogee Jam interface. This was then amped up with Yonac’s ToneStack.

EWI recording – see the Using… post for more details on programming Model 15 for breath control. I started with the “Coastal Trails” preset discussed there, as well as a tweaked “Super Lead” preset, the latter distorted through the built-in overdrive effect in Cubasis. For some unknown reason, I could not get Model 15 to work properly in either IAA mode, so I resorted to a simple Audiobus route to capture the audio output. Hey, whatever works. The melody and improv were mostly taken from the Lydian scale, with ample use of the sharp 4th tone, including the final note.
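For reference, Lydian is just the major scale with a raised 4th degree. A quick sketch in Python (middle C assumed as the root purely for illustration) showing the interval pattern and where that sharp 4th lands:

```python
# Sketch: the Lydian scale is the major scale with a raised 4th.
# Semitone steps from the root: W W W H W W H -> 0 2 4 6 7 9 11.
LYDIAN_STEPS = [0, 2, 4, 6, 7, 9, 11]

def lydian_scale(root_midi: int) -> list[int]:
    """MIDI note numbers for one octave of Lydian from the given root."""
    return [root_midi + step for step in LYDIAN_STEPS]

c4 = 60                       # middle C, assumed root for illustration
scale = lydian_scale(c4)      # [60, 62, 64, 66, 67, 69, 71]
sharp_fourth = scale[3]       # 66 = F#, the #4 that colors the melody
```

The +6 semitone step (instead of the major scale’s +5) is what gives the #4 its floaty, unresolved color, which is why ending the melody on it works.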

Mixing – there isn’t much to say about the mixing process on this one. I mostly used the built-in facilities in Cubasis for EQ, compression, and volume automation. As I wrote about in a recent Using… article, my intent was to use multiple instances of Audio Reverb in Audio Unit mode to do things like different plates on snare vs kick or “vocal” treatment on leads. But it just wasn’t needed. All the drum and instrument layers provided plenty of interest and filled the sonic space without resorting to such trickery. In the end, I used just two reverbs. One is a massively tweaked Audio Reverb instance just on the acoustic ukulele in the intro and outro to give that “giant cave” effect. The other is a Send to AltiSpace using the LawrenceWelkCave IR from EchoThief. This real-cave IR is actually quite subtle and not overbearing, even with fairly sizeable send levels going to it. (By the way, reading about caves is where the song title and story came from. I did not know there were birds called swiftlets that live in caves and echolocate like bats. That’s pretty cool.)

I did run into a bizarre mixdown problem in Cubasis. It would not render correctly even though real-time playback was fine. Either the render wouldn’t finish, or it had a constant pitch/whine included! It was like a guitar feedback signal, but I could not figure out where it was coming from. Finally I gave up and just played the Cubasis session into AudioShare through Audiobus. Problem “solved”, but what the heck?!?

Mastering – I’m starting to really like Final Touch. It’s pretty close to a “make it sound better” button. A tiny bit of EQ, just enough stereo sauce, and a level raise barely tickling the brick wall limiter. Done.
