Wednesday, October 29, 2014

State of the DAW

It's been quite a while since I posted anything here, so I thought I'd take some time to catch up on how the iPad has evolved and how music making on it compares to the desktop.

As I've posted before, I use Logic Pro X on my desktop as my preferred DAW, and when on the iPad what I tend to miss most is the ease of working within a single app or DAW without feeling like I'm juggling multiple apps with multiple interfaces. In the past few years, DAWs on the iPad have matured quite a bit, and here are my own experiences with them.


In the 'good old days' of the iPad, compromises had to be made for memory and CPU, which basically made most DAWs self-contained entities with limited music-making capabilities. One of the first I used was the (still excellent) NanoStudio, which had a very good internal synth coupled with a nice sample pad. What NanoStudio showed was that a traditional track-based DAW could work very well with a touch interface. Other early products included BeatMaker 2, Tabletop and MultiTrack DAW. Coincidentally, I have all of those.

NanoStudio and BeatMaker were the more traditional of the bunch, with multiple tracks of piano-roll (MIDI) sequencing and some support for audio tracks. MultiTrack DAW was more like Audacity in that it was audio-only, and Tabletop was more or less a self-contained, rack-based Reason. But all of these were more or less "walled gardens" with limited ways to import sounds from other apps.

The holy grail of DAWs has always been a unified interface that works with plug-ins and a multitude of synths, samples and sounds, with effects buses, mixers, etc. More or less like Logic, Live or Cubase on the desktop.



When GarageBand arrived on the iPad, I got a very polished DAW with fantastic built-in instruments, though with very limited interoperability and no automation at all. The good news for me was that I could start a project on the iPad in GarageBand, tweak it all I wanted and then move it to Logic Pro X unchanged and continue working there. The bad news: it's a one-way trip. But this worked (and still works) very well.


When I wanted more power on the iPad, I turned to Auria, which has partnered with many professional companies to provide plug-ins as in-app purchases within their product. Prior to Audiobus and IAA, Auria had incredible plug-ins that could be purchased and used within the app. The downside was that all tracks had to be audio, and my preferred way of working on music is MIDI, but it is still one of my favorite DAWs. The other downside was that their professional plug-ins had, um... professional pricing. They were a fraction of the desktop price, to be sure, but by iPad standards still a bit of money.

Audiobus started the revolution in inter-app processing and Apple eventually caught on and brought out inter-app audio (IAA). This changed everything and made it possible for DAWs on the iPad to act a bit more like desktop DAWs with their AU or VST plugins. This also probably ate into Auria's business model since iPad users could use IAA effects instead of buying them within Auria but that's another matter.

With Virtual MIDI, Audiobus and IAA, I should finally be able to get something approximating the Logic Pro X experience. So is that possible yet? The short answer is, sadly, no. But there is hope, and things are close.


Steinberg has brought its Cubasis app to the iPad, and more than any other DAW, it sort of has all the pieces. You can drive its credible built-in synth and sampler from MIDI tracks, and you can add audio tracks as well. The killer functionality for me is the ability to include IAA instruments such as Animoog, Nave or Z3tA on individual MIDI tracks. This is as close to plug-in VSTs as you'll get on the iPad.

The only problem is that this doesn't always work all that well. I have an iPad Air (first gen, not the new one), which is limited to 1 GB of memory. Start adding lots of IAA synths and you will quickly hit a wall. The workaround is to freeze your tracks, and that should work, but when a project has some "frozen" IAA tracks in Cubasis, adding additional IAA tracks doesn't always work for some reason. Also, when freezing, you have to set the latency (buffer size) to 256 samples or the frozen track will have the wrong BPM (a bug!).

In theory, once you "freeze" an IAA track, Cubasis should no longer care whether the app is open or not, since you have an audio version of the frozen track. For some reason, Cubasis still cares and will occasionally complain that it cannot open the IAA app for a frozen track.

Another limitation is that an IAA app can only appear on one track; you can't easily use three instances of Z3tA, for example. Again, freezing SHOULD make this possible, but doesn't. I think all of the pieces are there, but sadly they don't yet work very reliably.

There are other quirks that make it difficult to use. Cycle MIDI recording is lousy, for example. I continue to use it since it is as close as things get to a full DAW, but it has a ways to go.

Since I do almost all of my work in MIDI, this is the single most important feature for me: easy use of external synths as MIDI instruments. Cubasis is probably as good as it gets, but it ain't there yet.







Wednesday, February 19, 2014

Plumbing concerns

There have been many articles lately about inter-app audio and Audiobus 2.0 on the iPad, and I just wanted to put in my two cents.

As much as I enjoy creating music on the iPad, the one area that still feels a bit "primitive" is getting apps to play nicely together. Much of this is due to Apple's app "sandboxing" but another aspect is the maturity of the whole infrastructure. 

Audiobus arose more or less "organically" outside of Apple and provided a much-needed but still somewhat limited means of combining apps musically: up to three sources can pipe through effects and into one target. I believe this introduces a degree of latency, but it can be handled as long as everything goes through the same path.

IAA (inter-app audio) is Apple's solution for combining music apps, and it uses internal APIs not accessible to third parties - not exactly fair, but who's counting? IAA works with much less latency, and you can concurrently use as many apps as your CPU can handle (not that many, really!).

The exciting news is the upcoming Audiobus 2, which will actually save your plug-in settings (assuming the source applications support the feature). This is slowly getting to the point where you can almost use the iPad like a desktop DAW.

The big difference on the iPad is that you are constantly worrying about the plumbing! How many apps are open? What presets need to be in each synth to combine my music? 

This comes up quite a bit in my own creations with Noatikl, where each synth must be played and captured at the same time (since Noatikl has each voice harmonize and play off the others in real time). I have experimented with both, and once tried mixing IAA connections and Audiobus at the same time. They don't mix! The latencies are just different enough that the resulting files were way off time-wise, and it was too much work to finagle them in Auria after the fact.
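To make the time-alignment problem concrete: if a recording path's latency were constant and known, nudging a track back into place would be trivial - you just shift it earlier by that many samples. Here is a minimal sketch of that idea (my own illustration; this is not a function Auria or Audiobus actually exposes):

```python
def compensate_latency(track, latency_samples):
    """Shift a recorded track earlier by a known, constant latency
    (in samples), padding the end with silence so its length is
    unchanged and it lines up with the other tracks."""
    if latency_samples <= 0:
        return list(track)
    return list(track[latency_samples:]) + [0.0] * latency_samples

# A track recorded 2 samples late lines up again after compensation:
aligned = compensate_latency([0.0, 0.0, 0.5, 1.0, 0.5, 0.2], 2)
```

The trouble with mixing IAA and Audiobus paths was exactly that the offsets weren't a single constant number, so there was nothing reliable to shift by.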

This means that for my Noatikl pieces that need many concurrent synths, they all need to be Audiobus (limit - 3!), or they all need to be IAA. 

When back on my desktop using Logic Pro, I never worry about the AU connections - they just work. If I save my song, all the settings are retained, there is rarely a CPU issue and I concentrate more on the creative process than on plumbing.

As things stand now on the iPad, I'm still feeling like Super Mario at times!

Wednesday, January 22, 2014

Shameless Magazine plug - Apptronica.us

One thing I've gradually taken to is reading magazines on my iPad (when not composing). The convenience of having every issue at my fingertips has come to outweigh the occasionally awkward task of zooming pages, etc. Furthermore, many magazines in the iPad Newsstand have built-in video links, audio links, etc. that are great for quick access.

Recently, Clif Johnston has put out a new magazine named Apptronica that is dedicated to music making on the iPad - the first, and best, magazine of its kind that I've come across! Full disclosure: I'm a contributor to the magazine, and I will be writing a small series of articles on creating music with Noatikl - similar to (but simpler than) the tutorial available here.

In the upcoming series I will focus on using Noatikl together with IK Multimedia's iGrand Piano app to compose a solo piano piece. The magazine is FREE in the Apple Newsstand, and I encourage you to check it out when you can.

The website (apptronica.us) also provides a free PDF version for downloading if you are reading on non-iOS devices.

I hope you follow my articles there and check in here for any more detailed discussion or posts.

Wednesday, January 1, 2014

Noatikl for iPad tutorial files (in order)

Happy New Year! I am working on some posts about Cubasis and IAA but in the meantime, here is my tutorial for Noatikl for iPad in order (these are spread amongst July and August in the blog posts).
Enjoy!

Noatikl tutorial part 1

Noatikl tutorial part 2

Noatikl tutorial part 3

Noatikl tutorial part 4

Noatikl tutorial part 5

Noatikl tutorial part 6

Noatikl tutorial part 7

Noatikl tutorial part 8

Saturday, December 21, 2013

Fun with Forgeries!

Way back in 2004 when I started posting music, my main tool was Garageband on my iMac and the main site for posting was MacJams.

I've continued posting there for 10 years now, and even though the site is waning in popularity, some of the best folks and feedback on my music making have come from there.

Sadly, the site has been up and down for the past few weeks due to technical problems, but it is still one of my favorites, despite my posting more to SoundCloud and iCompositions of late.

Recently there was a challenge called LSP (the Lost Songs Project), where you get the chance to create a song that could plausibly be "missing" from a commercial album. It was terrific fun, with many postings emulating everything from classical to the Doors or the Beatles.

I went obscure with my entry and went back to one of the early jazz albums I fell in love with. In the early '70s there was a post-bop fusion movement bringing electronic instruments to jazz. Bitches Brew by Miles Davis and the birth of Weather Report were highlights of this era. Herbie Hancock also put out a few albums in this style, and my favorite was Sextant - worth popping over to Spotify for a listen!

For my LSP entry, I tried to emulate the Afro-electronic jazz sound of this album, though I used more modern synths. The original used a few old ARPs and a Moog III. I used Absynth, a synth I'm trying to learn much more about, the Korg M1 Legacy Edition with my wind controller (EWI), and the Aalto synth (modeled after the Buchla).

While I don't even have a fraction of Herbie's talent, here is my humble attempt:





Saturday, November 23, 2013

Art, Science, Neither or Both?


If you are one of the very few that follow my blogging or music, you probably know that one of my favorite tools is Noatikl, which is essentially a super-sequencer or music composition tool.

Recently, on a few pieces, I received some nice feedback on my "playing" - the performance aspects of generative pieces - and that got me thinking about how to take that. All of the pieces I put together and release are usually 100% MIDI, or close to it, which means I am using computer-generated sound in one form or another.

If my pieces are synthesizer-based, I am using software synths as plug-ins on the iPad or on my desktop. If my pieces use "traditional" instruments, I am resorting to samples played from either a keyboard or other creative MIDI input devices, using Logic or Kontakt on the desktop or any number of samplers/ROMplers on the iPad.

Some of my pieces are performed track by track with MIDI keyboards and then edited (for mistakes), tweaked for sound and/or timing issues and then combined. This is a bit closer to traditional performing in a studio environment. I NEVER perform anything live.

Other pieces, such as those from Noatikl, are more akin to composing. I program each voice with rhythmic patterns, keys to use, rests, probabilities, instructions for harmony, etc., and more or less "turn them loose". This is certainly not performing in the traditional sense, but it is setting music into motion and then tweaking the program until it sounds "done" to me.

One aspect very close to performing is when I use the EWI (Electronic Wind Instrument) in my pieces. The device requires fingering the keys and blowing into the mouthpiece, and it translates all of this into MIDI events that are passed to a (usually) sample-based instrument. That can be a flute, trumpet, sax or even a synthesizer - all modulated and controlled with my breath. Despite the "disconnect" from an analog wind instrument, it reacts and plays almost exactly the same.
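As a rough illustration of what the EWI is doing under the hood: wind controllers typically send breath pressure as MIDI Control Change #2 (the Breath Controller), alongside the note-on/off events from the keys. A toy sketch of that mapping - the normalized-pressure input and the function itself are my own illustration, not any real EWI firmware:

```python
def breath_to_midi_cc(pressure, channel=0):
    """Convert a normalized breath pressure (0.0-1.0) into a raw
    3-byte MIDI Control Change message for CC#2 (Breath Controller)."""
    if not 0.0 <= pressure <= 1.0:
        raise ValueError("pressure must be in 0.0-1.0")
    value = round(pressure * 127)        # 7-bit MIDI data byte
    status = 0xB0 | (channel & 0x0F)     # Control Change + channel nibble
    return bytes([status, 2, value])     # [status, controller #, value]

# As breath intensity rises, a stream of CC#2 messages continuously
# modulates the receiving instrument's volume and timbre:
ramp = [breath_to_midi_cc(i / 10) for i in range(11)]
```

The receiving sampler or synth maps that controller stream onto volume, filter, vibrato and so on, which is what makes the EWI feel like an instrument rather than a keyboard.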

So with all that, is this just a shortcut to music? Is this just winding up a music box or turning on a player piano? I think with all the setup and parameters it's more than that, but it's not quite live performance either - even if done in one take.

I think music has evolved further and further away from direct creation over the years and that computers are just one more step along the way.

From the initial singing or chanting, we have evolved ever more complicated ways of making sounds throughout history: first pipes and blowing, or percussion with sticks, then strings or harps. These evolved into mechanical versions - harpsichords, pianos. Is the musician still performing when "just" pushing keys that turn levers that make hammers strike strings? What happens when it is electronic, as in an organ? Or sample-based, as started with rotating drums or tapes?

Overall I think studio work is a mix of composing and performance and the tools used don't really define that. So when my beautiful violin part is merely a mutating rhythm based on:

<100 R 30 -30 60 60>

it is still, in some way, a musical composition. I get to play "George Martin" to the performers in these cases and hopefully come up with something palatable!
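To show how little text can drive a whole part, here is a toy interpreter for a pattern in that spirit. The reading is my own guess, not Noatikl's documented semantics: I'm assuming the leading number is a selection weight, "R" is a one-beat rest, positive numbers are note durations (with 60 units per beat), and negative numbers are rests of that length.

```python
import random

def parse_rule(rule):
    """Parse a pattern like '<100 R 30 -30 60 60>' into (weight, events),
    where each event is (duration, is_rest). Interpretation is assumed,
    not Noatikl's actual spec: R = one-beat rest, negative = rest."""
    tokens = rule.strip("<>").split()
    weight = int(tokens[0])
    events = []
    for t in tokens[1:]:
        if t.upper() == "R":
            events.append((60, True))       # default one-beat rest
        else:
            d = int(t)
            events.append((abs(d), d < 0))  # negative duration => rest
    return weight, events

def next_bar(rules, rng=random):
    """Pick one rhythm rule at random, weighted by its leading number."""
    parsed = [parse_rule(r) for r in rules]
    return rng.choices(parsed, weights=[w for w, _ in parsed], k=1)[0][1]

weight, events = parse_rule("<100 R 30 -30 60 60>")
# events -> [(60, True), (30, False), (30, True), (60, False), (60, False)]
```

The "mutating" feel comes from re-rolling the weighted choice bar after bar, which is loosely what a generative engine does with many such rules per voice.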

Here are a few pieces "generated" more or less in that fashion - one from the iPad and the other from the desktop:


Monday, November 18, 2013

Throwing hardware at it!


I tend to keep to a policy of skipping every other hardware generation, and since I dutifully skipped the 4th-generation iPad, I broke down and bought an iPad Air. There was a $200 buyback that I took advantage of for my ancient and unused first-generation iPad, so it wasn't quite as much of a wallet shock. I went with the 64 GB model because I store tons of music samples on my iPad, and the size does matter! I opted for the full-size model instead of the Mini because, again, with synth apps, screen size matters.

Early reports on music making with the Air have been very good, and I can confirm that most of my CPU performance concerns have been addressed. If you follow my tutorials or postings on Noatikl, you might remember that I used mainly SampleTank due to its ability to provide 4 MIDI channels without too much CPU overhead, provided you use a light DAW along with Audiobus.

So, with the Air, I decided to try a piece with a heavy DAW and three concurrent synths recording. The heavy DAW is my favorite, Auria, and the synths are the DXi synth, Thor and Alchemy Mobile. I drove all three synths with Noatikl, which would usually bring my 3rd-gen iPad to its knees. Without a hitch, I was able to record all three concurrently into Auria, which is a very heavy DAW by itself.

Another recent addition to music making on the iPad is inter-app audio, as I blogged about before, and I added some additional tracks using the Arturia Oberheim SEM synth and my old standby SampleTank for the violin.



Throughout the piece, Auria was responsive and the tracks all recorded without issue. I think I can now add many more concurrent tracks with Noatikl and Auria is now a go-to DAW for me. I'll be posting more as I experiment further. Thanks for reading!