Wednesday, February 19, 2014

Plumbing concerns

There have been many articles lately about inter-app audio and Audiobus 2.0 on the iPad, and I just wanted to put in my two cents.

As much as I enjoy creating music on the iPad, the one area that still feels a bit "primitive" is getting apps to play nicely together. Much of this is due to Apple's app "sandboxing", but another part is simply how immature the whole infrastructure still is.

Audiobus arose more or less "organically" outside of Apple and managed to provide a much-needed but still somewhat limited means of combining apps musically. Up to 3 sources can pipe through effects and into one target. I believe this introduces a degree of latency, but it can be handled as long as everything goes through the same path.

IAA (Inter-App Audio) is Apple's solution for combining music apps, and it uses internal APIs not accessible to 3rd parties - not exactly fair, but who's counting. IAA works with much less latency, and you can concurrently use as many apps as your CPU can handle (not that many, really!).

The exciting news is the upcoming Audiobus 2, which will actually save your plugin settings (assuming the source applications support the feature). This is slowly getting to the point where you can almost use the iPad like a desktop DAW.

The big difference on the iPad is that you are constantly worrying about the plumbing! How many apps are open? What presets need to be in each synth to combine my music? 

This happens quite a bit in my own creations with Noatikl, where every synth must be played and captured at the same time (since Noatikl has each voice harmonize and play off the others in real time). I have experimented with both approaches, and once tried mixing and matching IAA connections with Audiobus at the same time. They don't mix! The latencies are just different enough that the resulting files were way off time-wise, and it was too much work to finagle them back into line in Auria after the fact.

This means that for Noatikl pieces that need many concurrent synths, the synths must either all go through Audiobus (limit: 3!) or all go through IAA.
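
If you're curious why mismatched paths can't simply be nudged back into line, here's a toy sketch in plain Python. It has nothing to do with the actual Audiobus or IAA code, and the latency numbers are completely made up; it just shows why one shared path is fixable with a single shift while a mixed setup isn't:

```python
# Toy model: each routing path adds a fixed delay to whatever is recorded
# through it. The numbers below are invented for illustration only.
AUDIOBUS_LATENCY_MS = 20.0   # hypothetical per-path delay via Audiobus
IAA_LATENCY_MS = 8.0         # hypothetical per-path delay via IAA

def recorded_onsets(true_onsets_ms, path_latency_ms):
    """Where the notes actually land in the recorded file."""
    return [t + path_latency_ms for t in true_onsets_ms]

def align(tracks_ms, shift_ms):
    """Compensate every track by the same offset after recording."""
    return [[t - shift_ms for t in track] for track in tracks_ms]

notes = [0.0, 500.0, 1000.0]          # three voices hitting the same onsets

# Case 1: every synth goes through the same path -> one shift fixes all tracks
same_path = [recorded_onsets(notes, AUDIOBUS_LATENCY_MS) for _ in range(3)]
print(align(same_path, AUDIOBUS_LATENCY_MS))   # all back at 0, 500, 1000

# Case 2: paths are mixed -> no single shift can line them up
mixed = [recorded_onsets(notes, AUDIOBUS_LATENCY_MS),
         recorded_onsets(notes, IAA_LATENCY_MS)]
print(align(mixed, AUDIOBUS_LATENCY_MS))       # second track ends up 12 ms early
```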

When I'm back on my desktop using Logic Pro, I never worry about the AU connections - they just work. If I save my song, all the settings are retained, there is rarely a CPU issue, and I can concentrate more on the creative process than on the plumbing.

As things stand now on the iPad, I'm still feeling like Super Mario at times!

Wednesday, January 22, 2014

Shameless Magazine plug - Apptronica.us

One thing I've gradually taken to is reading magazines on my iPad (when not composing). The convenience of having every issue at my fingertips has come to outweigh the occasionally awkward task of zooming pages and the like. Furthermore, many magazines on the iPad Newsstand have built-in video links, audio links, etc. that are great for quick access.

Recently, Clif Johnston has put out a new magazine named Apptronica that is dedicated to music-making on the iPad - the first, and best, magazine of its kind that I've come across! Full disclosure: I'm a contributor to the magazine, and I will be writing a small series of articles on creating music with Noatikl - similar to (but simpler than) the tutorial available here.

In the upcoming series I will focus on using Noatikl, together with IK Multimedia's iGrand Piano app, to compose a solo piano piece. The magazine is FREE on the Apple Newsstand and I encourage you to check it out when you can.

The website (apptronica.us) also provides a free PDF version for downloading if you are reading on non-iOS devices.

I hope you follow my articles there and check in here for any more detailed discussion or posts.

Wednesday, January 1, 2014

Noatikl for iPad tutorial files (in order)

Happy New Year! I am working on some posts about Cubasis and IAA, but in the meantime, here is my tutorial for Noatikl for iPad in order (the individual parts are spread across the July and August blog posts).
Enjoy!

Noatikl tutorial part 1

Noatikl tutorial part 2

Noatikl tutorial part 3

Noatikl tutorial part 4

Noatikl tutorial part 5

Noatikl tutorial part 6

Noatikl tutorial part 7

Noatikl tutorial part 8

Saturday, December 21, 2013

Fun with Forgeries!

Way back in 2004 when I started posting music, my main tool was Garageband on my iMac and the main site for posting was MacJams.

I've continued posting there for 10 years now, and even though it is waning in popularity, some of the best people and feedback on my music-making have come from there.

Sadly, the site has been up and down for the past few weeks due to technical problems, but it is still one of my favorites even though I've been posting more to SoundCloud and iCompositions of late.

Recently there was a challenge called LSP (the Lost Songs Project) where you get the chance to create a song that could be the "missing" track from a commercial album. It was terrific fun, with many postings emulating everything from classical to the Doors or the Beatles.

I went obscure with my entry and went back to one of the early Jazz albums I fell in love with. In the early 70s there was a post-bop fusion movement bringing electronic instruments to Jazz. Bitches Brew by Miles Davis and the birth of Weather Report were highlights of this era. Herbie Hancock also put out a few albums in this style, and my favorite was Sextant - worth popping over to Spotify for a listen!

For my LSP entry, I tried to emulate the afro-electronic jazz sound of this album, though I used more modern synths. The original used a few old ARPs and a Moog III; I used Absynth (a synth I'm trying to learn much more about), the Korg M1 legacy edition with my wind controller (EWI), and the Aalto synth (modeled after the Buchla).

While I don't even have a fraction of Herbie's talent, here is my humble attempt:





Saturday, November 23, 2013

Art, Science, Neither or Both?


If you are one of the very few who follow my blogging or music, you probably know that one of my favorite tools is Noatikl, which is essentially a super-sequencer or music-composition tool.

Recently I received some nice feedback on a few pieces about my "playing", or the performance aspects of generative pieces, and that got me thinking a bit about how to take that. All of the pieces I put together and release are usually 100% MIDI, or close to it, which means I am using computer-generated sound in one form or another.

If my pieces are synthesizer-based, I am using software synths as plugins on the iPad or on my desktop. If my pieces use "traditional" instruments, I am playing samples from a keyboard or other creative MIDI input devices, using Logic or Kontakt on the desktop or any number of samplers/romplers on the iPad.

Some of my pieces are performed track by track with MIDI keyboards and then edited (for mistakes), tweaked for sound and/or timing issues and then combined. This is a bit closer to traditional performing in a studio environment. I NEVER perform anything live.

Other pieces, such as those from Noatikl, are more akin to composing. I program each voice with rhythmic patterns, keys to use, rests, probabilities, instructions for harmony and so on, and then more or less "turn them loose". This is certainly not performing in the traditional sense, but it is setting music into motion and then tweaking the program until it sounds "done" to me.
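
To make that a little more concrete, here is a toy sketch of the general idea in plain Python. This is emphatically not Noatikl's engine or rule syntax, just a hypothetical "voice" with a pitch pool, weighted note lengths and a rest probability, turned loose for a few bars:

```python
import random

# A toy generative "voice": nothing here is Noatikl's actual format, just the idea.
voice = {
    "pitches": [60, 62, 63, 65, 67, 70],          # C-minor-ish pool (MIDI notes)
    "durations": {0.25: 3, 0.5: 4, 1.0: 2},       # beat length -> relative weight
    "rest_probability": 0.2,                      # chance a slot is silence
}

def generate(voice, total_beats=8, seed=None):
    """Walk forward in time, picking weighted durations and random pitches."""
    rng = random.Random(seed)
    lengths, weights = zip(*voice["durations"].items())
    beat, events = 0.0, []
    while beat < total_beats:
        dur = rng.choices(lengths, weights=weights)[0]
        if rng.random() < voice["rest_probability"]:
            events.append((beat, dur, None))                      # a rest
        else:
            events.append((beat, dur, rng.choice(voice["pitches"])))
        beat += dur
    return events

for onset, dur, pitch in generate(voice, seed=1):
    label = "rest" if pitch is None else "note " + str(pitch)
    print(f"beat {onset:>5}: {label} for {dur} beats")
```

Noatikl layers harmony rules, mutation and much more on top of this sort of thing, but the workflow is the same: set it in motion, listen, tweak, repeat.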

One aspect very close to performing is when I use the EWI (Electronic Wind Instrument) in my pieces. The device requires fingering the keys and blowing into the mouthpiece, and it translates all of this into MIDI events that are passed into a (usually) sample-based instrument. That can be a flute, trumpet, sax or even a synthesizer - all modulated and controlled with my breath. Despite the "disconnect" from an analog wind instrument, it reacts and behaves almost exactly the same.
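
For anyone wondering what the EWI is actually handing to the synth, here is a rough sketch using the third-party mido library. The breath-to-CC#2 mapping is a common default on wind controllers but is configurable, so treat the specifics as an assumption rather than gospel:

```python
import mido

# A breath envelope sampled over time (0.0 = no air, 1.0 = full blast).
breath_curve = [0.0, 0.2, 0.6, 0.9, 1.0, 0.8, 0.4, 0.1, 0.0]

note = 62  # the fingered pitch, already translated to a MIDI note number

messages = [mido.Message("note_on", note=note, velocity=64)]
for level in breath_curve:
    # Breath is commonly sent as CC#2 (Breath Controller); the receiving
    # sampler or synth maps it to volume, filter, vibrato, etc.
    messages.append(mido.Message("control_change", control=2,
                                 value=int(level * 127)))
messages.append(mido.Message("note_off", note=note))

for msg in messages:
    print(msg)
```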

So with all that, is this just a shortcut to music? Is this just winding up a music box or turning on a player piano? I think that with all the setup and parameters it's more than that, but it's not quite live either - even if done in one take.

I think music has evolved further and further away from direct creation over the years and that computers are just one more step along the way.

From the initial singing or chanting, we have evolved more complicated ways of making sounds throughout history: first with pipes and blowing or percussion with sticks, then strings or harps. These evolved into mechanical versions - harpsichords and pianos. Is the musician still performing when "just" pushing keys that turn levers that make hammers strike strings? What happens when it is electronic, as in an organ? Or sample-based, which started with rotating drums or tapes?

Overall I think studio work is a mix of composing and performance, and the tools used don't really define that. So when my beautiful violin part is merely a mutating rhythm based on:

<100 R 30 -30 60 60>

it is still in some way a musical composition. I get to play "George Martin" to the performers in these cases and hopefully come up with something palatable!

Here are a few pieces "generated" more or less in that fashion - one from the iPad and the other from the desktop:


Monday, November 18, 2013

Throwing hardware at it!


I tend to stick with a policy of skipping every other generation of hardware, and since I dutifully skipped the 4th-generation iPad, I broke down and bought an iPad Air. There was a $200 buyback that I took advantage of for my ancient and unused first-generation iPad as well, so it wasn't quite as much of a wallet shock. I went with the 64GB model because I store tons of music samples on my iPad, and the size does matter! I opted for the full-size model instead of the Mini because, again, with synth apps, size matters.

Early reports about music-making on the Air have been very good, and I can confirm that most of my CPU performance concerns have been addressed. If you follow my tutorials or postings on Noatikl, you might remember that I mainly used SampleTank due to its ability to provide 4 MIDI channels without too much CPU overhead, provided you use a light DAW along with Audiobus.

So, with the Air, I decided to try a piece with a heavy DAW and three concurrent synths recording. The heavy DAW is my favorite, Auria, and the synths are the DXi synth, Thor and Alchemy Mobile. I drove all three synths with Noatikl, which would usually bring my 3rd-generation iPad to its knees. This time I was able to record all three concurrently into Auria without a hitch, even though Auria is very heavy by itself.

Another recent addition to music-making on the iPad is inter-app audio, as I blogged about before, and I added some additional tracks using the Arturia Oberheim SEM synth and my old standby SampleTank for the violin.



Throughout the piece, Auria was responsive and the tracks all recorded without issue. I think I can now add many more concurrent tracks with Noatikl and Auria is now a go-to DAW for me. I'll be posting more as I experiment further. Thanks for reading!

Sunday, November 3, 2013

More DAW-like than ever - iOS7, Auria and Garageband

Arturia Mini with IAA transport shown

With the iPad's iOS7, one of the more significant additions is inter-app audio. We've had Audiobus for quite a while now, but inter-app audio makes audio sharing more integral to the operating system (Apple is allowed to make use of all the internal routines, unlike 3rd parties, which gives them a bit of an unfair advantage, but I digress).

At this time, only a handful of apps support IAA, but more are adding it every day and it will become increasingly common in the coming months. IAA makes synths on the iPad act much more like VST or AU plugins do on the desktop. You can easily plug compatible synths into any DAW that supports it and record right from the synth into the DAW, without Audiobus routing or the need for other apps running (this means a lot on the limited memory/CPU of the iPad!).

Two DAWs that already support IAA are Auria and Apple's Garageband. In the case of Auria, you simply plug in the synth like you would any of its insert effects. For Garageband, you select an IAA track type directly. Both work reasonably well at this point and, not too surprisingly, there are occasional bugs in both too. It will probably take a few patches/releases to get things completely stable, but IAA is already very usable.

Auria is becoming my favorite DAW lately due to its fantastic automation and effects bus, but it is a bit pricey. One nice development is that if you have any of the Sugar Bytes effects (Turnado, Wow) for the iPad, you don't have to pay twice to use them in Auria. It recognizes them and lets you use them as native plugins, which work better than IAA connections when you're working in Auria.

Garageband has many great additions in the iOS7 release - the track limit is up to 16 on older iPads and 32 on the brand-new ones - so I think there is a purchase in my not-too-distant future. Another more hidden but significant feature in Garageband for iPad is that, for the first time, you can easily transfer lossless mixdowns out of Garageband. Garageband used to be the "roach motel" of DAWs - you could get sounds in but couldn't get them out. Now you can mix down and select "Open with" to open the mixed-down AIFF in another app. Not all apps support this, but one significant one that does is AudioShare, which in turn lets you copy/paste anywhere you wish - so for the first time, Garageband plays well with others.

The piece below was created in Auria with only inter-app audio synths. I used the Arturia Oberheim SEM synth for the arpeggios, Waldorf's Nave for some of the leads, and the Arturia Mini for the bass. There were a few crashes, but the entire thing was driven from within Auria, then mixed, mastered and posted. I think this will be a huge development for iPad music moving forward!

Just as an aside, the Arturia synths are available on the desktop for roughly $100 each - on the iPad they are $9.99 and sound extremely close to the desktop versions!