Meanderings about amateur music creation on iMac and iPad using Logic, Garageband or any number of software synths for the iPad

Saturday, September 29, 2012

I was an early upgrader to iOS 6, and one of its best features is the updated Garageband for the iPad, which can now keep playing in the background. I didn't realize quite how useful this feature was to me until I made my last piece.
I have tons of synths on my iPad that I underutilize because it is too much work to play each one in isolation, copy the audio, and then reassemble it with other tracks. A few other apps allow background playback, such as BM2, but I don't have that one yet.
So, with Garageband in the background acting as an "aggregator", I can slowly assemble my piece track by track and have it playing in the background while recording overlay tracks in Animoog or Sunrizer.
Here is the piece I put together with Garageband, Figure, Animoog, Sunrizer and iVoxel:
It has a Middle Eastern vibe to it - the warbling vocals are actually my voice pitched up several octaves with an LFO vibrato added.
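For anyone curious what that effect boils down to, here is a minimal sketch of the idea in Python/numpy - not what iVoxel or Animoog actually do internally, just raising a base pitch a few octaves and wobbling it with a low-frequency oscillator (all the numbers are made up):

```python
import numpy as np

SR = 44100              # sample rate in Hz
DUR = 2.0               # seconds of audio
BASE_HZ = 110.0         # imagined "voice" fundamental
OCTAVES_UP = 3          # pitch the voice up several octaves
LFO_HZ = 5.0            # vibrato rate
LFO_DEPTH = 0.03        # +/- 3% pitch wobble

t = np.arange(int(SR * DUR)) / SR
carrier_hz = BASE_HZ * (2 ** OCTAVES_UP)

# Instantaneous frequency = shifted pitch, wobbled by the LFO.
inst_hz = carrier_hz * (1.0 + LFO_DEPTH * np.sin(2 * np.pi * LFO_HZ * t))

# Integrate frequency to get phase, then render the warbling tone.
phase = 2 * np.pi * np.cumsum(inst_hz) / SR
warble = 0.5 * np.sin(phase)   # stand-in for the "warbling vocal"
```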
The main drawback to Garageband as a collecting DAW is that you only get 8 tracks to work with, and this piece takes exactly 8. If you need to go beyond that, Garageband lets you bounce tracks down and combine them, so there is a workable way around the limit if you need it.
I will probably make use of this quite a bit until AudioBus finally hits!
Monday, September 24, 2012
Careful how you bite
So I recently put together a nice simple rhythmic piece on Garageband for the iPad:
It was nice to go back to the iPad and keep it pretty simple. One contributing factor was a compounded mishap with my Akai EWI and Logic's Environment. I had some MIDI filtering in place that stripped out pitch bends, which, interestingly enough, the EWI sends when you bite lightly (lightly being the operative word there).
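For context, a filter like that just drops one class of MIDI messages before they reach the instrument. A rough sketch of the concept in plain Python, working on raw status bytes rather than Logic's actual Environment objects:

```python
# MIDI pitch-bend messages use status bytes 0xE0-0xEF (one per channel).
PITCH_BEND = 0xE0

def drop_pitch_bends(events):
    """Pass through every MIDI event except pitch bends."""
    for status, data1, data2 in events:
        if status & 0xF0 == PITCH_BEND:
            continue            # the filter that bit me: light biting = pitch bend
        yield (status, data1, data2)

# Example: the note-on survives, the bend from a light bite does not.
events = [(0x90, 60, 100),      # note on, middle C
          (0xE0, 0, 72),        # pitch bend from biting the mouthpiece lightly
          (0x80, 60, 0)]        # note off
print(list(drop_pitch_bends(events)))
```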
With the light biting not registering, of course I bit harder, and then nearer the top of the mouthpiece. I then realized the filter was on and removed it. The next time I played the EWI it felt wrong and produced some very strange sounds: I had actually put a tear in the rubber mouthpiece, which caused it to "leak". Worse still, that leak let moisture into some 'not ready for water' parts on top of the EWI.
That meant ordering a new mouthpiece and taking a week off the wind instrument - so, back to the iPad and some electronica. Now that the new mouthpiece has arrived and is working, I'll get back to EWIing. Be careful with your Environments, and try not to bite holes in your instruments.
Thursday, September 13, 2012
Dusty iPad!
Just felt like posting, since most of my recent music making has moved back to my iMac. For quite a while I was doing tons of iPad music creation, but lately not as much, even with some very cool iPad releases such as PPG Wavegenerator, Magellan, Impaktor and DrumJam.
So what gives? Have I given up on portable music? I don't think so - truth be told, I have about 10 works in progress that might someday see the light of day. I think what is happening is that I am obsessing over two products: Noatikl and my EWI USB. Noatikl lets me "program" musical scores and control as much or as little of the resulting work as I want, which I find incredibly inspiring.
On the other end of the spectrum is my old desire to perform. My trumpet chops are far out of practice, and the EWI MIDI controller lets me use my breath and technique to control virtual instruments, software-modeled instruments, or even synths, all through a sort of sax/recorder-style instrument. It really makes MIDI feel more like playing than anything else I've tried.
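What makes it feel like playing is the continuous stream of breath data. As far as I know the EWI USB sends breath as MIDI CC#2; here's a hypothetical little sketch of mapping that stream onto a synth's loudness - the curve shape is my own guess, not Akai's spec:

```python
BREATH_CC = 2          # MIDI continuous controller number for breath

def breath_to_gain(cc_value, curve=2.0):
    """Map a 0-127 breath value to a 0.0-1.0 amplitude.

    The exponent fakes the way soft playing should stay quiet and hard
    blowing should open up quickly - purely an assumed response curve.
    """
    normalized = max(0, min(127, cc_value)) / 127.0
    return normalized ** curve

# A burst of breath values as you start and release a note.
for value in (0, 18, 55, 96, 127, 80, 20, 0):
    print(f"CC{BREATH_CC}={value:3d} -> gain {breath_to_gain(value):.2f}")
```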
Another factor is that I really want to use all of the major new synthesizers on the iPad, but the workflow, frankly, sucks! I have to audio-copy/paste everything from app to app, try to get parts playing in the background, and so on.
If you want to hear some amazing sounds from someone mastering "only on iPad" music, check out Michael Wadlow on SoundCloud. He manages to produce and master some great electronica completely on the iPad - no post-processing or desktop DAWs at all! Even more minimalist (meaning iPhone!) is some of the work by Galaxyexplorer, who does all of his pieces on iPhone. Both of these musicians have far more patience than I!
There is an emerging standard that may help, called AudioBus, which should let iPad apps share audio in real time and process and record over the shared "bus". I can't tell you how excited I am to see this develop. As things stand today, unless you are using an "all in one" DAW on the iPad, putting together various sounds into a cohesive piece of music is an exercise in patience!
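Since the API isn't public as I write this, the following is purely a conceptual sketch of what "sharing audio over a bus" could mean - one app pushes buffers, effects transform them, a recorder collects them - and not AudioBus itself:

```python
import numpy as np

class AudioBusSketch:
    """Toy illustration of chaining apps: source -> effects -> sink."""

    def __init__(self, source, effects, sink):
        self.source = source      # callable returning one buffer of samples
        self.effects = effects    # list of callables: buffer -> buffer
        self.sink = sink          # callable consuming the final buffer

    def pump(self, n_buffers):
        for _ in range(n_buffers):
            buf = self.source()
            for fx in self.effects:
                buf = fx(buf)
            self.sink(buf)

# Stand-ins for a synth app, a trivial "effect", and a recorder app.
synth = lambda: np.random.uniform(-1, 1, 256)   # noise "synth"
soften = lambda buf: buf * 0.5                   # volume-drop "effect"
recorded = []
recorder = recorded.append

AudioBusSketch(synth, [soften], recorder).pump(4)
print(len(recorded), "buffers captured")
```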
Maybe once this is in place and all the apps upgrade to use it, I will be back to more mobile music. Hell, I will probably bang out a few in the meantime.
Monday, September 3, 2012
Trade secrets!
I recently posted a quiet meditative piece somewhere between jazz and classical, and a few folks asked for some details on how it was composed and produced. I am a bit hesitant sometimes to describe the entire process, since I like to see a piece critiqued on its own merits or lack thereof, but for the very few who click through to my blog, here is the tell-all for "Echoing Thoughts".
Echoing Thoughts features an imaginary combo (the J422 Combo - lame, I know!) that I start the process with. Usually this is a small number of jazz performers; in this case I started with drum kit, upright bass, tenor sax and piano, and later added a flute. As the piece progressed, the tenor sax felt a bit too harsh, so I fitted in a cello section, which pushed the piece toward a somewhat classical sound. Originally I was thinking of Bohren & der Club of Gore, but the piece morphed into something more New Age.
The piece was captured in Logic Express 9 with software instruments - everything is MIDI based. The piano is the Yamaha grand provided with Logic 9. The cellos come from software instruments in the Apple Symphonic Jam Pack. The flute comes via WIVI Band - a software modeling suite that creates very realistic instrument sounds. The upright bass is also from Logic's sampler.
The piece is composed and directed via Noatikl 2 (intermorphic.com), a generative music engine. With Noatikl you define the voices: whether they play melodically within a scale or play fixed rhythms or notes, what notes and chords they play, the probability of each note, the ratio of rests to notes, whether or not to follow another voice, and so on.
Basically, you feed in all the parameters and tendencies you want the musicians to follow, and Noatikl in turn feeds the resulting MIDI events to each track, which you can either just listen to or record. The process in Noatikl is much more akin to composing than to performing. It's a composer's dream! You tell every musician what to do and how rigidly or loosely to play, and here's the great thing - they do it! Frank Zappa would be drooling if he were alive to see it.
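If you've never touched a generative engine, a stripped-down sketch of the idea looks something like this: one melodic voice constrained to C minor, with weighted note choices and a rest probability. This is my own toy in Python, not Noatikl's actual rule set or file format:

```python
import random

C_MINOR = [60, 62, 63, 65, 67, 68, 70]   # C, D, Eb, F, G, Ab, Bb as MIDI notes

def generative_voice(n_events, rest_prob=0.25, weights=None, seed=None):
    """Yield (midi_note_or_None, beats) events for one voice.

    None means a rest; the weights bias which scale degrees get chosen.
    """
    rng = random.Random(seed)
    weights = weights or [4, 1, 3, 2, 3, 1, 2]   # favor C, Eb and G
    for _ in range(n_events):
        duration = rng.choice([0.5, 1.0, 2.0])   # eighth, quarter or half note
        if rng.random() < rest_prob:
            yield (None, duration)               # a rest
        else:
            note = rng.choices(C_MINOR, weights=weights, k=1)[0]
            yield (note, duration)

for note, beats in generative_voice(8, seed=1):
    print("rest" if note is None else f"note {note}", f"for {beats} beats")
```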
The piece was started with the drum voices - simple kick, ride cymbal and snare - very minimalist. I created 3 differing patterns for the cymbal and snare and assigned probabilities to them. One main pattern played "most" of the time with alternating triplets and variations mixing in occasionally. Each of these voices was assigned to an acoustically sampled drum kit on MIDI channel 10 in Logic.
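The pattern switching behaves roughly like a weighted choice per bar, something along these lines (my own sketch of the concept, with made-up weights):

```python
import random

# Three snare/cymbal patterns as (name, weight) - the main one dominates.
PATTERNS = [("main groove", 0.7),
            ("triplet fill", 0.2),
            ("sparse variation", 0.1)]

def pick_pattern(rng):
    names, weights = zip(*PATTERNS)
    return rng.choices(names, weights=weights, k=1)[0]

rng = random.Random(42)
bars = [pick_pattern(rng) for _ in range(8)]
print(bars)   # mostly "main groove", with the others mixed in occasionally
```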
The bass plays a slowly evolving pattern of whole and half notes within a restricted range. The cello plays a slow pattern, and the flute eventually joins, following the cello one beat behind.
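The "follow" behavior is my favorite part: one voice simply echoes another voice's notes, offset in time. A minimal way to picture it, using hypothetical (start_beat, note) tuples rather than Noatikl's own follow rule:

```python
def follow(leader_events, delay_beats=1.0):
    """Echo a leader voice's (start_beat, midi_note) events after a delay."""
    return [(start + delay_beats, note) for start, note in leader_events]

cello = [(0.0, 48), (2.0, 51), (4.0, 55)]      # a slow C, Eb, G line
flute = follow(cello, delay_beats=1.0)          # the same line, one beat later
print(flute)    # [(1.0, 48), (3.0, 51), (5.0, 55)]
```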
For the piano, there are 2 voices - one per "hand". The left-hand voice plays 3- or 4-note chords in C minor on the lower part of the keyboard, with minor variations in timing so some chords feel slightly arpeggiated. The "right hand" is another voice higher up the keyboard, playing only one note at a time. Both are fed into MIDI channel 6 for the piano sounds.
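That slight arpeggiation is just a small random offset on each chord tone's onset. A toy version, with numbers I picked for illustration rather than the settings I actually used:

```python
import random

def humanize_chord(start_beat, chord_notes, max_spread=0.06, seed=None):
    """Offset each chord tone's onset slightly so the chord rolls a little."""
    rng = random.Random(seed)
    return sorted((start_beat + rng.uniform(0.0, max_spread), note)
                  for note in chord_notes)

c_minor_triad = [48, 51, 55]                       # a left-hand C minor chord
print(humanize_chord(0.0, c_minor_triad, seed=7))  # onsets spread over ~6% of a beat
```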
Once it was composed and I liked the sound, I recorded the parts into several tracks in Logic 9, each set up on its own MIDI channel. I did some very minor editing after the MIDI capture to find a decent stopping point.
That is basically the whole process. Below is a screen capture of Noatikl with the MIDI parts "wired" to the MIDI channels. There are pages and pages of rules that also come into play such as the scale (C Minor in this case), the probability of each note, whether to evolve patterns or not etc.