Well, I took the plunge and got the Samplemodeling saxophones. They have a new release out that uses their own sound manager instead of Kontakt, and I decided to add better saxes to my mix. In the past I've relied on old sampled versions or the Wivi Band tenor sax. Of these, the Wivi tenor isn't bad, but none are in the same league as Samplemodeling.
The new version includes 4 saxes - bari, tenor, alto and soprano. So far I find the sound with the Akai EWI to be terrific. I quickly put together a basic jazz piece featuring the bari, tenor and alto saxes:
For this piece I started with a basic jazz brush kit on drums, added a basic bassline with the bari sax, a repeating theme with the tenor, and a solo throughout the piece with the alto. The performance is far from flawless, but the intonation, bends and note realism really come through well. As I get better with the EWI, I expect to use these extensively.
One thing to note is that the engine is a bit CPU-hungry. With two Samplemodeling tracks, my CPU (a 2.5 GHz Core 2 Duo) was nearly spiking, so I chose to "bounce in place" the bari and tenor tracks, which freed up the CPU for the alto part. I didn't use the soprano in this one. Setup was very easy, and playing these instruments is almost effortless with a good wind controller such as the Akai EWI.
Saturday, January 26, 2013
Saturday, January 12, 2013
How to inefficiently produce!
I've been experimenting with more iPad apps that I have been underutilizing and put together an electro-funk piece in kitchen-sink fashion:
I don't present my workflow as a model of efficiency, but for anyone looking to compose on the iPad, here are some thoughts from making this piece.
First, the platform: the iPad is "sandboxed", which means that all apps are isolated from one another (trust me, overall this is a GOOD thing). This is why it's easy for almost anyone to pick it up and use it without worrying about where and how their files are stored. All files relevant to an app are stored with it...period! The downside is that having many apps work on the same files is mostly a no-go. In music making, this is somewhat alleviated by 3 technologies on the iPad:
- Audiocopy/Audiopaste - This is the simplest, and it has been adopted by most serious music apps on the iPad. Like the clipboard for documents, you can copy audio from one app and usually paste it into another. Some apps only support Audiocopy, some only Audiopaste, depending on the nature of the app.
- Virtual MIDI - Many apps support Virtual MIDI and will send their MIDI events to other running apps or process MIDI events from other running apps. This is analogous to wiring MIDI hardware together, but it is done virtually over one of the 16 MIDI channels.
- Audiobus - The newest technology, which many apps now support! This allows the audio of one app to be chained through effect apps into a target app. Similar to Virtual MIDI, but it's audio signals that flow from app to app, not MIDI events.
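To make the Virtual MIDI idea concrete: under the hood it is just standard MIDI bytes moving between apps, with the channel carried in the low nibble of the status byte. Here's a minimal desktop-Python sketch of what a Note On message on a given channel looks like (illustrative only - this is not any iPad app's API, and the note numbers are just examples):

```python
def note_on(channel, note, velocity):
    """Build a raw 3-byte MIDI Note On message.

    `channel` is 0-15 internally; apps show it to users as 1-16.
    """
    if not 0 <= channel <= 15:
        raise ValueError("MIDI has only 16 channels (0-15)")
    status = 0x90 | channel          # 0x9n = Note On, channel n
    return bytes([status, note & 0x7F, velocity & 0x7F])

# Four instruments addressed over four channels of one stream -
# the way a sequencer like Genome can drive a multitimbral app:
drums = note_on(0, 36, 100)   # channel 1: kick drum
bass  = note_on(1, 40, 90)    # channel 2: bass note
keys  = note_on(2, 64, 80)    # channel 3: chord tone
lead  = note_on(3, 72, 110)   # channel 4: melody note
```

This is why a multitimbral app that listens on several channels at once is so handy: one background app, four voices, one MIDI stream.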
The piece above used all 3 technologies in a somewhat cumbersome fashion! Funktank combines sounds from several apps. The apps that play the sounds are Sampletank, Alchemy and Animoog. They were tied together with a Virtual MIDI sequencer and two DAWs: Genome for the MIDI sequencing, Multitrack DAW to record Animoog over Audiobus, and finally Auria to mix and master all of the tracks.
Sampletank has some of the best sampled sounds on the iPad and was used in Funktank for the drums, the bass, the rhythm guitar and the synth lead. As good as the sounds are, its interface is a confusing mess and its recording is frustratingly slow. You can put together up to 4 instruments and get sounds going in each, but to record, you have to capture each one at a time! There is no Audiobus support yet, and I truly hope it's added soon. In Funktank, I decided to use the Genome MIDI sequencer to create rhythm patterns rather than banging on drum pads in Sampletank natively. This worked pretty well, but recording was particularly cumbersome since there is no way to trigger recording over Virtual MIDI. Instead, I had to go to Sampletank, hit record, switch (quickly!) to Genome, "play" the pattern I wanted recorded, then go back to Sampletank and stop the recording. Then I used Audiocopy/Audiopaste to put the WAV file into my DAW - Auria in this case. Finally, I edited the sound clip to remove the initial and trailing blank space - simple, no? NO!
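That last trimming step was done by hand in the DAW, but for the curious, here is roughly what "remove the blank space" means. This is a desktop-Python sketch, assuming a 16-bit mono WAV; the file names and silence threshold are my own assumptions, not anything the iPad apps expose:

```python
import wave, array

def trim_silence(src_path, dst_path, threshold=200):
    """Drop leading and trailing samples quieter than `threshold`.

    Assumes a 16-bit mono WAV like the clips produced by
    the record/Audiocopy round-trip described above.
    """
    with wave.open(src_path, "rb") as w:
        params = w.getparams()
        samples = array.array("h", w.readframes(w.getnframes()))
    # Indices of every sample loud enough to count as "audio"
    loud = [i for i, s in enumerate(samples) if abs(s) > threshold]
    if loud:
        samples = samples[loud[0]:loud[-1] + 1]  # keep audible span
    with wave.open(dst_path, "wb") as w:
        w.setparams(params)  # nframes is patched on close
        w.writeframes(samples.tobytes())
```

In a touch DAW you do the same thing by dragging the clip's edges in; the point is only that every pasted clip needed this extra pass before it could sit on the grid.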
Genome lets you remotely sequence any other running app that supports Virtual MIDI. In fact, Genome makes no sound at all on its own - it has to be "wired up" to something. In addition to driving apps, it can use wireless MIDI to control your desktop DAW, which is pretty cool! One thing to be careful of on the iPad is the memory and CPU limits: with 4 or 5 apps running at once, you can easily run into memory or processing problems. This is why Sampletank was appealing. Most apps only take in one MIDI stream at a time on one channel, but Sampletank will take 4 separate MIDI channels at once, so for one running background app you can have 4 voices going. This is where I got sidetracked in my workflow. I worked out pattern after pattern (each one a single bar!) and programmed them into Genome. I could hit play, step through pattern after pattern, and work out the first 20 bars or so of the song. Sounded great to me! But...there was no easy way to record it! I could have resorted to wiring up the headphone jack and recording, but I wanted more flexibility for editing in the DAW. I should have just recorded one set of patterns per instrument, captured them as above, and done all of the song creation in the DAW. Instead, I painfully played each unique pattern, recorded each single part, audiocopied/pasted, and edited it into the Auria DAW. This took a long time!
Auria is the DAW I used primarily in this piece. As DAWs go, it is by far the most complete, with its own brand of VST support for plugins, a rich architecture, and support for 48 concurrent live tracks if set up with audio interfaces. In my case - overkill! I was only audiopasting in tracks and then applying light effects such as compression or reverb, but it is extremely powerful! Once I had all of my "pieces" and patterns recorded, this is where I combined and processed the audio. Auria does not yet support Audiobus, so I can't just record directly into it, but that support is coming soon and will greatly simplify this whole process. The best practice for Auria is to record the tracks "dry" with no effects and then apply them in Auria; insert and send effects are both supported. I didn't do that in this piece, though - the sound from Sampletank was already more or less processed as I liked. Since I didn't do much rich effects processing in Auria, I could have used Multitrack DAW instead, which has only basic effects but supports Audiobus. I really wanted to get familiar with Auria, though, so that is what I used.
Animoog by Moog Music is one of my favorite synths on the iPad, but it was used only for a few bell-like ambient backgrounds in this piece. Animoog does have its own 4-track recorder built in, but since it supports Audiobus, it is much simpler to connect it to a DAW and record there. I used Multitrack DAW to record the Animoog part. To make things easier, I mixed down the other parts and audiocopied/pasted them into Multitrack DAW so I could hear everything at once while recording.
Once I liked the track in Multitrack DAW, I audiocopied/pasted the Animoog track into Auria with the other tracks.
Alchemy is a synth I haven't used much on the iPad yet, but it is very fun to work with. Alchemy is built mainly on rich synth samples but lets you tweak tons of parameters for every sample, giving you many variations. Alchemy supports Audiocopy/Audiopaste but not Audiobus yet, so I recorded, copied and pasted the Alchemy parts of Funktank. The "pseudo-vocals" and some of the ambient backgrounds came from Alchemy.
When all of this was assembled in Auria, I worked there to mix, copy/paste and master the final piece. This was more or less like working in any DAW: lots of audio file manipulation, dragging and copying, and then finally mixing the whole piece down. Auria works very much like desktop DAWs - maybe a bit too much like them. Audio editing is very interactive with full touch support, but many functions require navigating menus with your fingers, which, while easy enough, is not revolutionary. Given the great depth of features, though, this is probably unavoidable. Multitrack DAW, on the other hand, is less full-featured but much more touch-friendly.
So what do I conclude from all of this? I enjoyed using Auria once I got everything there and will probably use it again, but for an efficient workflow I would recommend using Audiobus with compatible apps. The good news is that Auria should support it shortly, which I look forward to.
As things stand, it is possible to make music solely on the iPad with multiple tools, but still far from optimal! Every month though is bringing new and better tools. As cumbersome as this was, I had a blast figuring all this out.