tick, tock… tick, tock…

[Image: IMG_2617]

What you see above is still, in my opinion, the best way to do this. Those pedals, in that order, with that amplifier, provide enough sonic options for me to do pretty much anything I want to with an electric guitar. There's a lot I'm not capable of, in terms of both technique and technology, but right now my focus is on documenting what I can do instead of encroaching into the territory of what I can't. Quite simply, I want to be able to come home, plug a guitar into that setup, and, should the moment so strike me, press the record button and have a document of that evening's inspiration.

Now, of course I have the capacity to do just that with the present technology.  Even if I don’t want to set up the computer and the A/D converter and Logic (or even GarageBand), I can just do like FBdN and record right to iPhone OS.  Or I could use the old minidisc.  I could even go right to an old cassette multi-track.

The problem is TIME.

Not that these things take time, but rather that I want to be able to take whatever I'm doing and bring it back into a digital audio workstation later. That's no problem, unless there are going to be loops. And there are always loops. If a song sketch drifts by even a tenth of a second on every beat, then at 60 bpm the deviation after a minute is 6 seconds – that's not music, that's chaos.
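To put rough numbers on that, here's a back-of-the-envelope sketch in plain Python (nothing to do with Logic or GarageBand; the bpm and error figures are only examples) showing how fast a small timing error snowballs:

```python
# How far out of sync a free-played sketch drifts when its timing is
# slightly off from the DAW's grid.

def drift_after(seconds_elapsed, error_per_beat, bpm):
    """Accumulated drift, in seconds, after `seconds_elapsed` of playback,
    given a timing error of `error_per_beat` seconds on every beat."""
    beats_played = seconds_elapsed * bpm / 60.0
    return beats_played * error_per_beat

# A tenth of a second of error per beat, at 60 bpm:
print(drift_after(seconds_elapsed=60, error_per_beat=0.1, bpm=60))  # 6.0
```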

So the question now is: how can I get a master clock associated with an impromptu recording of a song sketch?  Playing along with a metronome is not enough.  Whatever device records the initial jam has to associate a bpm with that snippet, which can then control other music.  I’m not sure this is possible.
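I don't know of a recorder that does this automatically, but a tempo can at least be estimated from an audio file after the fact. Here's a rough sketch of the idea using the librosa library's beat tracker; the file name is made up, and this is only an illustration, not anything built into Reason or Logic:

```python
# Estimate a bpm from an impromptu recording so it can later serve as the
# session's master tempo.  Requires the librosa library; the file name is
# purely illustrative.
import librosa

y, sr = librosa.load("evening_sketch.wav")
tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
beat_times = librosa.frames_to_time(beat_frames, sr=sr)

print("Estimated tempo (bpm):", tempo)
print("First few beat positions (seconds):", beat_times[:4])
```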

The more sensible approach would be to take the impromptu performance as inspiration and then properly construct music around it, starting with the beat and locking in the time. Of course, this means recreating that moment of creativity in a very sterile setting. Much less fun and much more time-consuming.

All of this would be a piece of cake if I (a) had no day job and (b) were fabulously wealthy.

Play it like it’s an instrument!

[Image: Propeller Head 1]

Propellerheads Reason 4.0 continues to fascinate me. It is a synthesizer and a sequencer and it can do a fistful of other tasks, but it is not a DAW – a digital audio workstation. It can't record sound from a microphone (as far as I know), and it doesn't want you to hook up a guitar and try to control its various sound-generating faculties that way. And if you want to make a recording that uses Reason with live instruments (which is exactly what I'm trying to do), then you have to hook Reason up through a DAW (I use Apple Logic 8), and that adds a layer of complication.

I have previously mentioned that I want to be able to use this software with the same ease and enjoyment that I bring to guitar, with all of its effects, pedals, and signal processors. The problem has been that I've been hung up on the recording aspect and on connecting Reason to Logic. For some reason (no pun intended), it has taken a little while to get my head around the aux channel strip and the way Reason's audio never shows up in the DAW's editing window, even though it's still there in the mix. These are important concepts for me to become familiar with, but I was not having the joy.

Last night I decided to move in a different direction. I started listening to some music in iTunes and, with the music still playing, I started Reason, with the Axiom 25 keyboard attached, and just tried to play along, fiddling to find the right sound in Reason. Then it hit me – this is how I learned to play guitar! Not by looking at books or videos, but by putting on the music that I love and trying to imitate melody and tone as best I can. Now, the problem is that I don't have any skills on keyboard/piano AND I'm playing an instrument that has 25 keys instead of the preferred 88. But even with those issues, I still felt like I was able to move forward.

Until I tried to do something with drums. Reason was first recommended to me as a great source of loops and percussion potential. Now, I may be worse on a drum kit than I am on a piano, but I'm not above tapping pads to get something like a usable rhythm. And Reason, like any sequencer, has a quantize function that snaps the notes you play onto the grid – you still have to play the right notes. The problem is that the Axiom 25 doesn't work with Reason's drum computer. I don't know why this is, but I've confirmed with some research that this controller is not going to let me do what I want to do in Reason.
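Quantizing itself is a simple idea. This little sketch is not how Reason implements it, just the gist: note times get snapped to the nearest grid line, and nothing else about the performance changes.

```python
# The gist of quantizing: snap each played note onto the nearest grid line.
# Times are in beats; a grid of 0.25 beats means sixteenth notes.

def quantize(note_times, grid=0.25):
    """Return note start times snapped to the nearest multiple of `grid`."""
    return [round(t / grid) * grid for t in note_times]

sloppy_taps = [0.02, 0.48, 1.03, 1.27, 1.55]   # pads tapped slightly off
print(quantize(sloppy_taps))                   # [0.0, 0.5, 1.0, 1.25, 1.5]
```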

So, what does this mean?  Time to go shopping!  If I can get an Akai MPD 24 or 32 for $100 – $125, I’m just going to go for it.  Craigslist showed some potential for both items and they’re both supposed to work with Reason 4.0, although I may need some driver updates from Akai.  I’m looking forward to seeing what happens next.

New original musical microblogging – POWERLESS

[Image: Photo 28]

Notice Uncle Leo hiding behind the guitar in the picture above.

[audio:http://mpomy.com/Music/Powerless.mp3]

Powerless – this is more musical microblogging, very much in the spirit of Hector & Achilles, but with better software.  I’m not sure if it’s a better result.  Anyway, the goal was to get both Guitar Rig and Reason running through Logic.  That worked great, except I couldn’t quite figure out how to use the Logic instruments without triggering the Reason stuff.  And the video tutorial I was working from sucked, so there was still a lot of trial and error.  And Uncle Leo kept walking on the laptop – big surprise there.

So this is a pair of Dr. Rex rhythms coming through Reason (first one, then both patterns), acoustic piano from the Reason NN-19 sampler (but using reverb and chorus from Logic), and a Thor patch called Alan Turing's Dream, also through Reason.

Guitar was supplied by my old Travis Bean, still in need of some cosmetic repair work. Sadly, it was too late at night to run live out of the Lab Series amplifier, so I settled for a cheesy preset from Guitar Rig. You have to hand it to Native Instruments: even the feedback from the Bean's high-output humbuckers is faithfully modeled. Basically this is supposed to be a Vox AC-30 (2×12) with a bit of delay. I don't like these digital models for guitar, but they are damned convenient, especially after hours.

All other sounds were played on the Axiom 25 keyboard, which needed its first hard reset because, when I started, all but three of the keys were non-functional. Fortunately the reset solved that problem.

Hopefully there is decent volume. My old GarageBand efforts are usually pretty quiet. This may be quiet too, because the meters in Logic weren't peaking. To my ears, though, it sounds OK. I haven't tried to automate any panning or really done anything left-right at all. Sorry – no spatial dynamics this time. I'll try to learn that as I get more comfortable with recording, mixing, and mastering.
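One way to check whether a bounce really is quiet, instead of guessing from the meters, is to measure its peak and RMS levels. A rough sketch, assuming the numpy and soundfile libraries and a hypothetical bounced WAV file:

```python
# Quick level check on a bounced mix: peak and RMS in dBFS.
import numpy as np
import soundfile as sf

audio, sample_rate = sf.read("powerless_bounce.wav")  # hypothetical file name

peak = np.max(np.abs(audio))
rms = np.sqrt(np.mean(np.square(audio)))

print(f"Peak level: {20 * np.log10(peak):.1f} dBFS")
print(f"RMS level:  {20 * np.log10(rms):.1f} dBFS")
```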

This was really just an exercise, but I'm very pleased about its faithfulness to the original musical microblogging concept from Fretbuzz.net – almost exactly one minute.

BeatMaker – more Pro Audio for iPhone OS

[Image: img_wave]

I've already gotten sucked into the amazing synth app noise.io, which is an incredibly powerful synth for your iPhone or iPod Touch. I've also been going back and forth trying to figure out how to realize my dream of electronic music and beats mixed with killer live guitar tones. That has led me to Reason 4.0 and the Akai MPC. The latter is a series of stand-alone units that work as samplers and recording studios. An MPC-style controller might be ideal for getting the most out of the powerful Reason 4.0 software, but I know nothing about working such a controller, and they're not free.

Along comes an iPhone app with a forty-one-page instruction manual. BeatMaker features an interface that borrows heavily from both Reason and the MPC, complete with 16 virtual pads to tap out my imagined rhythms. At $20, it might be one of the most expensive apps for sale on iTunes, but it's still about ten times less than a controller that may or may not work with Reason and that I definitely don't know how to use. I think BeatMaker looks like a pretty good alternative. I'm willing to bet that after I read those forty-one pages, I'll be a lot closer to understanding what to do with Reason and a good trigger-pad/MPC-style controller than I am now. And I won't have spent any more than $20.

The other aspect of BeatMaker that is intriguing is an app for your desktop/laptop that lets you take your beats off the phone/pod and actually do something with them.  Now we’re talkin!  And while I may be a bit far from getting that done, I’ve no doubt that, within the next few days, it’ll get done.

Even more musical microblogging

Read more about this one here.  The barnyard animals are FBdN and an unknown talent who may wish to stay unknown after he hears this interpretation of his creativity.  We just keep cranking them out!

[audio:http://mpomy.com/Music/duet.mp3]

I’m working on putting the ‘mental’ back in instruMENTAL.

More musical microblogging

[Image: 71590653_9f0be51a4b]

This is Hector and Achilles, composed, arranged, mixed, and recorded in about two and a half hours. It's GarageBand, with the guitar recorded via microphone through the Lab Series solid-state amplifier. The guitar is an early-'70s SG with strings that need to be changed. Some of the sounds were jacked from GarageBand extension packs obtained through BitTorrent.

[audio:http://mpomy.com/Music/hecotrachilles.mp3]