Challenging musical weekend (gear and band, both…)

I can’t believe I survived this weekend.  To say that it was challenging is a MAJOR understatement.  Two shows, one Friday night with my new band, and one Saturday running sound for a very popular band, had me very stressed out.  The last month or two has brought some changes to my equipment, which caused a not inconsiderable amount of grief.  I just wanted to put down some thoughts on what worked and didn’t, and what I learned (and need to continue to work at).

  • What worked
    • The van.  Best idea I’ve had in FOREVER… I picked up an ’05 Dodge Grand Caravan with Stow-and-go seats.  By putting the seats away, I basically have a very nice cargo van, and the PA fits wonderfully in the back.
    • The ’59 Les Paul reissue.  This is my new main guitar, played it at my gig on Friday for the first time.  I’ve been a PRS only guitarist for many years.  I’ve tried some different guitars, including Les Pauls, but nothing has really ‘stuck’.  Enter a Les Paul Custom Pro that I picked up two years ago.  This guitar rocked my world.  Good quality, great sounds, and very versatile.  This had me revisit my ‘no Gibson’ stance.  I happened across a WONDERFUL 2013 Les Paul that is a reissue of a 1959 Les Paul.  This is the best guitar I’ve played in both feel and tone.  I have some awesome PRS guitars, including Private Stocks, and the Les Paul just captures something that the PRS don’t.
    • The Axe-FX II XL.  This is a digital modeling platform for guitarists.  I’ve been struggling to get a great sound out of the tube amps and pedals that I own.  The amp + pedals have been wildly inconsistent.  Some nights, the tone is amazing, and other nights are a struggle.  To get it to sound even remotely good, there needs to be some volume, and the way that most guitar speakers are oriented, the sound is aimed at the player’s knees.  The Axe gives me a consistent sound that can be run through a monitor rather than a guitar cab, and the monitor can be aimed at my head.  A lot less volume is needed, and the sound is phenomenal.  It needs to be tweaked for the PA a bit better, but that is an easy fix.  I did have one issue where the wah pedal was engaged on each setting by default.  I had to turn the wah off each time I changed patches.  Easy enough to fix, but it needs to be done before next practice.
    • Husky roller crate.  Bought a crate with wheels for all of the cables and ‘stuff’.  Makes for a lot less trips between the van and the gig.
    • Third and fourth sets with the new band.  Things came together, and people danced almost from beginning to the end of the sets.
  • What was OK
    • The Mackie DL32R.  This has been a godsend and a curse all at the same time.  My Line 6 mixer only has 12 XLR inputs and 4 guitar inputs.  For the sound gigs I’ve been doing, 16 inputs is not enough.  Mackie just came out with this new mixer that allows for 32 inputs, 8 of which have dual 1/4″ / XLR connectors, and 14 outputs.  And, the mixer fits in a 3 space rack unit that is very portable.  The mixer relies on having an iPad and a wifi router to work.  So far, the mixer is working well, but I am missing some features that the Line 6 mixer had.  The lack of multi-band compression, a limiter separate from the regular compressor, built-in feedback suppression, a spectrograph on each channel’s EQ screen, output level views, and presets has made me feel like I’ve taken a step backwards.  I’m hoping that NAMM will bring an updated release of the firmware & iPad software to add some of these features.
  • What sucked
    • Trying to play with a new band and set up a new mix at the same time.  This was a VERY bad idea.  First problem was getting to the gig only an hour and a half before we were supposed to play.  Normally, that’s not a problem, as I have presets set up for my old mixer.  The new mixer, however, had not been used for the current band.  I was struggling all evening to mix and play, which never works out well.  There were a couple of points where I just had mental breakdowns trying to solve problems.  Sorry guys.
    • My knowledge.  One thing that I haven’t been doing at home lately is any mixing.  I’ve been running sound for bands, but not really doing my ‘homework’ of figuring out how the new systems react.  I’ve been a bit spoiled by the Line 6 StageScape mixer, as the presets are darned good, and usually require minimal changes to get a good sound.  The Mackie mixer doesn’t have all of the presets, so I have to figure out how to get things to sound great.  After Friday night’s gig, I spent some serious time on Saturday working on understanding some EQ curves for the vocals.  Saturday, I was able to get a much better sound for the band.
    • Loading in and out and setup.  For some reason, even though I’ve bought containers for the equipment to make fewer trips, it seemed to take longer to set up and tear down.  For the Saturday gig, we arrived 2 1/2 hours before the show, and we still felt rushed, and didn’t get a proper sound check.  Last Saturday’s gig was similar.  We really don’t have any more equipment than before, but things seem to be taking more time, not less.
    • The Line 6 speakers.  This is the most frustrating one.  The speakers sound AMAZING.  BUT… one of the speakers has some gawd-awful issue where it starts to ‘splutter’.  This is the digital equivalent of a loose cable.  The problem is that if the speaker starts to do this, all of the OTHER speakers are affected.  This happened last night almost all night.  I thought it might be a bad power line, as it seemed that if I moved the power cable up, or held it up, the nonsense stopped.  I was able to swap the cables, but that brought on a DIFFERENT problem.  I either reset the speaker too fast, or something else was wrong, because at that point, the downstream speakers stopped getting signal.  Part of the issue might have been that I was daisy chaining two AES cables together (NOT recommended).  In the end, I was able to resolve the issues, but I now have a lack of faith in the system.
    • First two sets with the new band.  Between trying to fight with the PA, being late to start, playing a song we’ve never played (with a guest singer, no less), and having two new members of the group, the first two sets were, shall we say, ah, rough?  By the middle of the second set, everything had settled down, and hopefully, no permanent damage was done.

Last part of my thoughts is some solutions:

  • Buy the correct digital cable from Line 6.  A 50′ cable exists that Line 6 recommends.  Not cheap, but certainly cheaper than losing the gigs.
  • Try running the Line 6 speakers in ‘analog’ mode, as if they were just regular speakers.  This would eliminate the problem of one speaker going crazy and taking out the whole PA.
  • Create a checklist for setup & tear down.  That way, the next step is always visible.
  • Practice, both with my playing and my mixing.
  • Get in contact with the people at Mackie and either get on the beta group or at least contact the product manager to give suggestions.

Hopefully this weekend’s lessons will be learned and solutions applied in the future!


Last one for the day… follow-up to my WWDC predictions

https://dscheidt.wordpress.com/2014/06/01/predictions-for-apples-wwdc-event/

Why am I revisiting this NOW?

I thought it was interesting that my ‘predictions’ were almost 100% accurate, if the timeframe is extended out through the end of the year.  The 4k iMac showed up (or rather, the 5k iMac).  So did the Mac Mini.  I was dead on with the port and option predictions.  The only real disappointment is that the Minis are dual core only (no quad core).  I still think the 5k iMac was slated to show up at WWDC, but was probably pulled due to monitor availability.  Also, 10.10 definitely got the flat look.  I thought I would hate the flat style, but once I started running 10.10, the older style looked very dated… If I really push it, the ‘new product’ could be the Apple Watch 🙂  I’m still not sure about the Watch; I will have to wait and see.  I know people who have smart watches, and they seem to like them, but I haven’t figured out how *I* would use it.  With the iPhone and iPad, I instantly knew where I would use them (at least initially!), but the Watch is still a bit of a head scratcher FOR ME.

Still, it was a very ‘Apple’ year with the iPhone 6 and 6 Plus and the iPad Air 2.  I’ve had the 6 Plus since day 2, and have really gotten used to it.  Now, even the 5 / 5s feels ‘small and square’.  The iPad Air 2 has been a good upgrade, enough so that I did purchase one.

Now, let’s get the Mac Pros updated to the upcoming Intel chips and get the 12″ Retina Macbook Air out the door!

Starting a little series… Knockout.js and ASP.NET Web Forms

With my new job, I’m back to programming.  Which means, more programming articles 🙂  So, if programming in .NET isn’t your thing, you can skip these 🙂

A couple of months ago, at my old job, I was speaking with a programmer who was interested in using Knockout.js, but couldn’t because the project that he was working on was an ASP.NET WebForms project.  At the time, everything I’d seen about Knockout.js referred to ASP.NET MVC projects, so I agreed with him that it was a shame, and moved on.  Boy, do I wish I’d been a little more informed at that time.

It turns out that one can use Knockout.js and ASP.NET WebForms JUST FINE TOGETHER!  It does take thinking a bit differently about what the ASP.NET WebForms do, and it is sort of bastardizing the whole WebForms concept, but it can be done, and it’s fairly easy 🙂  By using this style of code, the WebForms become more like MVC pages rather than the usual “onPostBack” code.

The first step to understanding how Knockout.js and WebForms can work together is to learn that .aspx pages can expose WebMethods.  This fact is important, because web methods can send and receive JSON data.  Using JSON (and BSON), a web method can pass the data from a class down to the web page, and it can receive JSON data and turn it back into a class.  There are lots of good articles on using WebMethods in WebForms (example: http://encosia.com/using-jquery-to-directly-call-aspnet-ajax-page-methods/).  The example that I’ve recently found shows how to tie the Knockout.js data binding to the WebMethods: http://www.codeproject.com/Articles/692793/Introduction-to-Knockout-js-and-CRUD-Operations-in.
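To make that concrete, here is a minimal sketch of the page-side code.  The page, class, and method names are placeholders I made up for illustration, not taken from the articles above:

```csharp
// Hypothetical code-behind for a page, e.g. People.aspx.cs
using System.Web.Services;
using System.Web.Script.Services;

public partial class People : System.Web.UI.Page
{
    // A plain class that the built-in JSON serializer can handle.
    public class PersonDto
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // Page methods must be public and static; [ScriptMethod] just makes
    // the JSON response behavior explicit.
    [WebMethod]
    [ScriptMethod(ResponseFormat = ResponseFormat.Json)]
    public static PersonDto GetPerson(int id)
    {
        // In a real page this would come from the database / business layer.
        return new PersonDto { Id = id, Name = "Sample Person" };
    }

    [WebMethod]
    public static string SavePerson(PersonDto person)
    {
        // ASP.NET turns the incoming JSON back into a PersonDto for us.
        return "Saved " + person.Name;
    }
}
```

The Knockout view model on the page then just calls these page methods with $.ajax (POST, contentType ‘application/json; charset=utf-8’) and pushes the returned JSON into its observables; the Encosia article above covers that client half in detail.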

One of the reasons I say that I’m starting a series is because I want to point out the little ‘gotchas’ that pop up when doing some of this coding.  I’m still in the learning (and stealing of ideas) stage, and I want to document what I’m seeing 🙂  All of the things I’m writing at the moment are coming from issues that my new team is running into.

The first gotcha that we ran into was when creating a default Microsoft WebForms project.  The latest default project has a ton of great stuff set up, but some things that have been preset get in the way.  Problem number one is AutoRedirectMode in the App_Start/RouteConfig.cs or App_Start/RouteConfig.vb file.  The default setting is RedirectMode.Permanent.  This needs to be changed to RedirectMode.Off to get the web method calls to work.  (See this article:  http://stackoverflow.com/questions/23667083/i-cannot-get-my-webmethod-to-work-in-asp-net.  Note that the answer is in the comments, not the answers.)
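For reference, here’s roughly what that change looks like in the C# version of the template’s RouteConfig (the VB file is the same idea):

```csharp
// App_Start/RouteConfig.cs
using System.Web.Routing;
using Microsoft.AspNet.FriendlyUrls;

public static class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        var settings = new FriendlyUrlSettings();

        // The template ships with RedirectMode.Permanent, which redirects the
        // .aspx URLs that the page method (WebMethod) calls depend on.
        settings.AutoRedirectMode = RedirectMode.Off;

        routes.EnableFriendlyUrls(settings);
    }
}
```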

There are more challenges that we had to work through, so more articles are on their way!

Wow, 10 years, where does the time go?

Wow, this blog has been (mostly) active for over 10 years.  My writing has been pretty quiet since September.  Not because of a lack of activity, but for the complete opposite reason!  Too much going on!  One of the major things that has happened is that I left my job with my previous company, and have started working with a much smaller company.  I’ve moved from DevOps back into Development, and have actually been working with Visual Basic.NET, Windows Presentation Foundation, and Javascript / jQuery / Knockout.js.  I have written more code in the last two months than I had in the previous 3-4 years, and I’m VERY happy about that!  I am now also mentoring a couple of developers, and doing a LOT of learning myself.

On the music front, the ‘year-and-a-bit more’ of gear continued.  I swear I thought that I was done recently.  Time has a way of changing that, though!  I’ve picked up a LOT of sound gigs lately, and they are paying much better than the late night bar gigs.  To continue doing them, though, I’ve had to make a couple of updates, which I’ll be reviewing soon.  Plus, my taste in guitars has changed a bit recently, too.  I’ve got to say, Fender and Gibson have stepped up their game in the last couple of years, and I think that is directly correlated to the fact that PRS’s sound and build quality are amazing.

It’s been an interesting ride, and it is going to get more interesting!

My guitar rig…

In all the ‘year of gear’ posts, I don’t really think I’ve discussed what has worked and what hasn’t.  I figured it might be a good time to go over what’s working for my guitar playing, and what isn’t…

First up, pedals…

Things I can’t live without now:

One of the best pedals I’ve picked up lately is the Wampler Ego Compressor.  This is an awesome compressor that is very quiet, and has some great features.  One of the best is the parallel compression dial.  This allows the original signal to go through unaffected and be mixed in with the compressed signal, so the pick attack still comes through while the rest of the signal is compressed.

Second ‘can’t live without’ pedal is the ISP Decimator II G String. This is a great noise gate pedal that has a unique feature.  The pedal has two inputs and outputs so that you can run the pedal in two places in your signal chain.  I have the noise gate first in my effects chain, right after the guitar, and also as the first thing in my amp’s effects loop.  This setup eliminates almost all noise when I’m not playing.

Finally, my TC Electronic pedals.  The main ‘can’t live without’ one is the PolyTune 2 pedal.  This is an amazing tuner that I’ve never had any issue with.  I *might* switch it up for the new PolyTune 2 Mini pedal, if all the features are there :).  I also use the TC Electronic Hall of Fame Reverb, the Transition Delay, and the Gravy Chorus.  Great pedals that are easy to dial in and get a good studio sound.

More to come!

New tools to solve an old (.NET Web Service) problem (and an apology)

I ran into an interesting problem today…  One of my clients is converting a Windows app’s database and business logic to web services.  The transition has been surprisingly smooth.  Well, smooth up until today.  Since we are using the old .asmx web services, passing user defined classes via the parameter list is almost a black art.  We were trying to pass a class, BusinessObjectFoo, from the Windows application to the web service.  The web service has a parameter that accepts BusinessObjectFoo, and both the client and the server reference the same library that contains the definition for BusinessObjectFoo.  This should be a no-brainer, and just work, right?  Unfortunately, it doesn’t.  Microsoft abandoned .asmx improvements in .NET 3.0, and left some rather serious holes in the functionality of .asmx web services.  The ability to pass a user defined class via the web service parameters happened to be one of them.

The crux of the problem is that when the client application pulls in the service reference, Visual Studio creates a new definition of the class from the web service’s WSDL rather than matching the class up with the shared library.  (Apparently, this works correctly with WCF services.)  We tried several different ways to allow passing the object via the web service, but had no luck.

Enter my apology… in reviewing code for my normal job, I’d run across something that had been added to the code base with no explanation.  That code was for AutoMapper.  I got aggravated at the developer for throwing another ‘new toy’ into our project.  I did do a little reading on AutoMapper, but didn’t see why it had been added.  After today, I apologize to those devs; this one looks pretty helpful.

Back to the story 🙂

What needed to be done was to move the data from the normal, shared class into one of the web service / auto generated classes.  The first try was to use an object copy routine.  That didn’t go so well.  After that, I realized that I should try the AutoMapper NuGet package, as both objects had the same field structure / interface, just different implementations.  That’s when reading up a bit on AutoMapper, and trying it, really saved the day.  Instead of writing a big, honkin’ copy class between my shared object and the Web Service object, I just mapped the two with AutoMapper.  Then, it was just a simple map call to populate the correct object with all of its data.
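A rough sketch of the idea (BusinessObjectFoo is the class from the post; the generated proxy class name is a stand-in, and this uses the static AutoMapper API that was current at the time):

```csharp
using AutoMapper;

// Shared library class used by the Windows client.
public class BusinessObjectFoo
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Stand-in for the class Visual Studio generates from the .asmx WSDL;
// same shape as BusinessObjectFoo, but a different type.
public class ServiceProxyBusinessObjectFoo
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class FooMapping
{
    static FooMapping()
    {
        // Register the map once; properties with matching names map automatically.
        Mapper.CreateMap<BusinessObjectFoo, ServiceProxyBusinessObjectFoo>();
    }

    public static ServiceProxyBusinessObjectFoo ToProxy(BusinessObjectFoo source)
    {
        // One call replaces the big, honkin' hand-written copy routine.
        return Mapper.Map<BusinessObjectFoo, ServiceProxyBusinessObjectFoo>(source);
    }
}
```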

VERY easy, and saved a ton of coding!

Everything old is new again (or the more things change, the more they stay the same)

Trends tend to go in cycles.  With computers, we’ve seen the old become the new, back to the old, and back to the new.  Take the transformation from mainframes to PCs and back to big servers.  The same thing has happened with computer languages.  One of the most notable things has been the rebirth of the command prompt…  In the dark ages, mainframe / Unix admins used the command prompt to manage the system because nothing else existed.  Then came the GUI tools.  Wonderful!  Visual!  Easy to see, easy to learn.  Unfortunately, difficult to automate.  For one or two servers, that wasn’t a problem.  As tools like Amazon’s cloud computing and Microsoft’s Windows Azure become more prevalent, though, managing by GUI becomes very difficult.  For Unix / Linux systems, there is a very rich scripting ecosystem.  But what about for Windows systems?

Enter Windows PowerShell

PowerShell started as a way to manage systems via the command line using .NET, and has expanded to encompass everything from Exchange to SQL Server to Windows Azure.  The current version of PowerShell is vastly different from the original v1.0, with almost all commands being pipe-able and reusable.  The new remoting features are amazing.  There is even a built-in web site to allow accessing the command line via a web server.  Yes, Unix/Linux admins will go ‘so what, this has all been done with Unix before’.  This is VERY true.  It’s just nice to have this built into Windows as a native component rather than as a third party add-in.  Also, PowerShell is built around .NET, with all of the latest and greatest concepts such as fully dynamic objects and duck typing.

Having said all that, PowerShell is one of my top things ‘to learn’.  After checking out the initial Jump Start Course on the Microsoft Virtual Academy, I’ve been trying to ‘live in PowerShell’.  It certainly makes sense when working with multiple machines.  The remoting alone makes managing multiple web servers in a farm easier.

Fun stuff!

How I learned to stop worrying, and like Windows 8.1.1 (Update 1)…

I had an interesting experience this evening…

One of my friends / family / customers was still running Windows XP on a 6 or 7 year old machine, and was concerned about Windows XP security support finally being cancelled.  The person loved the machine that I had built for them, but just wanted to upgrade.  So, my suggestion was to get a new SSD, and put Windows 8.1 Update 1 on it, as the person didn’t use many programs on the computer, mainly email, surfing, and some Microsoft Office.  The idea got approved, and I ordered a SanDisk Extreme II SSD and a copy of Windows 8.1.

Tonight, I was able to get by and install the new drive and software.  Challenge #1 turned out to be that the current hard drive in the computer was PATA!!!!  Haven’t seen THAT in a LONG time.  I didn’t think that I had a SATA cable, but we were lucky enough to have the box from the original motherboard, which did have the SATA cables in it.  Disaster #1 averted 🙂

After that, the install was *fairly* smooth.  The combination of Windows 8.1 and the SSD made the old computer MUCH faster.  Boot time dropped from two to three minutes down to 23 seconds.  Plus, the screen looked a TON better.  The person had been running a 1920×1080 monitor / TV at a much lower resolution, so the screen looked terrible.  With Windows 8.1, we were able to leave the resolution at 1920 x 1080, but change the scaling to 180%.  MUCH better!  The text was sharp but easier to read.  The system was faster and much more responsive.  Pinning the apps to the task bar felt very ‘Mac like’ for some reason 🙂  Adding a network printer was so simple, it wasn’t even funny.  All in all, an excellent upgrade for a minimal amount of money.

Universal Audio’s Apollo Twin Duo quick review

I’m an equipment junkie, I’ll admit it.  The amount of guitars and recording equipment I’ve played with is well above the average hobbyist.  There are a LOT of promises out there, and rarely does the hype live up to the reality.  The Apollo Twin does live up to the hype, and more…  Now for the full story.

I’ve gone through several prosumer audio interfaces in the last couple of years.  Everything from an M-Audio Delta 66 all the way up to the Avid M-Box 3 Pro.  For the most part, none of them have been truly exceptional.  There have always been issues of some sort with every one of them.  Getting an interface that works 100% of the time has been an absolute challenge.  My last interface, the M-Box 3 Pro, was so bad that I was calling tech support to get a resolution.  After over a year and a half, 3 firmware updates, and updated drivers, the interface STILL wouldn’t work correctly, and the issue seemed to be pretty prevalent on the user forums.  Things got better over time, but it never did work correctly.

At the beginning of the year, Universal Audio brought out the Apollo Twin.  For me and my studio usage, this seemed like an ideal interface for my setup, but there was one problem.  The Apollo Twin is a Thunderbolt interface, but it only has one Thunderbolt port.  I use an Apple MacBook Pro with Thunderbolt, and I drive my external monitor off of the Thunderbolt connection.  That was a real problem until the CalDigit Thunderbolt expansion showed up.  That has an HDMI port that allowed me to drive my monitor, and a second Thunderbolt port that would allow me to connect the interface to the computer.

Last weekend, a fortuitous turn of events allowed me to get an Apollo Twin Duo without breaking my budget.  I was expecting good, but I’ve been jaded enough by lots of interfaces to be expecting some trouble.  Fortunately, my fears were completely unfounded.  The setup of the interface was very straightforward.  There is a link to a video and driver downloads in the box.  Following the instructions was simple and straightforward.  Once I finished, everything just WORKED.  Amazing!!!! So, I pulled up some audio, and hit play…  I had to scrape my jaw off the floor.  I thought that the M-Box 3 Pro was supposed to be top of the line audio.  The Apollo blew it away.  The detail in what was coming out of my speakers was freakin’ AMAZING.  For some reason, the stereo separation is much more apparent on the Apollo.  When I brought up one of the projects I was mixing (my own band’s live show), I was flabbergasted.  I had been struggling to get some balances correct.  With the new interface, I was able to hear it, and correct it almost instantly.  Because the interface is PCIe over Thunderbolt, the buffers and latency are incredible.

I did run into one issue when I first set the interface up… apparently, I had bought one that had been tested and returned to the place I bought it from.  This meant that the first night I had it, a Sunday, I couldn’t register it to get all of the plug-ins that are part of the package.  I sent a support ticket in, and called tech support the next day.  They were able to clear up the registration very quickly, with a minimum of fuss.  They did a great job.

Once I was able to get the plug-ins installed and working, I did a bit of testing… nothing scientific, just replacing some of my other plug-ins that are models of similar equipment to the Apollo’s plug-ins.  Again, blown away is the least I can say about them.  Just switching to the LA-2A compressors in the latest package was like taking a video from 2D to 3D.  The detail is just amazing.  And, the preamp modeling is just crazy.  Running the included 610-B on a guitar input before sending to Amplitube warmed up the signal significantly.  I imagine that running it on vocals is even better.

Ok, enough gushing… what are the drawbacks?  #1 is that for a basic interface, it is expensive.  It’s worth it, but that’s hard to explain.  #2 is that the endless parade of plug-ins is not cheap, in two different ways.  One is money, and two is the processor requirements.  I can’t imagine buying the Twin Solo with just one processor.  I’ve already pushed the Duo to 50% processing power with just a small number of plug-in instances.  Fortunately, it is fairly easy to expand the processing power by buying the expansion units.  I’m hoping that Universal Audio will start making the expansions with Thunderbolt instead of Firewire (and have pass thru functionality!).  I can see an OCTO processor in my future if I keep using the plug-ins!

All-in-all, the Apollo Twin Duo is a great piece of equipment for anyone who is recording or mixing and doesn’t need a ton of I/O.

Predictions for Apple’s WWDC event

So, tomorrow is Apple’s WWDC (World Wide Developer’s Conference) keynote speech.  This is the first Apple-note of the year, and it’s going to be primarily focused on developers.  So far, the banners have shown iOS 8 and OS X 10.10.  What else we are in store for is anyone’s guess.  I have my own predictions 🙂  I have 0 (zero) knowledge of what’s going to be announced tomorrow, and even fewer facts, but I like to at least think through what *could* show up!

  1. 4k iMacs.  Yes, I know that Jim Dalrymple said ‘nope’ to a tweet about new iMacs, but if you read the tweet, it says ‘low cost iMacs’.  Why do I think that 4k iMacs are gonna show?  Well, for one, OS X 10.9.3 just came out with a complete 4K update.  Why do it now?  The current 4k capable Macs have been out for a while (the retina MacBook Pros and the Mac Pro), so there was no urgency to add the 4k feature for them.  If it’s for a product that is due later in the year, then the update could have been addressed later.  I think that the 4k features had to be added before something could ship.
  2. If 4k iMacs show up, I bet a new Cinema display will too.
  3. Updated Mac Mini.  No one I’ve seen is mentioning how old the latest mini is.  The last update was almost 2 years ago.  It certainly can benefit from the updated Intel chipsets and Thunderbolt 2 connectors, not to mention that a lot of the Mac Pro tech could be used on the mini (1 TB SSD anyone?)  I’m wondering if Apple is doing a Mac Pro like make over on the mini. That would be REALLY interesting.  A Mac Pro with only one video card, maybe only two Thunderbolt 2 connectors, and only a few USB 3.0 connectors, and 1/2 the size of the Mac Pro…
  4. I don’t see a Retina MacBook Air coming to light tomorrow.  The Airs just got an update a month or so ago, even if it was just a small bump in specs, and a price drop.
  5. Something totally new???? WWDC has been interesting the last couple of years because Apple has been using this Apple-note to launch the VERY high end products.  2 years ago, the retina MacBook Pro.  Last year, the Mac Pro.
  6. I’ve seen someone pining for a 17″ retina MacBook Pro.  I have the last model year 17″ Macbook Pro, and the one thing it DOESN’T need is to double the resolution… 1920 x 1200 is bad enough!!!!  My eyesight isn’t that good anymore!!!! 🙂
  7. Please, no demo like the Anki drive tomorrow.  That whole thing was awkward and Microsoft-ish.  I think the Anki stuff is cool, but it was long, and not terribly pertinent to the Dev focus.

I have no real ideas about iOS 8 and OS X 10.10.  I’m resigned to the fact that OS X will probably get the iOS 7 flat treatment.  I bet that there will be a new Xcode release or announcement.  Apple is practically screaming ‘Developers, Developers, Developers’, so Xcode seems like a logical talking point.  The Healthbook rumors are getting interesting.  I hope that iBooks for iOS and OS X get updated.

At least tomorrow will be interesting!!!!

Solving a problem with TFS and building regular applications using TFSBuild

I’ve had a problem that has been driving me crazier than usual…  I’ve been trying to get the TFS Build process to create the ‘drop’ directories based upon the projects, rather than one huge glob directory of every file from every project.  Surprisingly, the default behavior of TFS Build is to drop every file into one directory, EXCEPT for web sites.  What I’ve run into is needing the applications and utility programs that are part of the solution, and need to be deployed, to end up in their own directories when the build completes.  After spending a week on rather exotic solutions, including modifying the TFS Build definitions, writing all sorts of scripts, and looking at every package under the sun to solve the issue, I finally came across the CORRECT solution… tacking a property onto the MSBuild directive called GenerateProjectSpecificOutputFolder.  Setting that to true outputs to the per project directory structure.  This is EXACTLY what I’ve been looking for!  Thank you, Jason Stangroome, for this WONDERFUL find!!!

Override the TFS Team Build OutDir property in .NET 4.5

 

Update… If you use the ReleaseTfvcTemplate.12.xaml from the Release Management 2013 Update 2 client directory, the tokenization steps are missing.  Here is a link to a template that has the correct tokenization step, plus has the tokenization as a flag.  Cool stuff…

http://blogs.msdn.com/b/visualstudioalm/archive/2013/12/09/how-to-modify-the-build-process-template-to-use-the-option-trigger-release-from-build.aspx

Great tool for Recording!

This week, FabFilter is having a sale on their plug-ins.  All of their plug-ins are excellent tools, but I wanted to recommend two of them in particular.

The first is the Pro-Q EQ plug-in.  I know that all of the DAWs have good EQ plug-ins built in, but this one has some rather unique properties that make it worth having in the tool box.  One of the features that really helps me is the analyzer display.  There is a setting for Pre + Post analysis.  This makes it super easy to see what you are doing to the original sound coming in.  With the 26 parametric EQ points and the analyzer, the sounds can really be shaped in a very visual fashion.

The second plug-in from FabFilter that I like is the Pro-C Compressor (and by extension the Pro-L Limiter and Pro-MB Multi-band compressor).  Compression to most musicians is a black art… done right, it seems to make things ‘better’, but done wrong, it can completely drive you crazy.  What I like about the Pro-C is that it shows you what it is doing as it is working.  By drawing a continuous line of the internal level, it gives you the ability to see how the compression ratio, attack, and release are behaving.

Great plug-ins worth their full price, and even better on sale!

Line 6 2/3s a DreamStage Review…

Quick review:

I *finally* got my Line 6 L3T speakers and L3S subwoofers.  I was able to pick up floor models / demos at a great price, and was finally able to bring them home for last weekend’s gig.  There has not been a lot of talk about these speakers, other than at the initial release.  I’ve heard them before, though, especially connected to the Line 6 StageScape, and they sound amazing.  They are loud, but not piercingly loud.  Cheaper speakers seem to have a lot of extra high frequencies that these speakers do not have.  They are very full sounding without being overbearing.  I love how the speakers connect digitally to the mixing board.  One cable from the board to a speaker, then daisy chaining all the way out, with complete assignability.

Why did I say 2/3s?  We still have regular monitors.  I’m keeping my eyes out for the L2Ms for monitors!

I’m looking forward to using these for a LONG time!

NAMM thoughts, and a gig…

So, this week was NAMM.  NAMM is one of the large music creation / production industry trade shows, held out in California.  Most of the products announced here will show up over the next 6 months to a year.  This is where all the new toys, goodies, and trends start.  After reading a lot of the forums and following as much of the press as possible, I just wanted to sum up what seemed interesting to ME with some editorial on my thoughts (Your Mileage May Vary)…

First off, true Thunderbolt audio interfaces started showing up.  Between Motu’s 828x, Zoom’s new interface, and Universal Audio’s Apollo Twin, true Thunderbolt audio interfaces are finally arriving.  There is one small problem with all of the new interfaces, though… no pass through.  For me, that’s a deal breaker.  I have a 2011 MacBook Pro, which has a Thunderbolt port that pulls double duty as a Thunderbolt to DVI connector.  That makes these new audio devices pretty useless for me right now.  Still, it’s GREAT to see the Thunderbolt port being used, and with the Apollo, because the device is also being used to off-load effects, the PCIe bus allows for a LOT of streams to be going back and forth.  I’m surprised that more of the audio interfaces haven’t moved to ‘true’ USB 3.0, as that would open up a lot of bandwidth for higher input track counts.

Second, more and more digital mixers are getting built.  This is kind of a double-edged sword.  Personally, I LOVE the new digital mixers and am amazed at what companies like QSC, Allen and Heath, and Behringer are putting in at the price point that the old basic Mackie boards go for.  The counterpoint to that is the old saying ‘with great power comes great responsibility’.  These mixers can do amazing things if you understand what needs to be done, and you have someone actually running the board.  The Line 6 StageScape mixers are pretty unique in that they focus on doing the mixing as a musician would, not a sound tech.  From what I’ve read about the QSC mixers, they have a similar idea, but not as ‘friendly’ as the StageScape is.  The thing is, you still need someone to run the board to get an effective mix.  I’ve been moonlighting with my P.A., mixing a friend’s band.  It always amazes me how much adding or removing just 1 or 2 dB of sound changes the mix.  Having someone who knows when to push the instruments forward and when to pull them back can make the night much more enjoyable for the entire room.  It’s unfortunate that the clubs really aren’t paying groups enough to actually pay a sound tech to run the shows.  It would make a HUGE difference in the performances.  Back to the original subject, the new mixers are getting more and more amazing with each iteration.  The Behringer X18 (was the X16) looks like the biggest winner, if and when they get it out to the public.  It was supposed to be released last year, but a redesign came about, and it certainly looks worth it.  This space is certainly going to get better and better.

Finally, there was some guitar stuff that was interesting.  Fender’s new Strat Deluxe Plus, with its easy part replacement and personality cards, looks really cool.  Talk about a tweaker’s delight!  Pickups can be changed with no wiring to do, and the characteristics of the pickups, selector switch, and tone knobs can be changed by popping in personality cards.  Very cool.  Everything else was a bit ho-hum.  Line 6’s ‘amplifier redesign’ looks interesting, but there seemed to be some basic features left out, like direct outs, that leave me scratching my head at the market that they are going for.  It’s also interesting that they are using the older Pod X3 technology, rather than the newer HD technology.  One company that wasn’t at NAMM, but had a really interesting beta release, is Fractal Audio with the Axe-FX II.  The guy who designs and builds those things is crazy about capturing the nuances of a tube amplifier in a DSP, and he is close, if he hasn’t already surpassed a good tube amp.  The Axe is on my ‘next to buy’ list.

So, how does this tie into playing a gig this weekend?

The music instrument business seems to be going in two different directions.  You either have insanely specialized equipment that is VERY expensive, or you have things that ‘do everything’ at rock bottom prices with lots of trade-offs.  The middle ground equipment seems to be getting lost.  I’ll go to the Line 6 example.  The new amp is interesting because it’s a digital amp that is controlled via bluetooth with an iPad or iPhone, but it still has some basic controls on the amp itself.  The power wattage is enough to play live, and the price is fairly low, but I doubt that many people will use them live.  There’s just too much that has to go right to get it to sound good.  Not putting in a line out takes away a lot of the usability.  Heck, I just ran a show for someone who had an old Line 6 Flextone III, and we used the XLR out directly to the board to get a great sound.  I guess it just feels like the companies are really focused on the bedroom musician, and not the performing musician.  Amps are either 100 watt tube monsters or 10 watt recording amps.  Same thing with the guitars: great quality instruments are either insanely expensive or not up to gigging standards.

And, the million dollar question is… do we actually NEED a lot of this stuff?  For my gig this weekend, the power supply that I use to run my pedals went haywire.  I couldn’t run my entire board like I normally do.  Fortunately, I had a backup power supply, but it only allowed me to run my most important pedals (which were my wah, tuner, compressor, and noise gate).  I ended up having a pretty stripped down sound, and you know what?  That was all I really needed to play for 4 sets.  (It’s fortunate that we don’t play too many ‘effect-y’ songs.)

Still, it’s a great time to be a musician!  Lots of great tools to help produce better and better quality music!

Wow… 9 years already

I knew that I started blogging in December… just didn’t realize it was over 9 years ago!!!!

What’s amazing is how much has changed in those nine years…

  • Homeowner
  • No more Corvettes, until I have enough money to actually keep one serviced correctly 🙂
  • Have stayed at my current job for almost 7 years
  • Have switched from Dev to DevOps
  • Have switched from PCs to Macs
  • Have actually been playing music, recording, mixing, and running sound for people rather than being academic
  • Cat changes (please do not ask me about it)
  • No longer a PRS only guitarist 🙂
  • Switched from flip phones to iPhone
  • Mainly use iPad around the house instead of a computer
  • Went from 1.5 meg DSL to 50+ meg cable
  • Went from a single CRT TV to two LED TVs
  • Gained weight, lost it, and then found it (and it brought friends)
  • Have performed more than 200 times for people during the time frame

To say the least, it’s been an amazing ride.  Looking forward to year 10, as this is a year with lots changing, and lots for me to write about.

A decent Christmas for once

Good day today.  Since Christmas is probably one of only 3 days in the year that I can almost guarantee no support calls, I decided to do a quick one day trip to Walt Disney World.  Living in South Florida, playing tourist is a LOT easier for me than it is for most people.  Unfortunately, I almost never do.  I’ve been thinking about a Disney Christmas for many years, but this year was the one where I finally got off my rear and did it.

And ya know what?

I had a blast!  Heck, I didn’t even ride any major rides.  I just got out, and spent a lot of time walking and just enjoying the day.  So, no rant about prices, lines, times, food, general rudeness, etc… All of that definitely could rile me up, but today was about relaxing and enjoying life.  I was lucky that I was able to even make it into the park; they hit capacity by 11 a.m. (I got into the park around 10:00).  Plus, having to focus on ‘dealing’ with it all helped me be a lot more cheerful than I usually am during the holidays.

Now, if one is going there to ‘do Disney’, all I can say is Christmas is NOT the time.  Check the internet for best times to go, and follow that!  If you just want to enjoy some serious people watching, and can take the nuttiness, Christmas can be a LOT of fun.

New iPad Air day! And some thoughts on the latest Apple software

Wow… In case you missed Apple’s event last week, you missed a TON of announcements.  Instead of the event being iPad focused with little info about the Macs, it was completely the opposite.  The event was 2/3rds Mac, and 1/3 iPad.  Quick summary:

  • More info about the Mac Pro
  • Updated MacBook Pros and MacBook Airs
  • OS X 10.9 Mavericks released, free
  • iWork updated on both OS X and iOS, all free
  • iLife updated on both OS X and iOS, all free
  • New full sized iPad, called iPad Air, shipping Nov 1st
  • New iPad mini, now with Retina, shipping sometime in Nov.

Being the stupid, bleeding edge person that I am, I almost immediately installed Mavericks, iWork, and iLife (plus the small update to Logic).  My first impressions are very positive.  The Mavericks upgrade is a no brainer.  Unless one has the initial 32-bit Intel processors (the absolute first Intel Macs), Mavericks will install on the machine.

The good:  So far, the OS part of Mavericks is awesome.  The boot time on my MacBook Pro decreased significantly.  I don’t know if that’s because Mavericks has a better handle on my SSD, or if some other voodoo is going on, but the boot up, and general responsiveness of my computer seems to have gotten better.  Unfortunately, I don’t have benchmarks, but ‘it feels faster’ works for me.

The bad: I’m coming to grips with the changes in applications.  I’m one of the people who like the skeuomorphic apps like the calendar, notepad, and contacts.  The new apps are a bit bland.  Sometimes, skeuomorphism really works.  In my Digital Audio Workstations, plug-ins that look like their real counterparts seem to be much more accepted than ones that have abstract interfaces.

Then there is some ugly:  The multi monitor implementation might work for most, but I hate it.  I’m just glad that there is a check box allowing one to go back to the ‘old’ screen system.  iBooks is another issue.  Did anyone actually try to READ a book on a Mac with the iBooks program?  Trying to get a large, single page view is impossible.  I expect iBooks for Mac is why iBooks for iOS didn’t get updated.  The iBooks team has probably been feverishly working on iBooks for OS X, and had to rip features out of the program to get it to ship.  Hopefully, iBooks will get updated a LOT.

As for the other apps, I don’t use Pages, Numbers, Keynote, iMovie, and Garageband that much.  Garageband is more and more ‘Logic Lite’, as the latest Logic features, like Drummer, are showing up in it.  The other apps seem to have regressed a bit, as lots of features have been pulled to allow the apps to get a solid baseline.  I expect the extra features will come back, but only when each feature can be implemented across OS X, iOS, and the web sites.

All in all, the hardware and software are great updates, and show that Apple isn’t only focusing on the phones and tablets.

 

I had also told myself before the presentation that my current iPad, the 3, was ‘good enough’.  Well, that was until the iPad Air was announced.  A lot of people have been calling the iPad Air a ‘boring’ release.  As for me, once the iPad Air’s information showed up, I decided to sell the iPad 3 AND my iPad mini to fund an upgrade to the Air.  What makes this release not boring TO ME is several things: one, the weight dropped by almost 30%.  The new Air falls squarely in the middle of an iPad 4 and an iPad mini for weight.  At the same time, they reduced the size of it and changed it to be more ‘iPad mini’ looking.  I’m undecided on whether I like the shrinking of the size.  On one hand, one can use the Air one handed.  But for someone like me who uses a stylus a bit, the old edges were exactly right.  Now, there’s nowhere to rest one’s wrist.  Next up is the processor upgrade.  This one is huge.  The iPad 3 has always been a bit lag-y.  I’m sure the 4 helped, but I’d skipped that version.  Still, the new processor is MILES ahead of even the 4.  Plus, the 64-bit change over makes the processor a very interesting upgrade.  And finally, 128 gigs of storage.  Yes, this was there for the 4, but the extra space was not enough to cause me to go out and upgrade from the 3 to the 4.  However, when you put all of the features together, the iPad Air became a no brainer.

So far, so good.  Apple is executing well, and I’m personally very happy with all the new toys!

 

What’s this? A GOOD week for Microsoft?

Ok, I admit it… I’m an Apple snob.  Apple has been firing on all cylinders since the launch of OS X 10.4, and hasn’t appeared to be slowing down.  The switch to Intel drew me in, and I haven’t looked back.  For work though, I live in a different world.  I’m fortunate to work at a company that allows me to have a Mac Pro desktop, a Macbook Pro laptop, and an iPhone; iPhones and iPads rule the roost for phones and tablets.  With Parallels, VMWare, RDP, and Back to My Mac, I can live in both worlds and be VERY happy.  So, I do still keep up with Microsoft.

This week was a big week for the Evil Empire (Microsoft, not Apple!).  Lots of goodies came out:

  1. Windows 8.1 – This goes a LONG way to fixing the absolute nuclear disaster that Windows 8 is.  8.1 doesn’t fix everything, but it does fix a lot.  It’s amazing that Microsoft realized how bad 8 was and worked quickly to resolve it.  Even better, 8.1 is a free, IN PLACE upgrade to Windows 8.  Microsoft is finally learning how to do in place updates.  Hopefully the days of reinstalling everything for the Windows crowd will soon be behind them.  Also, Windows 8.1 is a more efficient OS.  On the Virtual Machines that I run at work and home, Win 8.1 is MUCH faster than 7.  My guess is that the reduced video card requirements help out in that area.
  2. Visual Studio 2013 – This update is a lot bigger than most people realize.  Visual Studio 2012 had the same flaw that Windows 8 had… it completely failed at doing the job it was supposed to do.  The .NET 4.5 framework has gotten better and better, but VS 2012 made developing for it truly horrible.  VS 2013 fixed a LOT of the issues.  The Team Foundation part of VS 2013 is very usable, and the new IDE tools in 2013 make programs like ReSharper and CodeRush not so much requirements any more.  Plus, the Database tools have come back, and are better than ever.  The loss of the scripting engine in the IDE still hasn’t been addressed, and it’s doubtful that it ever will come back, unless it is as PowerShell.
  3. Remote Desktop Client for non-Windows platforms – Microsoft released Remote Desktop clients for Android and iOS, plus did a SIGNIFICANT upgrade to the Mac desktop client.  It’s funny, almost all the news sites have talked about the Android / iOS client, but none have reviewed the OS X client.  It’s not perfect, but it certainly addresses a LOT of features that have been missing.  Being able to use the Remote Desktop with a remote desktop gateway and true multi-monitor support have been great.

Hopefully, this represents a new direction from Microsoft.  Many of their products have finally matured to the point of going from ‘it sorta works’ to ‘I love working with it’.  Some examples: Outlook.com finally supporting IMAP, which makes using the service with a non-Microsoft email client useful (deletes and reads are global!!! Hooray!).  SkyDrive rocks.  Azure is competitive.  Lots of little things across the board just seem to finally be coming together.

Good job Microsoft!

I gigged a Tele last night (and I think I liked it!)

The ‘year of gear’ is finally winding down.  I’ve been on an absolute tear with guitars, recording equipment, and PA equipment.  But, that’s not really what this post is about.

This post is about finally understanding some of the physics of getting a great sound in a band setting.

This one starts with me watching some videos on how to play guitar in a church setting.  The instructor is playing a Fender Telecaster Deluxe, which is basically a noiseless version of the standard Telecaster (Tele for short).  He’s getting some great tones with a very basic amp and pedal setup.  In the back of my mind, I filed the Tele Deluxe away as a ‘cool guitar to keep an eye out for’.  They are not terribly expensive, when compared to my PRS guitars, but they aren’t free, either.  So, I’ve been keeping an eye out for new or used, but I haven’t come across anything that fits.  I’ve played new ones that I didn’t like, and most used ones weren’t really there, either.  I tried non deluxe models, but I like the thinner neck of the Deluxe, and I REALLY like the noiselessness of the pickups, even if they don’t sound as twangy.

Well, about a month ago, someone I work with, who is also a guitar aficionado, tells me about trying out a used Tele at the GC near where we work.  I *drag* it out of him that it’s a Tele Deluxe like I’m looking for, at a great price because it is used.  One call later, I confirm it’s what I’m looking for, so off to GC at lunch for a test drive.  I was blown away by this guitar!  Plugged into an Orange Rockerverb 1/2 stack, this guitar just JUMPED.  The action on the strings was set pretty low, which is where I like it.  The guitar just had some great authority.  It wasn’t the twangiest of Tele sounds, it was a bit smoother, but it still had some great bite.  I had to wait until GC released it (the guitar had literally been traded in two or three days before, and they wait 30 days to validate that equipment isn’t stolen), but last week I was able to finally pick it up.

I really liked the feel of the guitar, so I went ahead and put new strings on it, and took it to my gig on Saturday night.  I had two PRS as backup, and just wanted to hear how the Tele sounded through my rig, and see if it would work.  Well… I ended up playing it for three sets.  I’d have played it the entire night, except the screw holding the strap knob seemed to come out.  I don’t know if it was that way before, but it was kinda weird.  I tightened the screw up when I got home; I’ll have to see if it needs a repair.  The REASON I played it for the whole night?  The sound.  The Tele complemented my band REALLY well.  We play a mix of classic, grunge, and modern danceable rock.  Everything from Steppenwolf to My Darkest Days, with some country thrown in.  This guitar handled it all, and handled it incredibly well.  What I loved was how it sat in the mix.  I was able to hear myself without needing to turn up.  Also, I loved how I could switch from chords to riffs, and the volume level didn’t drop off or jump up.  The guitar played nice with all my pedals, and I was actually able to use the pedals to really get different tones.  I had picked up a great compressor recently, but with my other guitars, it doesn’t really make a huge difference.  With the Tele, it fattened the sound up, without being in your face.  Next gig is this Friday; we will have to see how it works then.

Will I get rid of my PRS?  LOL… Umm. no.  The guitars I currently have are all very amazing instruments, each with a different voice and reason to play.  But, I will test the Tele again this week… I think I like it!

Jammin’

This is kind of a zen post, a bit of train of thought stuff, but I wanted to capture the moment…

I picked up an interesting guitar book recently, Introduction to Jazz Guitar Soloing.  In the first couple of pages, the author made an interesting statement, ‘Most key centered solos can be compared to pointless conversation’, and that really hit home.

Ok, gotta back up and explain what that means before moving forward…

When playing a solo over chord changes, there is usually a specific ‘key’ of notes that can be used over the whole chord progression.  One can play any note in the key, and it will work with the overall tonality of the chords, and sound OK.  Personally, I don’t think that there is much wrong with this, but it amounts to ‘noodling’ (something I do a fair amount of! 🙂 )

What the author is pointing out is that without some idea of where you are going, or what you are trying to achieve with a solo, it can be just kind of a ‘random note’ thing.  Of course, he has his ideas of how to not have ‘pointless conversation’, which are interesting to absorb.

One of the first exercises in the book is to play over a progression, and record it.  This is great practice, as it gives one a better idea of how one REALLY sounds 🙂  Tonight, I decided to create my own backing track to play over for the exercise.  I decided to play around with Logic X, as the new Drummer feature is incredible for doing good backing tracks.  I recorded a couple of bass lines to add some basic definition, and then proceeded to jam out on the guitar.  I didn’t even dial in a fancy sound or anything, just used Amplitube’s default patch of a nice clean sound.  Wow, did I have some fun.  My little backing track had enough dynamics for me to switch up a bit on, and I tried a few of the different techniques from the book.  That turned into some very cool stuff.  I haven’t played like this in a while, and, boy, did it feel good to have some nice melody and dynamics, and it was pretty much effortless.

It just all comes back around to having fun making music.

Ok, Zen off… 🙂

Correction on something I said about Logic X

I have a correction to make to one of my posts about Logic X.  In this blog post, https://dscheidt.wordpress.com/2013/07/22/decisions-decisions-decisions-personal-daw-comparison-of-logic-x-studio-one-and-pro-tools-11-with-industry-thought/ I incorrectly stated that Logic X only did strip silence destructively on the audio file.  That is incorrect.  There is a ‘Strip Silence’ command that works on the clip in the Arrange view.  The menu item is hidden on the toolbar.  That is exactly what I was looking for.

Adventures in Guitar Amp Modeling, Amplitube style

Today, IK Multimedia released an update to Amplitube.  Ok, so they’ve been doing this for a long time, why is today a big deal?  Well, 2 things… first off, the latest version is compatible with ProTools 11.  But that’s not what this article is about 🙂  The second thing is that today they added a model of a pedal that I actually own, the Fulltone OCD pedal.  Again you ask, why is that interesting?  Because this is the first time I can personally compare the program / model to the real thing, as I have a Fulltone OCD on my pedal board.  The long and short of it is that the model is excellent, and works like the real thing.  VERY impressed.

So, why the blog post, then?

One thing that I’ve seen is that guitar rig modeling has gotten a bad rap over the years.  There have been some very crappy pieces of software that have deserved to be ridiculed.  In fact, Amplitube was one of them!  There have been lots of pieces of software that have sounded hideous.  And, probably every one of them is somewhere on a ‘professional’ recording 🙂  Whaaaaa!?!   How’s that?  Well, the current modeling software can sound darn good, but there are some things that one needs to do to get that good sound.

1.  Know what sound you want.  This is important.  Certain modeling software does certain things.  Check the software to see if it has the amp you are looking for.  Looking for high gain?  or low crunch?  Some programs are stronger than others.  Can you live with it inside of a DAW?  New DAW software has some great plug-ins, but they only work in each particular program.

2.  Accept that it’s not going to be the ‘real amp’.  This is what hangs most guitarists up.  Tubes react a certain way.  No matter how much is tried, digital circuits don’t feel the same.  Keyboard players have had to deal with this for years.  NOTHING sounds or feels like a grand piano, yet every keyboard tries to get there.  It’s just a lot easier to carry today’s keyboards than it is to lug a baby grand.  Guitarists are headed in the same direction.  Turning up amps to 11 has gone the way of the dodo in most places.

3.  Know how the real hardware works!  This is one of the reasons I wrote this post.  The current rig modeling is getting things EXACTLY right.  Which means, you have to know how the things that are being modeled really work.  Take the OCD pedal model that came out today.  One thing on the real pedal that I had to learn is that unity gain (where the signal out of the pedal isn’t louder with the pedal on than when it is off) is at somewhere around 9 o’clock on the level knob.  The convention on pedals is that the 12 o’clock position is unity.  Well, the model has it exactly right.  Unity on the pedal model is at 9 o’clock.  That can definitely negatively affect the sound, and I would not have known that unless I’d used the real thing (or experimented a lot).  So understanding what is being modeled, and knowing how to make that sound good, definitely helps.

4.  Good hardware.  This starts with a good guitar, good cables, a good interface, good computer, good software, and good speakers.  Skimp on any one of those, and the sound is going to be horrible.  I recently changed my Line 6 Toneport UX8 audio interface out for an MBox Pro 3, my M-Audio BX-5 speakers for Mackie 824mk2s, and my PRS guitars out for a Les Paul.  (Aside, PRS are still my beloved #1s, but I found an awesome Les Paul that gets some playing time).  I’ve switched between several Amp / Rig simulator programs / plug-ins, and I’ve settled on Amplitube currently.  These changes have made an enormous difference in the sound.  It sounded OK before the changes.  Now, my guitar tone at low volumes is great, very much what I am looking for.

5.  Guitar cabinet simulations are important.  Impulse Response (IR for short) cabinet models go a LONG way to making a modeler sound good.  Also, modeled mic and room positions help.  One trick I use with Amplitube is the room mics on the cabinet.  Amplitube has 4 sound points.  You can have two mics on the cabinet, plus two ‘room mics’.  Mixing the room into the output can really open up the sound, and take away the digital harshness that modelers are known for.

6.  Don’t be afraid to throw what anyone else says out the window.  The number one rule is that ‘if it sounds good, it is good’.  Experimenting with modelers can be very interesting.  Find a sound you like?  Save the settings, and then it’s easy to get back.

All in all, there’s a lot of great programs out there!

Got Logic X? Got a Faderport? Wanna know how to make them work together?

Talk about poor communications…

Over two years ago, Logic 9.1 went 64-bit optional.  The Presonus Faderport only had a 32-bit configuration file for Logic.  So, for over two years, I basically put Logic on the shelf, as the time to move to 64-bit was then.  Studio One supported 64-bit and the Faderport, so it made no sense why Logic would not have the same abilities.  It is, after all, just a MIDI device.  So, after two years, no driver updates, no install package updates for the Faderport, and no communication on the message board.  Cue Logic X.  Not only is it pretty awesome, it happens to be 64-bit ONLY.  So, the Faderport is useless, right?  I even wrote on the Presonus forums asking about the compatibility between Logic X and the Faderport.  No answer.  That board makes most graveyards seem jumping.  So, I was perusing the forum tonight, and noticed someone had made a post to a thread from 2010 about Logic.  In the new message on the topic, someone mentioned that there was a Presonus recompiled bundle for 64-bit Logic!!!!  Hot DAYUM!!!!  Click on the link, and sure enough, it’s all legit.  Pull the bundle in and voila!  Logic X and the FaderPort are best buddies!  Here’s the perverse part… the following tech note has been available SINCE AUGUST 2011!!!!!  WTH!?!  There’s no sticky on the forum, and I have never seen anyone say ‘boo’ about it.  This is crazy!  The answer has been on Presonus’ site for YEARS, and no one has pointed it out.  Crazy!

Well, if anyone else needs it, here’s the link:

http://support.presonus.com/entries/20399677-faderport-logic-64-bit

Edit:

It appears that Presonus updated the page with worse instructions than before…  To install the bundle that is attached to the link, start by using ‘Show Package Contents’ on the Logic Pro X application in the Applications folder.  Drag the Faderport bundle into the Contents -> Midi Devices Plug-ins folder.  Start Logic at that point, and the Faderport should just work.