Thursday, December 31, 2009

Software Deployment: Old vs New

I’m pumped on some really kick-ass strong coffee right now, so I may be typing way ahead of my brain, but I can’t be sure.  Anyhow, here goes…

Part of (or most of, depending upon whom you ask) my “day job” revolves around packaging software for mass deployment.  What does that mean, you ask?  In layman’s terms, it means we take a commercial software product (not always, but usually) and wrap it with some programming code (aka “script”) so that it installs without prompting for any input or clicking any “next” buttons along the way.  We do more than that, too.  We remove or create shortcuts, copy additional files, modify registry keys, and so on.  Anything we need to do so that it installs in a way that suits the needs of our customer.  Our customer is a huge, 10,000-headed beast that beats on computers with four hands and never looks at the keyboard.  So we have to nail everything down ahead of time to weather the storm.
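To make that concrete, here’s a toy sketch of the kind of wrapper logic involved (shown in Python purely for illustration; real shops use whatever scripting tool they like, and every product name and path below is made up):

  # Hypothetical silent-install wrapper: run an MSI with no UI and verbose
  # logging, then apply a post-install tweak. The msiexec switches shown
  # (/i, /qn, /l*v, TRANSFORMS=) are standard Windows Installer options.
  import subprocess
  import winreg  # Windows-only standard library module

  def install_silently():
      cmd = [
          "msiexec", "/i", r"\\server\packages\SomeApp\SomeApp.msi",
          "/qn", "/l*v", r"C:\Windows\Temp\SomeApp_install.log",
          r"TRANSFORMS=\\server\packages\SomeApp\corp_settings.mst",
      ]
      return subprocess.call(cmd)

  def post_install_tweaks():
      # Example tweak: disable the vendor's auto-update nag (made-up key)
      key = winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE,
                             r"SOFTWARE\SomeVendor\SomeApp")
      winreg.SetValueEx(key, "AutoUpdate", 0, winreg.REG_DWORD, 0)
      winreg.CloseKey(key)

  if __name__ == "__main__":
      if install_silently() == 0:
          post_install_tweaks()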

The Old Way: Package and Install

Because we’re pushing out unattended installations, we have to take enough time early on to predict problems and scenarios, and build in solutions to handle them without any eyeballs watching it “go”.  We find out later how it went by collecting log files and looking at reports.  It’s a laborious process that involves following a meticulous path from start to finish.

Questions that need to be addressed along the way:

  • Is this a new install or an upgrade?
  • Does it depend on any other products or components?
  • Are there any previous versions on the clients?
  • Do previous versions need to be removed before running the install?
  • Are there issues with machine vs user settings?
  • Are there prerequisite issues like JRE, .NET, etc. to worry about?
  • Does it install with an MSI package or older Setup.exe garbage?
  • Does the vendor even support an unattended installation?
  • What system requirements need to be validated?
  • What conflicts are known with other products or components?
  • Does the product create or rely on any services running?
  • How does the product behave on first-launch for a limited-rights user?
  • Does it run properly with a limited-rights user?
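
A few of those checks lend themselves directly to code.  For instance, the “any previous versions on the clients?” question usually comes down to scanning the standard Uninstall registry hive.  A rough sketch (Python for illustration; the product name is hypothetical):

  import winreg

  UNINSTALL_ROOT = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"

  def find_installed(name_fragment):
      """Return (DisplayName, DisplayVersion, subkey) for matching products."""
      found = []
      root = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL_ROOT)
      i = 0
      while True:
          try:
              sub = winreg.EnumKey(root, i)
          except OSError:              # no more subkeys
              break
          i += 1
          try:
              k = winreg.OpenKey(root, sub)
              name, _ = winreg.QueryValueEx(k, "DisplayName")
              version, _ = winreg.QueryValueEx(k, "DisplayVersion")
          except OSError:              # entry lacks the values we need
              continue
          if name_fragment.lower() in name.lower():
              found.append((name, version, sub))
      return found

  print(find_installed("SomeApp"))     # hypothetical product name

(On 64-bit Windows you’d scan the Wow6432Node hive as well.)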

And this is only dealing with the installation.  We also have to build in the logic and code to handle a requested uninstall.  That involves other steps:

  • How does the product handle uninstalls?  MSI or Setup.exe?
  • What is left behind? Folders? Files? Registry keys? Services?
  • Does it leave behind user-profile components that need to be removed?
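
The “what’s left behind” part typically turns into a scrub routine like this sketch (all the leftover locations are made up, and real clean-up code also has to recurse through registry subkeys):

  import os
  import shutil
  import winreg

  # Hypothetical leftovers this particular product is known to strew around
  LEFTOVER_DIRS = [r"C:\Program Files\SomeApp", r"C:\ProgramData\SomeVendor"]
  LEFTOVER_KEYS = [(winreg.HKEY_LOCAL_MACHINE, r"SOFTWARE\SomeVendor")]

  def scrub_leftovers():
      for d in LEFTOVER_DIRS:
          if os.path.isdir(d):
              shutil.rmtree(d, ignore_errors=True)
              print("removed dir:", d)
      for hive, path in LEFTOVER_KEYS:
          try:
              winreg.DeleteKey(hive, path)   # only works on childless keys;
              print("removed key:", path)    # real code recurses first
          except OSError:
              pass                           # already gone (or has subkeys)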

So our installation code might typically follow this logic path:

  1. Check user rights
  2. Check system requirements
  3. Check for existing previous versions
  4. Remove (and clean-up) existing previous versions
  5. Install new version
  6. Configure, Tweak, Adjust
  7. Request a new client inventory scan
  8. Exit
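
Strung together, the wrapper skeleton looks something like this (every helper, path, and name is hypothetical, and steps 2 through 4 and 7 are stubbed to keep the sketch short):

  import ctypes
  import subprocess
  import sys

  def system_requirements_ok():
      return True     # disk space, OS version, prerequisite checks go here

  def find_previous_versions():
      return []       # the Uninstall-hive scan from the earlier sketch

  def remove_and_clean_up(product, log):
      pass            # per-version uninstall plus leftover scrubbing

  def request_inventory_scan(log):
      pass            # tool-specific (e.g., an SMS/SCCM client trigger)

  def main():
      log = open(r"C:\Windows\Temp\SomeApp_deploy.log", "a")
      # 1. Check user rights
      if not ctypes.windll.shell32.IsUserAnAdmin():
          log.write("FATAL: not running with admin rights\n")
          return 1
      # 2. Check system requirements
      if not system_requirements_ok():
          log.write("FATAL: system requirements not met\n")
          return 2
      # 3 & 4. Find, remove, and clean up previous versions
      for product in find_previous_versions():
          remove_and_clean_up(product, log)
      # 5. Install the new version silently
      rc = subprocess.call(
          ["msiexec", "/i", r"\\server\packages\SomeApp.msi", "/qn"])
      log.write("install returned %d\n" % rc)
      # 6. Configure, tweak, adjust (post-install file/registry changes)
      # 7. Request a new client inventory scan
      request_inventory_scan(log)
      # 8. Exit with a return code the deployment system can report on
      return 0 if rc == 0 else 3

  if __name__ == "__main__":
      sys.exit(main())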

Each and every step along the way involves a lot of code and a lot of log file output.  The log file output is necessary to support deployment assessment and troubleshooting.

When the product is replacing a long ancestry of previous versions, the code can get quite long and ugly, since it has to check for each and every version, and it most often has to approach the removal and clean-up in a unique way for each one.

The New Way – Application Virtualization

Before I get started, let me say that App-Virt is not a panacea.  It will not work with every single software product on the market.  But it works with the majority of them, with caveats of course.  Each vendor has its own unique way of smoking crack and snorting acid while writing its product code and (even worse) shooting up heroin and peanut butter while developing its installation packaging.  Some examples I like to toss out are Adobe and Oracle.  For whatever reason, those two come to mind as some of the worst writers of installation packaging code, but whatever, I digress.

Because of how (most) application virtualization technology works, most of the headaches involved with the “old way” are simply non-existent.  There are still some issues to deal with, and it varies by product and vendor, but the level of effort is typically much lower and the reliability much more consistent, so I’d call anyone who dismisses the technology without actually having put it to use a complete fucking idiot.  But alas, I digress again. :)

I will summarize application virtualization as follows (mainly because there are already 40 bazillion web sites and articles that explain it better and in more detail):

It places an abstraction layer between the application and the operating system environment.  It doesn’t even matter if the operating system environment is “physical” or “virtual”.  Anywhere the application reaches out to write to, or read from, the registry, a system folder, etc., it is intercepted by the abstraction layer, and as far as the application can tell, everything is fine.  Under the hood, though, in a dark alley filled with smoke and bad guys, the requests are simply redirected to safer places that never touch, or even rub up against, the real host system.  So when the application wants to read or write something in HKEY_LOCAL_MACHINE, or C:\WINDOWS\SYSTEM32, or \PROGRAM FILES\, or whatever, the App-Virt layer redirects it to files within the user’s profile.  The entire application is housed in a logical bubble that remains intact and cohesive yet has no direct impact on the host system.
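
If you want it in code terms, the core trick is just path redirection.  This is a conceptual sketch only (not any vendor’s actual implementation; the sandbox layout is invented):

  import os

  # Per-user sandbox where the "bubble" stores its redirected reads/writes
  SANDBOX = os.path.expandvars(r"%APPDATA%\VirtPackages\SomeApp")

  REDIRECTS = {
      r"C:\Windows\System32": os.path.join(SANDBOX, "SYSTEM32"),
      r"C:\Program Files":    os.path.join(SANDBOX, "PROGRAM_FILES"),
      r"HKEY_LOCAL_MACHINE":  os.path.join(SANDBOX, "REGISTRY", "HKLM"),
  }

  def redirect(path):
      """Return the sandboxed location the virtual layer actually touches."""
      for real, safe in REDIRECTS.items():
          if path.upper().startswith(real.upper()):
              return safe + path[len(real):]
      return path   # everything else passes straight through

  print(redirect(r"C:\Program Files\SomeApp\settings.ini"))
  # -> ...\VirtPackages\SomeApp\PROGRAM_FILES\SomeApp\settings.ini

The application thinks it wrote to Program Files; the host never felt a thing.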

What does this mean?  What does this solve?

  • The new application doesn’t “see” or even care about any previous versions of the application.
  • The application might require Administrator rights but works fine even when launched by a user with limited rights
  • The installation has ZERO impact on the host
  • The removal has ZERO impact on the host
  • There is much less (usually ZERO) effort involved with chasing down conflicts, versions, clean-up, etc.
  • Removals are simple, clean, effortless
  • Upgrades are easier than picking your nose (assuming you have a normal nose, and hands with working fingers)

What does a typical “packaging” scenario look like then?

Well, first off, they don’t always call it “packaging”.  Some vendors assign different names, such as “sequencing” and so on.  But the process works much like the old “capture” process that tools like Wise used to repackage legacy Setup.exe installers.

Put simply, you start a capture process on a clean “reference” computer.  You run your product installation and perform any after-install tweaks, clean-ups, and so on.  Then you “end the capture” and it rolls up everything that changed since the initial capture point.  It then offers you a logical dialog interface in which to make additional adjustments; you save the result and post it for deployment.  The deployment process varies from one vendor to another.
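
The capture itself is conceptually just a before-and-after diff.  A crude sketch of the idea (real tools also diff the registry and track modified files, not just new ones, and they do it far more efficiently than walking the whole drive):

  import os

  def snapshot(root):
      """Record every file path under root (slow, but it shows the idea)."""
      files = set()
      for dirpath, _, names in os.walk(root):
          for n in names:
              files.add(os.path.join(dirpath, n))
      return files

  before = snapshot("C:\\")            # taken on the clean reference machine
  # ... run the vendor installer, make post-install tweaks ...
  after = snapshot("C:\\")
  package_contents = after - before    # everything the install added

As for deployment, that’s where the vendors differ the most.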

For example, Microsoft App-V (formerly SoftGrid) requires pre-deploying a client agent that allows you to “stream” new product packages to clients bit by bit, using RTSP (or RTSPS for secure, over-the-web deployments), or you can wrap the package in an MSI and deploy it using traditional means like scripts, Group Policy, SMS/SCCM/Altiris/Tivoli/whatever.

VMware ThinApp (formerly Thinstall) bundles the client agent inside the package itself, making it completely portable.  So you can deploy it using file shares, web pages (HTTP or HTTPS for web deployments), Group Policy, scripts, etc.

There are others, and each provides additional features (and caveats) to address deployment issues such as bandwidth concerns, interrupted-and-resumed deployments (think laptops), access control (Active Directory, etc.), versioning, and so on.

So in the case of App-V using streaming:

  • Launch a clean reference computer (I prefer within VMware)
  • Launch the App-V sequence capture
  • Launch the vendor software product installation
  • Run the product
  • Adjust settings, modify files, folders, registry, etc.
  • End the sequence capture
  • Adjust the package in the sequencer utility
  • Post the package to your streaming server
  • Clients request the package and stream it to their desktop

Using ThinApp:

  • Launch a clean reference computer
  • Launch the ThinApp capture process
  • Launch the vendor software product installation
  • Run the product
  • Adjust settings, modify files, folders, registry, etc.
  • End the capture
  • Adjust the package in the capture utility
  • Save the package and post it to a file share or web folder

The Deployment

Let’s say you have Windows Server 2008, even on just one domain controller, so you now have Group Policy Preferences in your environment.  Use it.  Configure a GPO to place a shortcut on the user’s desktop or start menu.  The shortcut points to a ThinApp package residing on a shared folder or web folder; the user clicks it, and the package streams down in the background and launches.

But what if you don’t have Windows Server 2008 and don’t have GPP?  No problem: you can place the shortcut, or even download the entire application package, using scripts, Group Policy, or other tools at your disposal like SMS, System Center ConfigMgr, Tivoli, Altiris, etc.
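
For the script route, placing the shortcut is a tiny job.  A sketch using Python with the pywin32 package (any per-user logon script mechanism would do, and the paths are hypothetical):

  import os
  import win32com.client   # pywin32; exposes the WScript.Shell COM object

  desktop = os.path.join(os.path.expanduser("~"), "Desktop")
  shell = win32com.client.Dispatch("WScript.Shell")
  link = shell.CreateShortCut(os.path.join(desktop, "SomeApp.lnk"))
  link.TargetPath = r"\\server\thinapps\SomeApp.exe"  # ThinApp package on a share
  link.Save()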

Updates

So now you need to deploy “service pack 1” to all these clients?

No problem. 

  • Open the package within the package sequencer/editor
  • Install the service pack
  • End the capture
  • Increment the package “version” number
  • Post it back on the deployment share

Clients automatically check for a newer version number each time the package is launched.  If the source location has a newer version, it’s automatically downloaded, cached, and launched.  The user now has SP1.
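
The logic behind that launch-time check is roughly this (purely conceptual; App-V and ThinApp each implement versioning their own way, and the version.txt convention here is invented):

  import os
  import shutil

  SHARE = r"\\server\thinapps\SomeApp"                            # deployment share
  CACHE = os.path.expandvars(r"%LOCALAPPDATA%\PkgCache\SomeApp")  # local copy

  def read_version(folder):
      try:
          with open(os.path.join(folder, "version.txt")) as f:
              return int(f.read().strip())
      except (OSError, ValueError):
          return -1                    # nothing cached (or unreadable) yet

  if read_version(SHARE) > read_version(CACHE):
      if os.path.isdir(CACHE):
          shutil.rmtree(CACHE)         # drop the stale copy
      shutil.copytree(SHARE, CACHE)    # pull down the newer package

  os.startfile(os.path.join(CACHE, "SomeApp.exe"))   # launch the cached copy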

Summary

The biggest difference between traditional scripted/packaged deployments and virtualized deployments is that the latter provides:

  • Shorter preparation and deployment cycle
  • Shorter upgrade and update deployment cycle
  • Shorter removal/clean-up time
  • Reduced or eliminated application conflicts
  • Reduced or eliminated user-rights “holes” for admin-only apps
  • The ability to run otherwise incompatible products on the same desktop (at the same time)

Caveats

Nothing is perfect.  Not even Kim Kardashian.  The only “downside” to most of the App-Virt products is that they’re aimed (and priced) at larger customers.  I’m talking large enough that you actually use Microsoft Select or EA licensing (even if you plan on using VMware products, it’s about the scale of your environment).  You can look at the pricing yourself and see what I’m talking about.  The technology is cool enough for a shop with only 30 desktops or laptops to support, but it may not be worth the price tag until you get to a much larger number.  It’s one of the pet peeves I have with Microsoft and EMC/VMware.
