By Mike O’Connor
I’ve talked with a lot of people lately who are shying away from using disk imaging or cloning programs as their distribution system for end users. If you’re still using this approach, read on.
In the mid-1990s, tools came out to help with creating disk images so an organization could copy the disk of one machine and use it as the basis for another. As time went on, these images grew beyond a pre-built OS with company-specific settings to also include the most common applications employees need.
The problem with that approach is that as patches, updates, and new versions of a piece of software come out, imaging offers no way to get those updates to the end user. You could take a one-at-a-time “sneaker-net” approach, installing apps manually on each end user’s machine, or IT could build a new image with the new version of the app “baked in”. However, that mostly helps new users, not existing ones. Existing users would have to save their documents and settings somewhere else before getting the new image, and then go through a process of restoring them afterward. Plus, if you have multiple departments that use different apps, you have to build and manage a separate image for each group. So what used to make the job easier is now becoming increasingly difficult to manage.
Instead, more and more organizations are taking the approach of using disk imaging software to create a core OS image that everyone gets. Applications are re-packaged via a packaging solution into the Windows Installer (MSI) format (if they aren’t already in that format). Then, the IT group uses a distribution system to push these applications out to the end users that need them. This reduces the need to manage multiple images for different groups and gives you the flexibility to push out only the apps each end user needs. Plus, as new patches, updates, and releases become available, they can be pushed out on demand via the distribution system without creating a new disk image. Re-imaging is then needed only for major service packs or new OS releases.
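To make the push step concrete, here is a minimal sketch (in Python, purely illustrative) of the kind of silent msiexec command line a distribution system runs on each target machine when it delivers an MSI package. The share path, package name, and log path are hypothetical; the msiexec switches (/i to install, /qn for no UI, /norestart, /l*v for a verbose log) are standard Windows Installer options.

```python
def build_msi_install_command(msi_path, log_path=None):
    """Build a silent, unattended msiexec command line for an MSI package."""
    cmd = ["msiexec", "/i", msi_path, "/qn", "/norestart"]
    if log_path:
        # Verbose logging helps the IT group troubleshoot failed pushes.
        cmd += ["/l*v", log_path]
    return cmd

# Hypothetical package on a distribution share:
cmd = build_msi_install_command(r"\\fileserver\packages\AppX-2.1.msi",
                                r"C:\Logs\AppX-install.log")
print(" ".join(cmd))
```

Because the install runs with no UI and no prompts, the distribution system can execute it overnight or at login without any action from the end user, and the same command works for an initial install or an upgrade to a newer version of the package.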
A distribution system can be something simple like Group Policy, or something more robust that offers additional features (many software providers have solutions that can help with this rollout). Some distribution systems even include the ability to build the base OS image and push it out to “bare-metal” machines (PCs with nothing, not even an OS, installed on them). Packaging solutions likewise range from simple tools for creating MSI packages to more advanced products that can help with Windows and application compatibility, as well as testing and creating virtual applications.
It may seem like creating a disk image with both the OS and apps built into it is the easiest and fastest way to go. However, updates and changes to those apps happen more often than anticipated, and the complexity of pushing those updates through imaging quickly becomes apparent. By keeping the base OS and apps separate and using one of the many methods to distribute apps, you’ll save more time and gain more flexibility in the long run—and I know everyone in IT could use more time to get other things done!