Wednesday, May 28, 2008

Production Deployment

We started our production deployment here on May 12, and all in all I am pretty happy. There have been a couple of "gotchas", but after about a year of working on this thing I have to say it has gone very smoothly. One of our biggest issues here is bandwidth. Going from XP to Vista our deployment has, in most instances, tripled in size: with XP we were at about 10 GB, and with Vista we are at about 30 GB. The problem is that our network infrastructure is inconsistent at best. There is nothing I can do about that, so it is a limitation I have to deal with.

The first deployment we did was on an 8-port Gbit switch (that's what we had at the time). That allowed us to do six machines at a time; the other two ports were used by the net connection for Windows Update and by the NAS we were building from. Those six machines took three hours. Next we put twelve machines on a 100 Mb switch; those took four and a half hours. That evening we plugged the NAS into the wall and started the rest of our machines (about 25). Sorry I don't have exact figures on this, but by the next morning several machines had failed asking for the CD, I think 3 just failed with some deployment error, and 4-6 were still building.

So after looking at this I tried to think of different things to get the builds done faster. First off, I have to say that I am pretty happy with the load handling on our NAS. I don't know how well it should load, but I know that we were never able to load our old server pulling down Ghost images the way we are able to load the NAS (if we got more than 5 or 6 Ghost sessions going, some would start to fail). We are using a QNAP TS-409 Pro NAS. One of the things I thought about doing is copying the application install files down to the local machine and having the task sequence run them from the local HDD (a sketch of what that step might look like is below); currently the installs are fired off the server/NAS. I don't know if that would work in our environment, because until the past year we always got the smallest HDD we could from the vendor since we did not really need any bigger drives. I also thought about other temporary network topologies, but nothing really stands out as a solution. So in the end we just opted for a 24-port Gbit switch and let 'em grind.
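If I ever try the local-staging idea, it would probably look something like this as a "Run Command Line" step in the task sequence. This is just a sketch: the share, folder, and setup switches are all made-up placeholders, not our real ones.

    @echo off
    REM Hypothetical staging wrapper: copy the install files to the local drive,
    REM run the install from there, then clean up and pass the exit code back.
    robocopy "\\QNAP-NAS\Applications$\VS2008" "%SystemDrive%\StageApps\VS2008" /E /R:2 /W:5
    REM robocopy exit codes 1-7 mean success with copies; 8 and up mean failure.
    if %errorlevel% geq 8 exit /b %errorlevel%
    "%SystemDrive%\StageApps\VS2008\setup.exe" /q /norestart
    set SETUPRC=%errorlevel%
    rd /s /q "%SystemDrive%\StageApps\VS2008"
    exit /b %SETUPRC%

The appeal of a wrapper like this is that the slow network read happens once, sequentially, instead of the installer seeking back and forth across the wire for the whole install.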

The biggest time hog for deployment/installing is Visual Studio 2008. The old version took forever to install; I think this one takes twice as long! This install just drives me insane. I have sat and watched the Resource Monitor during this install and it is just slow. First off, it looks like msiexec.exe needs to be ported over to being a multithreaded app. For almost all of the VS install the processor is pegged right at 50%, and on a dual-core box a pegged 50% is one core fully busy, which is exactly what a single-threaded process looks like. (I've tried to find documentation on whether Windows Installer is single threaded or multithreaded and I could not find anything, so I am assuming it is single threaded.) Office 2007 and CS3 take some time too, but they seem to be a fair amount better about balancing network/CPU/HDD/memory usage during their installs. VS just seems to have a one-track mind: it is either using network or CPU or HDD, but when one spikes all the others fall off.

As for errors, the most common one I've seen is a program giving me an exit code of 13, and I cannot find a way for the deployment wizard to accept that return code as OK (a wrapper script can paper over it, though -- see below). Other machines have failed while doing Windows Updates, which to me is not a show stopper; they will get their updates soon enough. One or two errors caused us to just wipe the drive and start over. I don't remember the exact errors, but they were pretty odd ones about installing Vista, and wiping the HDD did the trick.
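One workaround would be wrapping the installer in a little batch file that remaps the return code. This is only a sketch -- install.exe stands in for whichever program is returning 13, and the /quiet switch is a placeholder:

    @echo off
    REM Hypothetical wrapper: run the real installer and remap exit code 13
    REM (which this particular program returns on success) to 0.
    "%~dp0install.exe" /quiet
    if %errorlevel% equ 13 exit /b 0
    exit /b %errorlevel%

Point the application entry at the wrapper instead of the installer, and the deployment sees a clean 0.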

Thursday, May 8, 2008

Quick and dirty "spare" deployment point

In our environment we have some "problem" labs that we have to deploy Vista into. For one reason or another the network will not really handle deploying several machines at a time. Trying to remedy this situation, I got us a NAS to work off of. This presents some interesting challenges.

The deployment tool isn't really set up to handle a dual-server setup very well (if at all). I might be missing something that would make this really easy, but I don't think so. Other blogs I have read talk about DNS round-robin and load balancing and the like, but none of those scenarios really address the problems that I face -- slow connections to the server room, either because of a WAN link or a flaky network. At any rate, I got us a small NAS so that I would have a local resource we could build from in any lab.

What I did was create shares on the NAS named the same as the ones on our primary server. I use two different shares, one for the deployment and one for the applications. I then copied all the files over from each share to the corresponding one I made on the NAS (a scripted version of that copy is below). After doing this I had to modify a few things to point to the NAS.
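If you would rather script the copy (handy for refreshing the NAS later), something like this would do it. The server and share names here are made up -- substitute your own:

    @echo off
    REM Hypothetical sync from the primary server to the NAS.
    REM \\SERVER, \\QNAP-NAS, and both share names are placeholders.
    robocopy "\\SERVER\Distribution$" "\\QNAP-NAS\Distribution$" /MIR /R:2 /W:5
    robocopy "\\SERVER\Applications$" "\\QNAP-NAS\Applications$" /MIR /R:2 /W:5

Be careful with /MIR if you put anything NAS-only in those shares; it deletes files on the target that aren't on the source.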

One of the first things that I had to modify was the Bootstrap.ini file to point to my NAS -- but that was easier said than done. There might be an easier way to do it than what I put here, but this is what I did. I took the LTI ISO and opened it so I could read/write to it. I then had to open the boot.wim in the Sources folder inside the ISO (for the time being I'm going to leave the details to your Google-fu, though a rough sketch follows -- if anyone asks I'll write it up properly later). I edited the Bootstrap.ini in there, saved all of it back out, and burned a boot CD.
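Here is the rough sketch, using imagex from the WAIK. All paths are examples, and the location of Bootstrap.ini inside the boot image may differ in your build, so poke around after mounting:

    REM Hypothetical sketch: mount boot.wim read/write, edit Bootstrap.ini, commit.
    REM C:\ISO holds the ISO contents and C:\Mount is an empty folder.
    imagex /mountrw C:\ISO\sources\boot.wim 1 C:\Mount
    notepad C:\Mount\Deploy\Scripts\Bootstrap.ini
    imagex /unmount /commit C:\Mount

The edit itself is just repointing DeployRoot at the NAS, something along these lines (the share name is a placeholder):

    [Settings]
    Priority=Default

    [Default]
    DeployRoot=\\QNAP-NAS\Distribution$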

From there I had to modify several files on the NAS to point to the NAS (an example of the kind of change is after the list):

\Control\Applications.xml
and
\Control\*\TS.xml

where * is the ID of any task sequence you might use from the NAS
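The changes are just the server names inside UNC paths. The entries look something like this -- the element shown and the share/folder names are illustrative, so check what your own files actually contain:

    <!-- Hypothetical excerpt from Applications.xml -->
    <WorkingDirectory>\\SERVER\Applications$\Office2007</WorkingDirectory>
    <!-- ...becomes... -->
    <WorkingDirectory>\\QNAP-NAS\Applications$\Office2007</WorkingDirectory>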

I did all of this outside of the deployment tool because I did not want to make two deployment points, and I didn't want to duplicate the task sequences and have to edit both copies for every change I make. (Well, I have to edit them anyway, but I would rather do a search and replace in a text editor than click through the task sequence GUI.)

That's my quick and dirty way to use another server to deploy from. Good luck.