Friday, September 18, 2009

It's been a long time since I've posted here. Our deployment has been working, and it's been stable for about a year and a half now. MDT 2010 is out, and I never looked at any of the betas for it because I did not want to disturb my current deployment point. I recently downloaded the 2010 release (RTM?) and had a look, and it is going to take me a little while to digest it.
The first thing I can say about it is that I cannot wait to play with it. It looks to me like the MS folks have added quite a few nice features. I was never bright enough to figure out how to set up one deployment point and have it be a master for some other ones (for lack of a better term). It appears that is easy in 2010 (it was supposedly possible in earlier versions, but not easy). Along the same lines, having separate dev and prod deployment points looks to be tons easier now too.
There are lots of improvements, so if you haven't looked at MDT 2010 you really should take a peek. As I start working with it over the next little while I will get back to updating this blog with the things I figure out.
Wednesday, May 28, 2008
Production Deployment
We started our production deployment here on May 12, and all in all I am pretty happy. There have been a couple of "gotchas", but after about a year of working on this thing I have to say it has gone very smoothly. One of our biggest issues here is bandwidth: going from XP to Vista, our deployment has, in most instances, tripled in size. With XP we were at about 10 GB, and with Vista we are at about 30 GB. The problem is that our network infrastructure is inconsistent at best. There is nothing I can do about that, so it's a limitation I have to live with.
We did the first deployment on an 8-port Gbit switch (that's what we had at the time), which let us build six machines at a time; the other two ports were taken by the net connection for Windows updates and by the NAS we build from. Those six machines took three hours. Next we put twelve machines on a 100 Mb switch; those took four and a half hours. That evening we plugged the NAS into the wall and started the rest of our machines (about 25). Sorry, I don't have exact figures on this, but by the next morning several machines had failed asking for the CD, I think three just failed with some deployment error, and four to six were still building.
Looking at this, I tried to think of different ways to get things done faster. First off, I have to say I am pretty happy with how our NAS handles load. I don't know how well it should load, but I know we were never able to push our old server pulling down Ghost images the way we can push the NAS (if we got more than five or six Ghost sessions going, some would start to fail). We are using a QNAP TS-409 Pro NAS. One thing I considered was copying the application install files down to the local machine and having the task sequence fire them off the local hard drive; currently the installs are fired off the server/NAS. I don't know if that would work in our environment, because until the past year we always got the smallest hard drive we could from the vendor since we did not really need anything bigger. I also thought about other temporary network topologies, but nothing really stood out as a solution. So in the end we just opted for a 24-port Gbit switch and let 'em grind.
The biggest time hog in deploying/installing is Visual Studio 2008. The old version took forever to install; I think this one takes twice as long! This install just drives me insane. I have sat and watched Resource Monitor during this install, and it is just slow. First off, it looks like msiexec.exe needs to be ported over to being a multithreaded app: for almost all of the VS install the processor is pegged right at 50%, which looks like one busy core on a dual-core box. (I've tried to find documentation on whether Windows Installer is single- or multithreaded and could not find anything, so I am assuming it is single threaded.) Office 2007 and CS3 take some time too, but they seem to be a fair amount better about balancing network/CPU/disk/memory usage during their installs. VS just seems to have a one-track mind: it is using either network or CPU or disk, and when one spikes the others fall off.
As for errors, the most common one I've seen is a program giving me an exit code of 13, and I cannot find a way to make the deployment wizard accept that return code as OK. Other machines have failed while doing Windows updates, which to me is not a show stopper; they will get their updates soon enough. One or two errors made us just wipe the drive and start over. I don't remember the exact errors, but they were pretty odd ones about installing Vista, and wiping the drive did the trick.
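If you run into the same exit code 13 situation, one workaround (not something I have found in the wizard itself) is to wrap the installer in a small batch file that remaps the known-harmless code to 0, so the framework sees success. A minimal sketch, with a made-up installer path:

@echo off
rem Hypothetical wrapper -- swap in your real installer command.
\\server\Applications\SomeApp\setup.exe /quiet
rem Remap the app's harmless exit code 13 to 0; pass everything else through.
if %errorlevel%==13 exit /b 0
exit /b %errorlevel%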
Thursday, May 8, 2008
Quick and dirty "spare" deployment point
In our environment we have some "problem" labs that we have to deploy Vista into. For one reason or another the network will not really handle deploying several machines at a time. To remedy this I got us a NAS to work from, which presents some interesting challenges.
The deployment tool isn't really set up to handle a dual-server arrangement very well (if at all). I might be missing something that would make this really easy, but I don't think so. Other blogs I have read talk about DNS round-robin, load balancing, and the like, but none of those scenarios really address the problems I face -- slow connections to the server room, either because of a WAN link or a flaky network. At any rate, I got us a small NAS so that I would have a local resource we could build from in any lab.
What I did was create shares on the NAS with the same names as the ones on our primary server. I use two different shares, one for the deployment and one for the applications. I then copied all the files from each share over to the corresponding one I made on the NAS. After doing this I had to modify a few things to point to the NAS.
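For the copy itself, robocopy does the job nicely. A sketch with made-up share names (yours will differ):

rem Mirror each share from the primary server to the NAS.
robocopy \\server\Distribution \\nas\Distribution /MIR /R:2 /W:5
robocopy \\server\Applications \\nas\Applications /MIR /R:2 /W:5

/MIR keeps the NAS copy in sync on later runs, so refreshing it after a change is the same command.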
One of the first things I had to modify was the Bootstrap.ini file to point to my NAS -- but that was easier said than done. There might be an easier way than what I put here, but this is what I did. I took the LTI ISO and mounted it so I could read/write to it. I then had to open the boot.wim in the Sources folder of the ISO (for the time being I'm going to leave it to your google-foo to figure that out -- if anyone asks I'll write it up later). I edited the Bootstrap.ini in there, saved it all back out, and burned a boot CD.
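For anyone whose google-foo comes up short, the rough shape of it with the WAIK's imagex is below. The paths are examples, and where Bootstrap.ini lives inside your boot image may differ:

rem Mount the boot image read/write, edit Bootstrap.ini, then commit.
imagex /mountrw C:\iso\sources\boot.wim 1 C:\mount
notepad C:\mount\Deploy\Scripts\Bootstrap.ini
rem In Bootstrap.ini, point DeployRoot at the NAS, e.g.:
rem   [Default]
rem   DeployRoot=\\nas\Distribution
imagex /unmount /commit C:\mount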
From there I had to modify several files on the NAS so they point to the NAS as well:
\Control\Applications.xml
and
\Control\*\TS.xml
where * is the ID of any task sequence you might run from the NAS
I did all of this outside the deployment tool because I did not want to make two deployment points, and I didn't want to duplicate the task sequences and have to edit both of them for every change I make (well, I have to edit them anyway, but I would rather do a search and replace in a text editor than use the task sequence GUI).
That's my quick and dirty way to use another server to deploy from. Good luck.
Friday, April 25, 2008
Using Robocopy
In part of my deployment I use robocopy to copy over a batch file for later use by the system. I am now at the point in my deployment where I am working on the "little" things and getting the kinks out of them. After my task sequence ran robocopy, the step would fail because robocopy returned an exit code of 1. After a bit of digging I found that code means everything worked out just fine. So if you are using robocopy in any of your task sequences, you will want to find that step (or steps), go to the Options tab, and edit a few things. I put a check in front of "Continue on error" so that copying over one file is not going to derail my deployment, and I put a 1 in the "Success codes:" box. I have never touched this box before, but I have faith it is going to solve my problem with robocopy. Be aware that you may have to add other robocopy "success" codes to keep your particular deployment from barking at you.
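For reference, robocopy's exit code is a bitmask: 0 means nothing needed copying, 1 means files were copied successfully, and anything under 8 is some flavor of success; 8 and above are real failures. So a step like this one (hypothetical paths) can legitimately come back with a 1:

rem Copy the batch file to the target for later use; exit code 1 just means files were copied.
robocopy \\server\Distribution\Extras C:\Windows\Temp mybatch.cmd

Given that, putting 1 (and possibly 2 or 3, if the destination may hold extra files) in the Success codes box is the right call.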
Friday, March 28, 2008
Booting off USB
I found a little note in the MSDT help files on booting off USB. I hit a couple of little gotchas when doing this, and I'll let you know what they are. One of the gotchas is in their instructions, so I'll give you my own (this wipes your flash drive, just so you know).
1 Open admin cmd prompt.
2 Run "diskpart".
3 Run diskpart command "list disk".
4 Make note of the disk number your flash drive is.
5 Run diskpart command "select disk (drive number)".
6 Run diskpart command "clean".
7 Run diskpart command "create partition primary".
8 Run diskpart command "select partition 1".
9 Run diskpart command "active".
10 Run diskpart command "format fs=fat32".
11 Run diskpart command "assign".
12 Run diskpart command "exit".
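If you do this very often, the same steps can go into a text file and be fed to diskpart in one shot. Disk 1 below is only an example -- triple-check it against your "list disk" output, because this wipes whatever disk you pick:

rem usbprep.txt -- run with: diskpart /s usbprep.txt
select disk 1
clean
create partition primary
select partition 1
active
format fs=fat32
assign
exit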
Then the instructions tell you to use xcopy to copy the contents of your boot CD to the flash drive. Xcopy would not work for me. I just copied the contents of the CD over to the flash drive and it seems to work just fine.
Remember to disconnect your flash drive before your machine needs to reboot or your sequence will fail.
Custom OU-move script failing with 0x8007052E error.
After upgrading and messing around with some new hardware (looking at a NAS for a mobile build server), I started getting errors from the custom OU-moving scripts I have been using for a while. I copied my deployment point over to the NAS I'm looking at and made the changes needed in the bootstrap.ini and applications.xml files. I tried my deployment off the NAS, and when it got to the Z-MoveComputer_StagingOU.wsf script it failed with an error code of 0x8007052E (negative 2-billion-something in decimal -- sorry, didn't write it down; it decodes to the Win32 logon failure, "unknown user name or bad password"). After a fair amount of troubleshooting I found that, off the NAS, it did not like the following line in the code (probably 2/3 of the way down):
Set objContainer= openDS.OpenDSObject("LDAP://" & strDC & "/" & strStagingOU, strAccounttoJoinWith & "@" & strDomain, strAccountPassword, ADS_SECURE_AUTHENTICATION)
In particular I had to change admin@domain into domain\admin for the script to function. If you are seeing this error, see if swapping that helps you. The NAS is not on the domain and our other deployment server is; I don't know why this would need to change on one vs. the other, but I don't make the rules, I just play the game.
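For reference, the working version of that line looked roughly like this (same variable names as above, just with the down-level DOMAIN\user form in place of user@domain):

' Down-level (DOMAIN\user) logon name instead of the UPN form:
Set objContainer = openDS.OpenDSObject("LDAP://" & strDC & "/" & strStagingOU, strDomain & "\" & strAccounttoJoinWith, strAccountPassword, ADS_SECURE_AUTHENTICATION)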
Thursday, February 28, 2008
Custom script for automating app install by machine name.
Phew, what a title! OK, so it took me about a week to get this little script functioning. The reason I came up with it is that I really could not find a way to automate application installs by machine name that fit into our "ecosystem". The old way we did application installs, in XP and before, was with a VBScript. I'm not opposed to the old way, but the way we did it had no error checking. What I really wanted was something that would tell me which application(s) failed to install, if any did. Since this is already handled in MSDT, I really wanted to leverage that.
I don't claim to be a programmer; I'm a hardware guy. I'm sure someone could come up with a more elegant way of putting this script together (and if you do, please let me know), but what I have works for me. In our environment we have labs that need a basic set of software installed, and then different sets depending on which lab it is. We have tried as best we can to standardize our machine names, and that helps a lot. Thus, one of the things this script does is strip the lab name out of the machine name (i.e. BB100-01 is lab BB100, machine number 01).
The script then looks at a file I've called sourcetxt.txt to see what software needs to be installed where. Given my scripting ability everything has to be jumbled together, which makes it hard to read, and if the data is not in the correct format the script dies. The data in this file takes the form of:
LabID=All:AppID=keyaccess
LabID can be either All or the name of the lab (BB100 from above). The : is the field delimiter. The AppID is the "name" of the application to be installed. I say it like that because the name is actually a file with that exact name plus .txt, containing the GUID for the application; it can hold multiple GUIDs. I did this because we key our apps, so I have the actual app install and then the keyed exe to install over it (if I ever get the time to devote to learning KeyServer I could learn to deputize installers, but that's a whole other topic right there).
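To make the format concrete, here is a hypothetical fragment showing how a line like that can be pulled apart (the idea, not the actual script):

' Parse "LabID=All:AppID=keyaccess" into its two halves.
strLine = "LabID=All:AppID=keyaccess"
arrFields = Split(strLine, ":")            ' ":" is the field delimiter
strLabID = Split(arrFields(0), "=")(1)     ' -> "All"
strAppID = Split(arrFields(1), "=")(1)     ' -> "keyaccess", i.e. keyaccess.txt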
Included in the zip file are three files: \Z-CUSTOM_LabApps.wsf, \Applications\sourcetxt.txt, and \Applications\winscp.txt.
I put my script in the Scripts folder and made a subdirectory called Applications in that folder to house the two files the script references.
After the script strips out the lab name, reads which apps the lab requires, and gets the associated GUIDs for those apps, it writes those values back into the VARIABLES.DAT file. This lets the MSDT framework handle the app installs and report back any install errors. If you have applications specified in your CustomSettings.ini file, you will have to modify the script to start numbering where that list leaves off.
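Conceptually the write-back looks like the fragment below. This is a sketch, not the real code -- it assumes the stock ZTIUtility.vbs environment object, and the starting index would need to be bumped past anything CustomSettings.ini already set:

' Queue an app GUID as the next indexed Applications list variable;
' the MSDT framework persists oEnvironment values into VARIABLES.DAT.
iItem = 1   ' start past any Applications00x entries from CustomSettings.ini
oEnvironment.Item("Applications" & Right("00" & CStr(iItem), 3)) = strGUID
iItem = iItem + 1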
I created my own custom task sequence step to run this script; it runs during the Preinstall\New Computer Only section, after the built-in "Copy scripts" task.
Hope it helps!
Download the LabApps.zip here.