It's so horrible that this is still necessary, but in order to keep time in sync across your network and computers, you'll have to drop into the registry.
Rather than re-type someone else's excellent work, I'll just link you to the article on WindowsNetworking.com
My journey through Windows Server 2008 R2 Foundation
Setting up and using Windows Server 2008 R2 Foundation in a small architecture firm with two locations
Thursday, November 10, 2011
Monday, December 20, 2010
DFS notes
There are a few notes I want to make about managing DFS - since it's really the main reason our offices are using servers, you can imagine I like all the tools I can get to keep an eye on it. Fortunately, these have greatly increased since Server 2003!
First, the Replication Diagnostic Health Reports are incredibly useful - dare I say more useful (and a lot easier) than wading through the event logs for the File services (which are, as I'm doing the initial replications, creating upwards of 20,000 events every 24 hours...even with filters, this is a daunting volume to try and make sure you haven't missed anything).
To create one, expand Roles, File Services, DFS Management, and Replication in turn in the Server Manager window. You should see a list of all your replication groups, so right-click whichever one you want to report on and choose "Create Diagnostic Report". The wizard will ask you for the type of report you want to create; in this case we want a "Health Report". On the next screen, choose where to save the report and whether to rename it. On the screen after that, make sure the servers you want to check are included and click Next. Then choose whether to include backlogged files and count file sizes for each member (I do both and it's pretty quick, even on shares of 100+ GB). Once the report is generated, it will open in Internet Explorer automatically. If the bottom of the report says "Report loading, please wait", you need to disable the enhanced security for Administrators in Internet Explorer (Server Manager for your local server -> Configure IE ESC), then reload the report.
There's a pretty good amount of information in the reports, but they surface the most important stuff right in the section headings - primarily in the ERRORS and WARNINGS sections. Items listed here don't necessarily correspond directly to warnings or errors in the event log - they're things you really do care about. If a server is still receiving the initial replication, that will show up as a WARNING on that server. Mostly, this is the information I was looking for, so that was nice and easy! Drill into the rest of the report for lots more detail - I was interested to see that RDC allowed me to save, on average, between 85% and 95% of the bandwidth for replication...a HUGE improvement over my poor old Server 2003 days!
There was one other event I found on one of our reports that I need to mention here though: an error that "One or more replicated folders have content skipped by DFS Replication".
Well, I didn't like that message at all, so I was happy to find out the reports give great details. In my case, it listed all the "Files that cannot be replicated under the replicated folder", under the heading that tells me they're not replicating because they're either temporary or symbolic links.
Running attrib.exe on the files listed didn't show me anything, so I found this excellent article that told me what tool to use in order to see if the files in question were, in fact, flagged as temporary.
They were.
I have no idea why AutoCAD marked them as temporary, but nobody who has ever had to manage AutoCAD installs will be surprised that it's AutoCAD's fault.
It's a bit harder to clear the temporary flag on the files, but the article nicely provided a PowerShell command that will remove the temporary flag on all files in a directory, in this case d:\data:
# 0x100 is FILE_ATTRIBUTE_TEMPORARY; -band 0xFEFF clears just that bit
Get-ChildItem D:\Data -Recurse | ForEach-Object -Process {if (($_.Attributes -band 0x100) -eq 0x100) {$_.Attributes = ($_.Attributes -band 0xFEFF)}}
Not bad!
Scheduling Replication Health Reports - Now that all my errors should be clear, I thought it would be beneficial, especially in the first few weeks of using the new servers, to get a fresh report every morning to check for replication issues. But even though we've only got 8 replicated folders under our DFS root, it's still a bit of a pain to create the report for each replication group manually. Fortunately, I found a TechNet article to which someone had appended a useful link and a bit of code - it's the actual command you can use to create a report; in this case, from a batch file.
So I created a batch file with eight lines, each resembling the following:
dfsradmin health new /RgName:domain.tld\root\data /refmemname:domain\server /repname:c:\dfsreports\data.html /fscount:true
where "root" is the shared folder used by the DFS root and "data" is the replicated directory (I just named the report the same as the replicated directory). Set that up as a scheduled task to run daily, and I'm all set!
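For the "scheduled task" part, the Task Scheduler command line works nicely too. Here's a sketch - the task name, batch file path, and start time are just placeholders, so adjust them to your own setup:

```
schtasks /create /tn "DFS Health Reports" /tr "c:\scripts\dfsreports.bat" /sc daily /st 06:00 /ru SYSTEM
```

Running it under the SYSTEM account means there's no stored password to expire.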
Wednesday, December 15, 2010
Policies, Users and Computers
Finally, the last step before we can start getting clients on the domain!
In this step, we'll set up user accounts and make some policies for the computers belonging to our domain. Fortunately, there are only a few non-intuitive parts of this process.
One of the big benefits of being on a domain is that the administrator can set all kinds of policies to control how the computers work - if it's usually configurable by a user, there's probably a policy that can override it. It sounds like policies are about removing control from the user, but in reality, they're mostly used to customize the computers so that the administrator doesn't have to go to EVERY computer and set hundreds of little options. For instance, one of my major policies specifies that the "Offline Files" feature should be turned on, with appropriate folders automatically made always available offline, without any user interaction at all.
There are two kinds of policies: computer policies (which apply to any user logging on to that particular computer) and user policies (which apply to a user, no matter which computer they log on to). As with everything else in Windows, it uses the "folder" model - policies are applied to a folder, and anything in it (including sub-folders and their contents) gets the policy applied. However, if two policies conflict, the "lowest" policy will overrule the "higher" one. So if I apply a policy turning on Offline Files to the folder containing all the domain computers, but then make a sub-folder and apply a policy that turns Offline Files off, any computers in the sub-folder will have Offline Files turned off.
OK, let's get started.
Open up your Server Manager window, and under "Roles", expand "Active Directory Domain Services", "Active Directory Users and Computers", and finally expand the item named with your domain name.
Among the folders (here, they're actually OUs - Organizational Units) are one named "Computers" and another named "Users". These are the default places where new computers and new users will go. Unfortunately, for reasons I can't begin to imagine, you can't apply policies to these folders. So I actually go in and create new OUs called "EmployeeComputers" and "EmployeeUsers".
It will work if you leave it like this, but you'll have to go in manually every time you add a user or computer to the domain and drag them into the correct folder - which is a pain. It's much better to actually make those the default containers.
Now, go up to the top of the window and in the "View" menu, check "Advanced Features". Then right-click on each of those OUs you created and choose "Properties". Go over to the "Attribute Editor" tab and scroll down to "distinguishedName". It'll look something like
OU=EmployeeComputers,DC=domain,DC=com. Write that down or copy-n-paste it into notepad or something. Do the same thing for the EmployeeUsers too. When you're done, go back to "view" and un-check "Advanced Features".
Now, if I haven't lost you yet, here's the other part that you'd never be able to logic through - to set them as the default containers for those object types, you need to drop out to the commandline.
Click start, type
cmd
and hit Enter. You'll get a black console window with a C:\ prompt. Yes, seriously.
Go to system32:
cd c:\windows\system32
(I know it's a 64-bit OS; all the critical tools are still in System32.) The commands you need are "redirusr" and "redircmp", followed by a space and the appropriate distinguishedName of the OU. For instance:
redirusr OU=EmployeeUsers,DC=domain,DC=com
redircmp OU=EmployeeComputers,DC=domain,DC=com
Each command should say it completed successfully, then you can close the console.
Now, onto actually making policies for the domain.
Click Start, and under "Administrative Tools", open "Group Policy Management".
This window will look fairly familiar, because it shows (most of) the same OUs as "Users and Computers". There are a few more objects scattered among the OUs, however - they look like little script document icons. These are policies. By default, there's a "Default Domain Policy" applied just under your domain, and expanding the "Domain Controllers" OU will show you the other default policy, "Default Domain Controllers Policy". Click these and go to the "Settings" tab to see what all they do. (It's a lot.)
Now, it is possible to edit these policies and have one massive policy (well, two, since the Domain Controllers do need to be more locked down than any other computers on the network) with ALL the settings you want in it, but I find it much easier to make lots of policies that pretty much do one thing each, and apply them as appropriate.
To make a policy, right-click the "Group Policy Objects" icon and choose "New". Name it according to what you want it to do, then poke around the settings to make it do what you want. Once you close it, you'll see it's now a little icon under "Group Policy Objects" - drag it from here up to whatever OU you want it to apply to.
For instance, I've got a policy that changes the password requirements. (Those settings are in Computer Configuration\Policies\Windows Settings\Security Settings\Account Policies\Password Policy). That policy is applied at the domain level, where it takes precedence over the password settings in the Default Domain Policy.
Turning on Offline files has to happen at the computer level, so I create a policy called "Offline Files On", set Computer Configuration\Policies\Administrative Templates\Network\Offline Files as I want, then put that on my "EmployeeComputers" OU. Setting specific offline files happens at the user level, so I make another policy, set User Configuration\Policies\Administrative Templates\Network\Offline Files\Administratively Assigned Offline Files as I wish, and link that on my "EmployeeUsers" OU.
Other policies I set include Disabling EFS so no data can be lost if I have to reset someone's password, adding Domain Users to the Administrators group of the client machines, enabling Remote Desktop and allowing it through the firewall, etc. You can even use policies to set the wallpaper, screensaver, homepage, etc...like I said, if it's Microsoft software and configurable, there's probably a policy for it.
Once your policies are all set, go back to Active Directory Users and Computers, right-click your "EmployeeUsers" OU and start adding "New" "User"s. You'll be able to specify their first password, whether they have to change it the first time they log on, their name, etc. On the "Profile" tab of their properties, pay special attention to the "Home Folder" option - here's where you can automatically map Z:\ (for instance) to your DFS share at \\domain.tld\root
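If you want to double-check that mapping from a client before relying on the Home Folder option, the same thing can be done by hand with net use (the drive letter and path here are just the example from above):

```
net use Z: \\domain.tld\root /persistent:yes
```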
Once you've got one user set up like you want, you can also right-click their name and choose "Copy" to make an identical user - it will only prompt you for a different name, password and username.
OK, I think we're ready - let's start getting computers on the domain!
Configuring Backups
My approach to backups is very multi-tiered. In our office, a lot of the concerns about theft or damage to the physical office resulting in a loss of data are mitigated by DFS and the fact that our data is nearly instantly replicated between our offices (located in different cities); the likelihood of anything physically happening to both offices at the same time is very remote.
A far more common danger is user mistakes - files are deleted that shouldn't have been, or changes are overwritten and need to be rolled back. The biggest problem here is the fact that you never know what point in time you'll need to roll back to. Fortunately, Windows Server includes a number of technologies we can take advantage of to be able to handle almost any need that's arisen.
Enter "Shadow Copies". If you use Windows Vista or Windows 7, you may be slightly familiar with this technology; it's called "Previous Versions" in the file properties. The best part, other than the simplicity, is that it's available to any user on the domain - they don't really NEED to ask me to restore this file or that.
Shadow Copies work at the partition level - you keep shadow copies of an entire drive letter or not at all. It works by taking a "snapshot" of the drive when you first set it up; then, twice a day (by default), it looks for any files that have been updated since the last snapshot and grabs a copy of the current version. When someone goes into the "Restore Previous Versions" dialog on the client machine, they're presented with a list of all available versions. If they're looking at a file, they'll see just the timestamps at which shadow copies saw an update to that file. If they're looking at a whole folder, they'll see every shadow copy time available. In either case, they can restore the file or directory in place, open it up to make sure it's the right version, or restore it to an alternate location. It's incredibly useful, and takes a surprisingly small amount of disk space to keep several months' worth of twice-a-day snapshots.
To set up Shadow Copies, open up "Computer" and right-click the drive on which your data resides, choosing "Configure Shadow Copies". If you just select "Enable" on the resulting dialog, it will use the defaults: storing shadow copies on the same volume as the data itself, with some default size limits. I much prefer highlighting the drive, then clicking "Settings" to choose a drive on which to keep the backups and configure maximum sizes (when you hit the limit, it starts deleting shadow copies, oldest first...so you've got kind of a moving window of times you can roll files back to). For several reasons, it's best to locate the shadow copies on a different drive than the data itself: performance is far better on separate (physical) drives, and that way if the data drive stops working, the shadow copies don't die along with the data.
Once the options are set, click OK to return to the Shadow Copies dialog, then "Enable" the shadow copies. It will take an initial snapshot right away, then proceed as scheduled from there on out.
(By the way, if you ever want to stop making new shadow copies but not erase the shadow copies you've already made, don't click the "Disable" button [which deletes all existing shadow copies] - instead just delete the scheduled task that captures the copies - the "Next Run Time" will change to "Disabled", but the shadow copies will still remain available for opening or restoring).
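If you'd rather keep an eye on (or trigger) snapshots from the command line, vssadmin covers the basics. A quick sketch, with D: standing in for your data drive:

```
rem See which snapshots exist and how much space they're using
vssadmin list shadows /for=D:
vssadmin list shadowstorage /for=D:

rem Take an extra snapshot on demand (available on the Server editions)
vssadmin create shadow /for=D:
```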
Easy, isn't it? And it's a GREAT tool!
Serving files: DFS
DFS (Distributed File System) is an incredibly useful set of services for setups like mine. If you only have one server in one office you might not need it, but for anything more than that, I really do recommend you look into it and utilize it.
At its heart, DFS is kind of just a list of shared folders available on your network. But the list is actually presented as a single shared folder with sub-folders. Those sub-folders are the shares you want to list. Which means that instead of mapping a different drive letter for each share, you can map just one drive letter for the "list", and access the "real" shared folders like sub-folders.
Say you have three file shares your users need:
Instead of your users having to remember the paths (which would change if you ever had to replace a server) or having to map three different drive letters to the three shares, you could use DFS to publish the folders under the shared folder (which is called a "DFS Root") \\domain.com\shares. They could then map a single drive letter (let's say Z:) to that share, and the three shares just become sub-folders (called "DFS Targets").
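To make that concrete - with made-up server and share names - the namespace maps the real shares to virtual sub-folders something like this:

```
\\server01\accounting  ->  \\domain.com\shares\accounting  (Z:\accounting)
\\server01\projects    ->  \\domain.com\shares\projects    (Z:\projects)
\\server02\drawings    ->  \\domain.com\shares\drawings    (Z:\drawings)
```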
Easy, right? Well, there are a lot of implications to the technology.
For one, since the shares are just published by DFS, the actual shared folders don't all have to be on the same hard drive...or even the same computer! Also, and most powerfully, each target can actually map to MULTIPLE identical shares on different computers. Say you had \\docserver01 and \\docserver02 and they each had an "accounting" share on them; z:\accounting can actually point to BOTH of these shares. The original idea was that if you had to reboot docserver01 for some reason, people could keep working on the share since docserver02 would still be available...they wouldn't even know one of the target computers was offline.
Of course, in that case, you have to make sure that each of those shares remain identical at all times - which is why DFS also includes replication.
DFS is really ideal for offices like ours with multiple locations, because I can have a full copy of all our data on each server (that is, in each office). No matter which office a user is in, they can connect to the share (since the root of the share is the domain, rather than a specific computer), and thanks to your Active Directory sites, DFS will know which share is in the same location and point the user to that server, rather than forcing them out over the VPN/WAN link to the "other" office. Also, thanks to DFS replication, as soon as a user saves a file in one office, it's replicated to the other office, so everyone sees the same data, all the time. In a lot of ways, it also has the benefit of being an off-site backup for each office. Even if one office burns down, for instance, all the data has been replicated to the other office (which is presumably not burning down at the same time), so no data is lost. I think the only thing it doesn't protect against is someone actually gaining access to your network and trashing your files (which would of course get replicated to both machines)...but that's what real backups are for, right?
OK, now that you know WHY we're using DFS, I'll get into the how.
Open up your "Server Manager" window. Right-click "roles" and choose "Add new role". DFS is part of "File Services", so select that and choose "Next". For which sub-roles you'd like to install, choose "File Server", "Distributed File System" (which will auto-select both of the entries within it), and Windows server 2003 file services and indexing services.
Now it will ask you if you want to create a DFS share now, which you can do...but I had a lot of prep work to do first, so I chose "Create Later", then click "Install".
It's possible, if this is your first venture into servers, that you don't have a huge data pool you need to bring forward. If that's the case, you can just create your DFS share right now and set up all the structure as you go, knowing that as you (or your users) do add data it will get replicated everywhere you tell it to.
But it's more likely that you've already got a bunch of data you need to make available. Depending on exactly what form it's in currently, you may have a lot of work as I did.
Your main task is to get all your data where you want it to be. It may be on other servers in your office (I'm pretty sure the DFS targets have to be hosted by actual Windows Server OSes, not normal client Windows machines), or maybe you want to move it all on to your new server.
My biggest challenge was that, as our old Server 2003 servers died, they stopped replicating (our files just ended up getting to big for the old technology to handle), so I had to manually create a single pool of data that represented all the latest files, drawing from two existing servers (in a different domain, of course), each of which might have newer files than the other scattered throughout all the shares.
I ended up using Robocopy quite a bit - at the commandline, entering lines like
then reversing the orders of \\oldsvr01 and \\oldsvr02 gave me a list of all the files that were newer on the first server than the second server. I figured out which list was the longest, then used a command like
to get those files into place on the new server. Unless there were exceptional cases calling for a manual copy-n-paste of one file or another, I then used a command like
But hopefully you don't have to do any of that.
Once you've finally got all the latest files ready to go, there's just one more thing I want to mention. In my case, we've got two servers for two offices, but when I was setting them up, I had the luxury of having them both together in one office. So once I had all the files on one server on the drives I wanted them on, I just copied them right over to the second server myself, rather than asking DFS to replicate all the files (we're well over 100GB right now) to a blank share. DFS after 2003-era Windows Servers allow you to "seed" the data in this way - when it starts replication it sees that the data is identical and doesn't re-send it over the network. Which is very nice. Otherwise I'd be waiting for all of that 100GB to get sent to the other office over the fairly slow internet connection.
So, your data is all in place. Only one thing left to do: make sure the appropriate folders are shared and all permissions are correct.
In my case, I've got all of our data on one drive in each server. We have 8 shares, and they're all right in the root of the drive, with nothing else on the drive. So I actually go to the security settings for that drive and erase all permissions, then I go back in and set "Domain Users" and "System" to have "Full Control", and replace all permissions on all folders and files within the drive. (FYI, you have to give "System" full control if you want shadow copies, which we'll get to in the next article). You may want more fine-grained control over permissions, which is fine. Your DFS shares do also have the option of turning on "Access-Based Enumeration", which just means that it will actually hide any shares or folders that a user doesn't have permission to access.
Now I went through each of our eight folders I wanted to publish as shares. I turned on sharing and made sure that the share permissions were set to allow "Domain Users" to have "Full Control". One more thing I like doing: I add a dollar sign ($) to the end of the share name...so the share name might look like "docs$". That's just a little code that tells Windows not to display the share if someone browses to the server over the network. It's still available: if they type in the path in the address bar it'll open, they just can't double-click to open it from a list.
One more share you need: an empty folder which will serve as the "root" of your DFS share. This can be anywhere; you won't be putting any files into it. The only oddity is that if it's mapped to a drive letter on the client machine, the "free space" displayed for the drive is whatever free space there is on the drive where the root is hosted...which probably doesn't have anything to do with the space actually available for the data in the DFS shares. The share name here should be the same as the "root" you give the DFS share (see below). Don't put a dollar sign at the end of this share, since it's the one you want to actually be visible, accessible, and used.
Got all your shares set up? Great. Time to get started!
Open up your "Server Manager" window. Expand "Roles", "File Services" and "DFS Management". Right-click "Namespaces" and choose, "New Namespace".
(FYI, and this single fact is half the reason I wanted to publish this blog, the information available for Foundation Edition with regards to DFS is confusing and contradictory. Despite what anything you read implies, I'm here to tell you, and I've tried it, YOU CAN MAKE AS MANY DOMAIN-BASED NAMESPACES AS YOU WANT WITH FOUNDATION EDITION. You are limited to a single stand-alone namespace (which looks like \\server\root rather than \\domain.tld\root and is not available if that single server is not turned on), but you can have multiple domain-based namespaces). You would not believe what it took for me to get that answer.
Start by providing one of your domain controller names as the "namespace server". It doesn't matter which one; we'll add the other one to the identical configuration in a moment.
Now give the name of the root you want. This will look like the share name, if you're used to normal file sharing. So if it's mapped to Z: on users' machines, your domain is "domain.com" and you call it "Files", the "name" of the drive will look like:
If you don't have a share on the machine that's the same name as the root name you give it, it will prompt you to create a new share. You probably want to click "Edit Settings" and make sure it has the permissions you want it to have.
The next window will prompt you if you want to create a Domain-based namespace or a Stand-alone namespace. There's almost no reason you wouldn't want a Domain-based namespace in this case, and I like the features it gives if you also "Enable Windows Server 2008 mode".
The next steps will just confirm your settings and create the root.
Now we want to add your other server as a root server too, so that either can go down without any impact on your file availability. Under "Namespaces" in the "Server Manager" window, you'll now see your DFS root. Right-click it and choose, "Add Namespace Server". Give it the name of your other DC, and make sure that (if you haven't already made the root share on the second server) the shared folder it is going to create has all the right settings.
Now you're ready to start adding the targets (which will look like subfolders of your DFS share). Under "Namespaces" in the "Server Manager" window, you'll now see your DFS root. Right-click it and choose, "New Folder".
It will ask you for the name of the folder. Call it whatever name you want your users to see as a sub-folder of the DFS root...this one doesn't have to be the same as the share name. Then, choose to "Add" a folder target, and browse to the appropriate one of the shares you created earlier with your data - the ones that end in dollar signs. If you have multiple shares, on whatever servers, that should be identical, add those too.
If you add multiple servers, it will then ask you if you want to enable replication on the shares (you do). Set that up however you'd like, but do pay special attention to the fact that you can tell the replication how much bandwidth to use for replication at any time of any day of the week. That might be interesting to you if your servers share bandwidth, as mine do, with a VoIP system (for instance)...throttling replication back to a smaller amount of bandwidth will keep it from breaking up your voice traffic.
One more thing I like to do with multiple sites: Right-click the DFS root and go to properties. On the "Referrals" tab, I change the "ordering method" to "exclude targets outside of the client's site". That ensures that, no matter what, the clients will not be directed to open any files across the slow VPN/WAN connection. This should be unnecessary, due to the site costing and transports you set up earlier, but at least in Server 2003, I had some issues with clients getting the wrong referral, resulting in very poor performance.
WOW, that was a long one! But DFS is awesome - you'll be glad you've got it.
At its heart, DFS is really just a list of shared folders available on your network. But the list is presented as a single shared folder with sub-folders, and those sub-folders are the shares you want to list. Which means that instead of mapping a different drive letter for each share, you can map just one drive letter for the "list", and access the "real" shared folders like sub-folders.
Say you have three file shares your users need:
\\docserver\accounting
\\docserver\files
\\docserver\company
Instead of your users having to remember the paths (which would change if you ever had to replace a server) or having to map three different drive letters to the three shares, you could use DFS to publish the folders under a single shared folder (called a "DFS Root"), say \\domain.com\shares. They could then map a single drive letter (let's say Z:) to that share, and the three shares just become sub-folders (called "DFS Targets"):
z:\accounting
z:\files
z:\company
Easy, right? Well, there are a lot of implications to the technology.
For one, since the shares are just published by DFS, the actual shared folders don't all have to be on the same hard drive...or even the same computer! Also, and most powerfully, each target can actually map to MULTIPLE identical shares on different computers. Say you had \\docserver01 and \\docserver02 and they each had an "accounting" share on them; z:\accounting can actually point to BOTH of these shares. The original idea was that if you had to reboot docserver01 for some reason, people can keep working on the share since docserver02 would still be available...they wouldn't even know one of the target computers was offline.
Of course, in that case, you have to make sure that each of those shares remain identical at all times - which is why DFS also includes replication.
DFS is really ideal for offices like ours with multiple locations because I can have a full copy of all our data on each server (that is, in each office). No matter which office a user is in, they can connect to the share (since the root of the share is the domain, rather than a specific computer), and thanks to your Active Directory sites, DFS will know which share is in the same location and point the user to that server, rather than forcing them out over the VPN/WAN link to the "other" office. Also, thanks to DFS replication, as soon as a user saves a file in one office, it's replicated to the other office, so everyone sees the same data all the time. In a lot of ways, it also has the benefit of being an off-site backup for each office. If one or the other office burns down, for instance, all the data has been replicated to the other office (which is presumably not burning down at the same time), so no data is lost. I think the only thing it doesn't protect against is someone actually gaining access to your network and trashing your files (which would of course get replicated to both machines)...but that's what real backups are for, right?
OK, now that you know WHY we're using DFS, I'll get into the how.
Open up your "Server Manager" window. Right-click "roles" and choose "Add new role". DFS is part of "File Services", so select that and choose "Next". For which sub-roles you'd like to install, choose "File Server", "Distributed File System" (which will auto-select both of the entries within it), and Windows server 2003 file services and indexing services.
Now it will ask you if you want to create a DFS share now, which you can do...but I had a lot of prep work to do first, so I chose "Create Later", then clicked "Install".
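As an aside, if you'd rather script the role installation, the ServerManager PowerShell module that ships with 2008 R2 can do the same thing. Treat this as a sketch: the feature name FS-DFS below is from memory, so run Get-WindowsFeature first and confirm what the features are actually called on your box.

```shell
rem From an elevated prompt; the ServerManager module ships with 2008 R2.
rem Feature names are from memory - run Get-WindowsFeature to confirm them.
powershell -Command "Import-Module ServerManager; Add-WindowsFeature FS-DFS -IncludeAllSubFeature"
```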
It's possible, if this is your first venture into servers, that you don't have a huge data pool you need to bring forward. If that's the case, you can just create your DFS share right now and set up all the structure as you go, knowing that as you (or your users) do add data it will get replicated everywhere you tell it to.
But it's more likely that you've already got a bunch of data you need to make available. Depending on exactly what form it's in currently, you may have a lot of work as I did.
Your main task is to get all your data where you want it to be. It may be on other servers in your office (I'm pretty sure the DFS targets have to be hosted by actual Windows Server OSes, not normal client Windows machines), or maybe you want to move it all on to your new server.
My biggest challenge was that, as our old Server 2003 servers died, they stopped replicating (our files just ended up getting too big for the old technology to handle), so I had to manually create a single pool of data representing all the latest files, drawing from two existing servers (in a different domain, of course), each of which might have newer files than the other scattered throughout the shares.
I ended up using Robocopy quite a bit - at the commandline, entering lines like
robocopy \\oldsvr01\share1 \\oldsvr02\share1 /l /mir /r:0 /ndl
then reversing the order of \\oldsvr01 and \\oldsvr02. Each run gave me a list of all the files that were newer on the first server than on the second. I figured out which list was longer, then used a command like
robocopy \\oldsvr01\share1 \\newsvr01\share1 /r:0
to get those files into place on the new server. Unless there were exceptional cases calling for a manual copy-n-paste of one file or another, I then used a command like
robocopy \\oldsvr02\share1 \\newsvr01\share1 /e /r:0 /xo /xl
to copy just the files that were newer on oldsvr02 into the new server.
But hopefully you don't have to do any of that.
Once you've finally got all the latest files ready to go, there's just one more thing I want to mention. In my case, we've got two servers for two offices, but when I was setting them up, I had the luxury of having them both together in one office. So once I had all the files on one server on the drives I wanted them on, I just copied them right over to the second server myself, rather than asking DFS to replicate all the files (we're well over 100GB right now) to a blank share. DFS versions after the Server 2003 era allow you to "seed" the data in this way - when replication starts, it sees that the data is identical and doesn't re-send it over the network. Which is very nice. Otherwise I'd be waiting for all of that 100GB to get sent to the other office over the fairly slow internet connection.
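For the record, here's roughly the robocopy line I'd use to pre-seed the second server while both machines are still on the same LAN (the server and drive names here are made up; adjust to yours). The /copyall switch matters: it preserves attributes, timestamps, and NTFS security, which helps replication recognize the two copies as identical.

```shell
rem Mirror the data drive to the second server's admin share, preserving
rem attributes, timestamps, and security (/copyall) so the seeded copies match.
rem Skip the system folders that live in the root of every NTFS drive.
robocopy D:\ \\newsvr02\d$ /mir /copyall /r:1 /w:1 /xd "System Volume Information" "$RECYCLE.BIN"
```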
So, your data is all in place. Only one thing left to do: make sure the appropriate folders are shared and all permissions are correct.
In my case, I've got all of our data on one drive in each server. We have 8 shares, and they're all right in the root of the drive, with nothing else on the drive. So I actually go to the security settings for that drive and erase all permissions, then I go back in and set "Domain Users" and "System" to have "Full Control", and replace all permissions on all folders and files within the drive. (FYI, you have to give "System" full control if you want shadow copies, which we'll get to in the next article.) You may want more fine-grained control over permissions, which is fine. Your DFS shares also have the option of turning on "Access-Based Enumeration", which simply hides any shares or folders that a user doesn't have permission to access.
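The "grant Full Control" part of that can also be done from the command line with icacls, if you prefer. This is a sketch under the same assumptions as above (everything on one data drive, here D:); it adds the grants, so clearing out the old entries is still the GUI step described first, and I'd try it on a test folder before pointing it at the whole drive.

```shell
rem Grant Domain Users and SYSTEM full control, inherited by every
rem sub-folder (CI) and file (OI), recursing through the whole drive (/t).
icacls D:\ /grant "Domain Users":(OI)(CI)F "SYSTEM":(OI)(CI)F /t
```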
Now I went through each of our eight folders I wanted to publish as shares. I turned on sharing and made sure that the share permissions were set to allow "Domain Users" to have "Full Control". One more thing I like doing: I add a dollar sign ($) to the end of the share name...so the share name might look like "docs$". That's just a little code that tells Windows not to display the share if someone browses to the server over the network. It's still available: if they type in the path in the address bar it'll open, they just can't double-click to open it from a list.
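Creating the hidden shares themselves can also be scripted. A minimal sketch, assuming a hypothetical folder D:\docs that you want exposed as the hidden share "docs$":

```shell
rem The trailing $ keeps the share out of network browse lists.
rem /grant sets the share-level permission; NTFS permissions still apply on top.
net share docs$=D:\docs /grant:"Domain Users",FULL
```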
One more share you need: an empty folder which will serve as the "root" of your DFS share. This can be anywhere; you won't be putting any files into it. The only oddity is that if it's mapped to a drive letter on the client machine, the "free space" displayed for the drive is whatever free space there is on the drive where the root is hosted...which probably doesn't have anything to do with the space actually available for the data in the DFS shares. The share name here should be the same as the "root" you give the DFS share (see below). Don't put a dollar sign at the end of this share, since it's the one you want to actually be visible, accessible, and used.
Got all your shares set up? Great. Time to get started!
Open up your "Server Manager" window. Expand "Roles", "File Services" and "DFS Management". Right-click "Namespaces" and choose, "New Namespace".
(FYI, and this single fact is half the reason I wanted to publish this blog, the information available for Foundation Edition with regards to DFS is confusing and contradictory. Despite what anything you read implies, I'm here to tell you, and I've tried it, YOU CAN MAKE AS MANY DOMAIN-BASED NAMESPACES AS YOU WANT WITH FOUNDATION EDITION. You are limited to a single stand-alone namespace (which looks like \\server\root rather than \\domain.tld\root and is not available if that single server is not turned on), but you can have multiple domain-based namespaces). You would not believe what it took for me to get that answer.
Start by providing one of your domain controller names as the "namespace server". It doesn't matter which one; we'll add the other one to the identical configuration in a moment.
Now give the name of the root you want. This will look like the share name, if you're used to normal file sharing. So if it's mapped to Z: on users' machines, your domain is "domain.com" and you call it "Files", the "name" of the drive will look like:
Files (\\domain.com) (Z:)
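Mapping that namespace on a client works the same as mapping any other share; the only difference is that the path uses the domain name rather than a server name. Something like:

```shell
rem Map Z: to the domain-based DFS root; /persistent:yes survives reboots.
net use Z: \\domain.com\Files /persistent:yes
```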
If you don't have a share on the machine that's the same name as the root name you give it, it will prompt you to create a new share. You probably want to click "Edit Settings" and make sure it has the permissions you want it to have.
The next window will ask whether you want to create a Domain-based namespace or a Stand-alone namespace. There's almost no reason you wouldn't want a Domain-based namespace in this case, and I like the features you get if you also "Enable Windows Server 2008 mode".
The next steps will just confirm your settings and create the root.
Now we want to add your other server as a root server too, so that either can go down without any impact on your file availability. Under "Namespaces" in the "Server Manager" window, you'll now see your DFS root. Right-click it and choose, "Add Namespace Server". Give it the name of your other DC, and make sure that (if you haven't already made the root share on the second server) the shared folder it is going to create has all the right settings.
Now you're ready to start adding the targets (which will look like subfolders of your DFS share). Under "Namespaces" in the "Server Manager" window, you'll now see your DFS root. Right-click it and choose, "New Folder".
It will ask you for the name of the folder. Call it whatever name you want your users to see as a sub-folder of the DFS root...this one doesn't have to be the same as the share name. Then, choose to "Add" a folder target, and browse to the appropriate one of the shares you created earlier with your data - the ones that end in dollar signs. If you have other shares, on whatever servers, that should be kept identical to it, add those too.
If you add multiple servers, it will then ask you if you want to enable replication on the shares (you do). Set that up however you'd like, but do pay special attention to the fact that you can tell the replication how much bandwidth to use for replication at any time of any day of the week. That might be interesting to you if your servers share bandwidth, as mine do, with a VoIP system (for instance)...throttling replication back to a smaller amount of bandwidth will keep it from breaking up your voice traffic.
One more thing I like to do with multiple sites: Right-click the DFS root and go to properties. On the "Referrals" tab, I change the "ordering method" to "exclude targets outside of the client's site". That ensures that, no matter what, the clients will not be directed to open any files across the slow VPN/WAN connection. This should be unnecessary, due to the site costing and transports you set up earlier, but at least in Server 2003, I had some issues with clients getting the wrong referral, resulting in very poor performance.
WOW, that was a long one! But DFS is awesome - you'll be glad you've got it.
Setting up Sites (geographical locations)
OK, so you've got a domain. Now you need to configure it to get it all set up the way you want it. A lot of this will depend greatly on how you're going to be setting up your domain, but here's what I've done. Adjust or ignore any of my posts labeled "4. Domain configuration" as needed for your situation.
The first thing we'll do is set up what Active Directory calls "Sites". FYI, this has NOTHING to do with websites. Since I've got two physical office locations, I'm going to set up two different geographical "sites" within Active Directory, so I can configure how they talk with each other. This isn't necessary if you have only one office.
Open your Server Manager. You can either do that by closing the "Initial Configuration Tasks" window (that action causes the Server Manager to open by default), or clicking the "Server Manager" button (which should be the first pinned icon in the task bar). You can also find it in the start menu.
This is a truly useful window, a one-stop-shop of sorts where you can get all kinds of information about what your server is doing and configure it. So on the left side of the window, expand "Roles" if it's not already. One of the entries below it is "Active Directory Domain Services". Click that, and the right side populates with event notifications from the last 24 hours, a list of services associated with the role, and even suggestions for what to do next for the best practices and experiences.
For now though, just go back to the left side of the window, and keep drilling down, expanding "Active Directory Sites and Services", "Sites" and "Inter-Site Transports" in turn. Now, under "Inter-Site Transports", click "IP". The right side of the window changes to show you a single item, probably called, "Default_IP_Site_Link". This item represents the internet connection between the servers in your different locations...there are all sorts of properties you can apply to it to govern how the servers use that link.
However, that name isn't very clear on what it is, so right-click on that and rename it to something that will be useful to you - something like "Inter-office WAN link" that actually tells you what it is. If you have several locations, you can even create multiple transports to really have fine-grained control on how they talk with each other, but I'll get back to that in a minute.
Once that's renamed, just go back up a few levels on the left side of the window and click "Sites" under "Active Directory Sites and Services". Again, the right side of the window will show you the two "sub-folders" under "Sites" in addition to a single actual "Site" object. It's also named something useless like, "Default_First_Site", so right-click on it and rename it to something better, like the name of the city your first location is in. Now right-click "Sites" on the left side of the window and choose to make a "New" -> "Site" to represent your other office. Part of that process is to choose the transport to use for this office - since there's only one for now, just choose it. Repeat for as many offices as you have.
Now, go to the first site - the one you renamed. There's a "sub-folder" under that site called "Servers". You'll find your Domain Controller in here. If this is the site it is actually in, great. Otherwise, drag it out into the "Servers" sub-folder of the site it should actually serve. Come back and do this whenever you add a new Domain Controller.
If you only have two sites, this part is done now. But if you have three or more, you may want to configure each link separately. Maybe two of your sites are always online but the third is only online during business hours, for instance. To set up different rules between each of the different sites, go back to "IP" under "Inter-Site Transports" and right click to make a "New Site Link". Name it appropriately, then choose the two sites that link should govern. Then right-click the first link and remove any site that shouldn't be governed by that transport.
Now go through each of your transports...right-click them and choose "Properties". From here, you can set a schedule for which hours the servers can talk with each other over the link, assign a "cost" for each link, etc. Costing is an interesting idea that you may want to look into, even if you only have two sites, if you have multiple internet connections.
For instance, if you have one connection that's for normal traffic or VOIP traffic and a separate internet connection dedicated to the server traffic, you'd set up a transport for each connection but assign different COSTS to them. The one dedicated to the server would be the lower cost so it would get used primarily. But if that link went down, it would try using the higher cost link to make sure the data gets through.
Anyway, there's one more thing to do: tell it how to figure out automatically which site a computer is in. Each physical location is probably using its own subnet of IP addresses, assigned by the DHCP server in that office, so we tell Active Directory which subnet belongs to which site; each time a computer gets an IP address, it can then look up which site that address is in to know where it's physically located.
Back up a level again and click on "Subnets" under "Sites". Right-click it to make a "New Subnet". Now use network prefix (CIDR) notation to tell it which range of addresses belongs to which site. For instance, 192.168.1.0/24 means any address in the 192.168.1.x range.
As a side note, sites are also VERY useful for DFS shares, which we'll get to later...and this time, it's for the client computer's benefit. So it really is worth it to get this set up.
The first thing we'll do is set up what Active Directory calls "Sites". FYI, this has NOTHING to do with websites. Since I've got two physical office locations, I'm going to set up two different geographical "sites" within Active Directory, so I can configure how they talk with each other. This isn't necessary if you have only one office.
Open your Server Manager. You can either do that by closing the "Initial Configuration Tasks" window (that action causes the Server Manager to open by default), or clicking the "Server Manager" button (which should be the first pinned icon in the task bar). You can also find it in the start menu.
This is a truly useful window, a one-stop-shop of sorts where you can get all kinds of information about what your server is doing and configure it. So on the left side of the window, expand "Roles" if it's not already. One of the entries below it is "Active Directory Domain Services". Click that, and the right side populates with event notifications from the last 24 hours, a list of services associated with the role, and even suggestions for what to do next for the best practices and experiences.
For now though, just go back to the left side of the window, and keep drilling down, expanding "Active Directory Sites and Services", "Sites" and "Inter-Site Transports" in turn. Now, under "Inter-Site Transports", click "IP". The right side of the window changes to show you a single item, probably called, "Default_IP_Site_Link". This item represents the internet connection between the servers in your different locations...there are all sorts of properties you can apply to it to govern how the servers use that link.
However, that name isn't very clear on what it is, so right-click on that and rename it to something that will be useful to you - something like "Inter-office WAN link" that actually tells you what it is. If you have several locations, you can even create multiple transports to really have fine-grained control on how they talk with each other, but I'll get back to that in a minute.
Once that's renamed, just go back up a few levels on the left side of the window and click "Sites" under "Active Directory Sites and Services". Again, the right side of the window will show you the two "sub-folders" under "Sites" in addition to a single actual "Site" object. It's also named something useless like, "Default_First_Site", so right-click on it and rename it to something better, like the name of the city your first location is in. Now right-click "Sites" on the left side of the window and choose to make a "New" -> "Site" to represent your other office. Part of that process is to choose the transport to use for this office - since there's only one for now, just choose it. Repeat for as many offices as you have.
Now, go to the first site - the one you renamed. There's a "sub-folder" under that site called "Servers". You'll find your Domain Controller in here. If this is the site it is actually in, great. Otherwise, drag it out into the "Servers" sub-folder of the site it should actually serve. Come back and do this whenever you add a new Domain Controller.
If you only have two sites, this part is done now. But if you have three or more, you may want to configure each link separately. Maybe two of your sites are always online but the third is only online during business hours, for instance. To set up different rules between each of the different sites, go back to "IP" under "Inter-Site Transports" and right click to make a "New Site Link". Name it appropriately, then choose the two sites that link should govern. Then right-click the first link and remove any site that shouldn't be governed by that transport.
Now go through each of your transports...right-click them and choose "Properties". From here, you can set a schedule for which hours the servers can talk with each other over the link, assign a "cost" for each link, etc. Costing is an interesting idea that you may want to look into, even if you only have two sites, if you have multiple internet connections.
For instance, if you have one connection that's for normal traffic or VOIP traffic and a separate internet connection dedicated to the server traffic, you'd set up a transport for each connection but assign different COSTS to them. The one dedicated to the server would be the lower cost so it would get used primarily. But if that link went down, it would try using the higher cost link to make sure the data gets through.
Anyway, there's one more thing to do: tell it how to automatically figure out which site a computer is in. Each physical location is probably using its own subnet of IP addresses, assigned by the DHCP server in that office, so we tell the computers to look up which site their IP address belongs to - that way, each time a computer asks for an IP address, it learns where it is physically located.
Back up a level again and click "Subnets" under "Sites". Right-click it to make a "New Subnet". Now use network prefix (CIDR) notation to tell it which range of addresses belongs to which site. For instance, 192.168.1.0/24 means any address in the 192.168.1.x range.
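If network prefix notation is new to you, Python's standard ipaddress module is a quick way to see exactly which addresses a prefix covers (the subnet and addresses below are just the example ones from above, not anything specific to your network):

```python
import ipaddress

# 192.168.1.0/24 covers every address from 192.168.1.0 through 192.168.1.255.
site_subnet = ipaddress.ip_network("192.168.1.0/24")

print(ipaddress.ip_address("192.168.1.42") in site_subnet)   # True  - same site
print(ipaddress.ip_address("192.168.2.42") in site_subnet)   # False - different subnet
print(site_subnet.num_addresses)                             # 256 addresses in a /24
```

The "/24" just means the first 24 bits (the first three octets) are fixed; only the last octet varies, which is why a /24 holds 256 addresses.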
As a side note, sites are also VERY useful for DFS shares, which we'll get to later...and this time, it's for the client computer's benefit. So it really is worth it to get this set up.
Friday, November 19, 2010
The Biggie role: Active Directory
Congratulations! Your server should be all ready to be promoted to the first Domain Controller, thus creating your network domain!
To start, go to that "Initial Server Configuration" window and choose "Add Role". This time, select "Active Directory Domain Services".
It will give you some good general information and probably tell you there are some .NET Framework features you need to install. Let it do so. What it's actually doing is staging the installation files for Active Directory - it's not actually setting up the domain yet. One of the pieces of information it gave you was that once this is done, we'll have to run dcpromo.exe.
So, once that wizard finishes, click Start, type in "dcpromo" without the quotes, and hit Enter.
Now it asks you what domain you want to create (note that with Foundation edition, you can't make it a sub-domain - it has to be the root of your domain), then it uses DNS to go out and find out who is "in charge" of that domain. Since you've configured it to use itself as the DNS server, and you've also told it to respond to any requests for information about your domain name with information pointing to itself, it gets back the answer, "I am!". At that point, it allows you to create the domain and you're off and running. There are a few more questions it will ask you (passwords, etc.), but you should be able to walk through the wizard fairly easily. Once it's done, it will naturally require a restart, after which your domain has been created.