Auto-mount VHD Disks

On Server 2008 R2 and newer, VHD files can be mounted using Windows Disk Management MMC (diskmgmt.msc):

[Screenshot: vhd1]

You can mount a VHD using the “Action” Menu:

[Screenshot: vhd2]

But as soon as the server is restarted, you need to re-mount the VHD file manually. Microsoft provides no way to do this automatically using onboard tools, so you have the following choices:

  1. either create a startup script that calls diskpart.exe to attach the VHD file at boot (a sketch follows below),
  2. or use the free and easy VHD Attach tool: http://www.jmedved.com/vhdattach
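For option 1, here is a minimal sketch of what such a startup script could look like. The VHD path, the script location and the task name are placeholders for illustration; the idea is to write a small diskpart script once and let a scheduled task run it at every system start:

# Sketch: auto-attach a VHD at system start via diskpart and a scheduled task.
# Path, script location and task name are examples only -- adjust to your environment.
$vhdPath    = 'C:\VHDs\Data.vhd'
$scriptPath = 'C:\VHDs\attach-data-vhd.txt'

# diskpart script that selects and attaches the VHD
@"
select vdisk file="$vhdPath"
attach vdisk
"@ | Set-Content -Path $scriptPath -Encoding ASCII

# run diskpart with this script at every system start (under SYSTEM)
schtasks.exe /Create /TN "Attach Data VHD" /SC ONSTART /RU SYSTEM /TR "diskpart.exe /s $scriptPath"

On Server 2012 and newer, the scheduled task could also call Mount-DiskImage (or Mount-VHD with the Hyper-V module) instead of diskpart.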

The VHD Attach tool allows you to open existing VHD files and select “Auto-mount” (see screenshot):

[Screenshot: vhd3]

I mounted a Backup Exec dedup store as a VHD file. In my case, it was also necessary to re-configure the Backup Exec dedup service to “Automatic (Delayed Start)” in services.msc, so the service only starts once the VHD has been attached.

Windows Defragmentation Decision Process

Defrag in Windows Server 2012 does not simply defragment volumes like in earlier versions. There is a decision process behind it that selects the appropriate method for each volume.

Decision Process

The following commands are based on the new Optimize-Volume PowerShell cmdlet. Most of the parameters correspond to defrag.exe’s parameters. The decision process works like this:

# For HDD, Fixed VHD, Storage Space (drive letter D used as an example):
Optimize-Volume -DriveLetter D -Analyze -Defrag

# Tiered Storage Space:
Optimize-Volume -DriveLetter D -TierOptimize

# SSD with TRIM support:
Optimize-Volume -DriveLetter D -ReTrim

# Storage Space (thinly provisioned), SAN virtual disk (thinly provisioned), Dynamic VHD, Differencing VHD:
Optimize-Volume -DriveLetter D -Analyze -SlabConsolidate -ReTrim

# SSD without TRIM support, Removable FAT, Unknown:
# No operation.

Graphical Defrag Tool

The classic GUI tool for defrag still exists. If you open it, you’ll see there’s a predefined schedule for a weekly defragmentation of your system volume. Depending on the type of storage you’re using, defrag will only run a short TRIM or another optimization at that time. In virtualized environments, you typically have thin provisioned storage, either from vSphere or from the storage array. Because of this, defrag will no longer start a classical defragmentation on VMs. Instead, a Re-Trim / slab consolidation is started, which takes only a few seconds or minutes to complete (depending on size).

PowerShell cmdlet

Server 2012 and 2012 R2 also have a PowerShell cmdlet called “Optimize-Volume” that can be used instead of the classic defrag.exe tool. Both handle the same functions; in addition, the cmdlet has a storage tier optimization function for Storage Spaces.
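If you want to see which of the operations above is actually chosen for a given volume, you can run the cmdlet with the common -Verbose switch and watch the phases it reports. A quick example (drive D is just a placeholder):

# Run the optimization manually and watch which phases are chosen for this volume
Optimize-Volume -DriveLetter D -Verbose

# Analyze only, without performing any optimization
Optimize-Volume -DriveLetter D -Analyze -Verbose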

Information about the cmdlet is here:
http://technet.microsoft.com/en-us/library/hh848675.aspx

Search Service Cluster Edition

Windows Server File Services is a classic, well-known service from Microsoft. Since Windows 7, if you use the search bar at the top of every Windows Explorer window, your file server will respond very quickly with a result, but only if the Search Service is installed. If not, you’ll see a slow, long-running search that displays one found file after another.

Setup Steps

If you follow this order of steps, you’ll have success:

  • Configure Search Service as described below
  • Move Clustered File Server with all Drives to the other Node
  • Configure Search Service on other Node(s)
  • Set up the Clustered Service (details at the end of the article)

Here are the detailed configuration steps:

Search Service Configuration

Because the Search index can be used for multiple drives of a file server / cluster, we will use an additional, clustered drive with the letter S. The following configuration steps must be done on both cluster nodes individually, while the file server cluster role is active on that node.

  • Create the folder S:\Search, if it doesn’t already exist
  • Stop service “Windows Search” and set startup type to “manual”

To force Windows to use the new search index location, even after an index reset, the following registry values must be modified:

HKLM\Software\Microsoft\Windows Search\
DataDirectory -> S:\Search\Data\
DefaultDataDirectory -> S:\Search\Data\

  • Start the service “Windows Search”
  • Check the folder content: did the Search service put some files there?
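Put together, the per-node steps can also be scripted. A minimal sketch, assuming the clustered drive S: and the registry values listed above (run it while the file server role is online on this node; WSearch is the service name of Windows Search):

# Sketch: per-node Windows Search configuration for a clustered index on S:
New-Item -Path 'S:\Search' -ItemType Directory -Force | Out-Null

# stop the service and set it to manual start (the cluster will control it later)
Stop-Service -Name WSearch
Set-Service  -Name WSearch -StartupType Manual

# point the search index to the clustered drive
$key = 'HKLM:\SOFTWARE\Microsoft\Windows Search'
Set-ItemProperty -Path $key -Name DataDirectory        -Value 'S:\Search\Data\'
Set-ItemProperty -Path $key -Name DefaultDataDirectory -Value 'S:\Search\Data\'

# start the service again and check that it created files below S:\Search
Start-Service -Name WSearch
Get-ChildItem 'S:\Search' -Recurse | Select-Object -First 10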

Now we configure the folders to be indexed. The easiest way would be using the GUI in control panel. For easy access, just create a desktop icon for this command:

control /name Microsoft.IndexingOptions

  • click on “modify” to de-select existing indexed locations
  • add all to-be-indexed shares
  • stop Windows Search Service

Configuration complete – on this node. Now the same steps are required on the other node too.

Setup Clustered Generic Service

After configuring both nodes with the steps above, we can create a clustered Generic Service for Windows Search (a PowerShell alternative is sketched after the list).

  • start Failover Cluster Manager
  • Add a “Generic Service” under your fileserver’s Role
  • Open Properties of the new Service and add a Dependency for Drive S:
  • right-click the Search service and choose “Bring Online” to start it
  • test that failover works by failing over and re-checking the search configuration
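The same can be done with the FailoverClusters PowerShell module. A sketch, assuming the file server role is named “FileServer1” and the clustered disk resource is named “Cluster Disk S” (both names are placeholders for your own):

# Sketch: create the Generic Service resource for Windows Search with PowerShell
Import-Module FailoverClusters

# add a Generic Service resource to the existing file server role
$res = Add-ClusterResource -Name 'Windows Search' -ResourceType 'Generic Service' -Group 'FileServer1'

# tell the resource which Windows service it controls
$res | Set-ClusterParameter -Name ServiceName -Value 'WSearch'

# make it depend on the clustered drive S: and bring it online
Add-ClusterResourceDependency -Resource 'Windows Search' -Provider 'Cluster Disk S'
Start-ClusterResource -Name 'Windows Search'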

Done.

Quick Guide to Setup Dynamic Access Control

Dynamic Access Control (DAC) makes it possible to deploy file or object permissions based on claims instead of access control lists (ACLs). As claims, we use pre-defined properties from Active Directory user objects such as Department, Title or Manager. Almost everything from Active Directory can be used. Configuring DAC works like this:

Let’s assume you want to restrict access to all folders with “Top Secret” confidentiality to users in the department “Agents” of the company “Investigate&Co”.

(1) Define Claims.

Open AD Administrative Center and go to Dynamic Access Control, where you open Claim Types. Now add all Properties you wish to use for filtering and grouping all of your Employees.

Claims will be used to give permissions. For our example, we add Department and Company.

[Screenshot 1]
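The same claim types can also be created with the ActiveDirectory PowerShell module. A minimal sketch for our two example claims, sourced from the matching AD user attributes:

# Sketch: create claim types for Department and Company from the AD user attributes
Import-Module ActiveDirectory

New-ADClaimType -DisplayName 'Department' -SourceAttribute 'department'
New-ADClaimType -DisplayName 'Company'    -SourceAttribute 'company'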

(2) Define Resource Properties.

Still in the AD Administrative Center, select Resource Properties in the left navigation tree. A list of available properties appears that can be enabled for use. For our example, enable Confidentiality and right-click it to edit its properties. At the bottom, there’s a dialog to add Suggested Values. Hit “Add” and create an additional “Top Secret” entry with the value “4000” (just greater than the highest existing value).

[Screenshot 2]

(3) Configure a Resource Property List.

There’s already a default “Global Resource Property List” that contains most of the available default values. Just ensure the “Confidentiality” property is listed there.

[Screenshot 3]

(4) Create a Central Access Rule.

File system permissions are no longer configured on the file server itself; the cool thing is that they are now configured centrally and maintained in a general way. So we no longer define an access policy for folder XY and make decisions about recursion like we did in the past. What we do now is define what kind of data can be accessed by what type of user.

For our example, we create this rule: objects classified as “Top Secret” can only be accessed by users who are members of the department “Agents” and work for the company “Investigate&Co”. The rule looks like this:

[Screenshot 4]

(5) Create a Central Access Policy.

A policy is a collection of all or many Central Access Rules. You assign a policy to a file server. With this functionality, you can define multiple and/or different policies for different servers.

[Screenshot 5]
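The policy part can also be scripted with the ActiveDirectory module. A sketch, assuming the Central Access Rule was already created in the AD Administrative Center and is named “Top Secret - Agents only” (a placeholder name), and the policy is called “File Server Policy”:

# Sketch: bundle the existing Central Access Rule into a Central Access Policy
Import-Module ActiveDirectory

# the rule name must match the rule created above (placeholder name here)
$rule = Get-ADCentralAccessRule -Filter "Name -eq 'Top Secret - Agents only'"

New-ADCentralAccessPolicy -Name 'File Server Policy'
Add-ADCentralAccessPolicyMember -Identity 'File Server Policy' -Members $rule.Name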

(6) Configure Kerberos and KDC to support claims and Kerberos armoring.

Edit the Default Domain Controllers Policy to enable the following KDC and Kerberos settings.

[Screenshot 6]

…and for Kerberos too…

[Screenshot 7]

(the difference between these two pictures is the selection of KDC vs. Kerberos in the left tree)

(7) Assign the Policy to the Fileserver.

Create a GPO either at the domain root and use security filtering, or link it directly to an OU that only contains the file server(s). This GPO gets the following settings.

[Screenshot 8]

Under Computer Configuration/Windows Settings/Security Settings/File System, there’s a “Central Access Policy” Setting where you can define the DAC Policy we just created.

(8) Setup Fileserver, Verify Permissions.

Your file server needs the “File Server” role and “File Server Resource Manager” (FSRM) installed to get the Classification tab on folders. Using “gpupdate /force”, we ensure the new policy gets downloaded.
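Installing the required role services with PowerShell looks roughly like this (Server 2012 feature names):

# Install the File Server role service and File Server Resource Manager incl. tools
Install-WindowsFeature FS-FileServer, FS-Resource-Manager -IncludeManagementTools

# afterwards, pull the new group policy containing the Central Access Policy
gpupdate /force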

To verify our example rule is running, I created a folder called “Obama” and set the Confidentiality manually to “Top Secret”.

[Screenshot 9]

This isn’t the way you will set the properties on your data (mind the huge effort of doing this on big file servers). In production, you create rules in FSRM to automatically classify data. But that is another story.

So after classifying my folder, let’s use the good old “Effective Access” tab in the Advanced Security Settings of the folder to verify access for my user “Agent007”, who is in department “Agents” and works for company “Investigate&Co”, and for “JuniorAgent123”, who is in department “Junior Agents”.

Folder NTFS Security Settings:

[Screenshot 10]

Effective Permissions for Agent007:

[Screenshot 11]

Effective Permissions for JuniorAgent123:

[Screenshot 12]

Conclusion

NTFS permissions give users their basic permissions on objects, and DAC is used to restrict access with classification rules on top of that.

(9) Using the File Classification Infrastructure of FSRM

If you open FSRM under the “Classification Management” tree, you’ll see our enabled “Confidentiality” property listed here as a property with “Global” scope.

[Screenshot 13]

The Classification Rules tree is empty by default; you can create rules here to classify files and folders. For example, a rule could classify files by scanning their content for credit card numbers and assign them the classification “Financial Data”. For examples and templates, there’s a downloadable package from Microsoft:

[Screenshot 14]

http://www.microsoft.com/en-us/download/details.aspx?id=27123

This Solution Accelerator is designed to help enable an organization to identify, classify, and protect data on their file servers. The out-of-the-box classification and rule examples help organizations build and deploy their policies to protect critical information on the file servers in their environment.

JetPack for DHCP DB maintenance missing?

During my Server 2012 learning courses, I just tried to do a DHCP database maintenance using JetPack. I really couldn’t find that executable, so I also tried the same under Server 2008 R2. No success. Know why? JetPack is only installed in combination with the WINS role. Who still uses WINS?!? (Sorry for that.)

So if you don’t want to install the WINS Role only to get the JetPack executable back, there is one other way.

  1. Open Explorer, Browse to %windir%\System32
  2. Use the Search Box and enter “JetPack”
  3. Copy the executable to %windir%\System32\dhcp
  4. Run your maintenance (see the example below)
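The maintenance itself follows KB145881: stop the DHCP Server service, compact the database with jetpack, and start the service again. A sketch, run from an elevated PowerShell prompt:

# Sketch: compact the DHCP database with the copied jetpack.exe (per KB145881)
Set-Location "$env:windir\System32\dhcp"

Stop-Service DHCPServer                # stop the DHCP Server service
.\jetpack.exe dhcp.mdb tmp.mdb         # compact dhcp.mdb via a temporary database
Start-Service DHCPServer               # start the DHCP Server service again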

Sources:

TechNet article: Jetpack.exe on Windows 2008 server

KB145881: How to Use Jetpack.exe to Compact a WINS or DHCP Database

Boot directly from a VHD file

Yes, I knew about mounting VHD files as drives in Windows 8/2012, and ISO files can also be mounted directly in Windows Explorer. But here’s a very easy way to mount a VHD AND directly add the VHD image to the BCD boot menu to boot from:

(1) Copy the Extracted VHD file to C:\BootVHD\Server2012.vhd

(2) Mount the copied VHD file as a virtual Drive Letter

  • Right-click on the “Command Prompt” shortcut and select “Run as Administrator”
  • run “DISKPART.EXE” from the Command Prompt
  • At the “DISKPART>” prompt type the following commands, pressing Enter after each:
  • SELECT VDISK FILE=”C:\BootVHD\Server2012.vhd”
  • ATTACH VDISK
  • EXIT

(3) Wait for the VHD file to be mounted as a new drive letter. When completed, this new drive letter will show up in “My Computer” and “Windows Explorer”.

(4) Add a new OS Boot Menu Choice for Windows Server 2012

  • Right-click on the “Command Prompt” shortcut and select “Run as Administrator”
  • Run “BCDBOOT <mounted_drive_letter>:\WINDOWS” from the Command Prompt
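Optionally, you can verify that the new boot entry was created before rebooting; the VHD-based entry should show its device pointing to the .vhd file:

# list all boot entries and check for the new VHD-based OS loader entry
bcdedit /enum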

(5) Reboot and select “Windows Server 2012” from the OS boot menu that is displayed

Done.

Source: http://blogs.technet.com/b/keithmayer/p/earlyexpertlabsetup.aspx#.UYtQlsp0bIg

Dedup Report on Server 2012

Some of you may already have heard of this outstanding new feature coming with Windows Server 2012. I used the chance to test de-duplication with 4 TB of backup data on a server at my workplace.

BTW: don’t confuse it with the “nobody used that” feature on Server 2008. Server 2012 has a real dedup function now.

Total size of physical Partition: 5.46 TB
Dedup started: 21-December-2012 / 21.12.2012 (dates are in dd.mm.yyyy format)
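For reference, a rough sketch of how a data volume is prepared for dedup and how figures like the ones in the table below can be read. The drive letter E: is a placeholder; the cmdlets ship with the Server 2012 Data Deduplication feature, and exact property names may vary between builds:

# Install the Data Deduplication role service and enable dedup on the data volume
Install-WindowsFeature FS-Data-Deduplication
Enable-DedupVolume -Volume E:

# kick off an optimization job manually instead of waiting for the schedule
Start-DedupJob -Volume E: -Type Optimization

# read the savings counters (as used for the table below)
Get-DedupStatus -Volume E: | Format-List *
Get-DedupVolume -Volume E: | Format-List *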

Date         Free Space   Used Space   Unoptimized Size   Saved Space   Savings Rate   InPolicy Files   Optimized Files
21.12.2012    375 GB      5090 GB      5090 GB               0 GB          0 %           18479                0
09.01.2013   1120 GB      4340 GB      5320 GB             999 GB         18 %           17751             8368
13.01.2013   3330 GB      2120 GB      6070 GB            3950 GB         65 %           18180            18186
16.01.2013   3210 GB      2250 GB      5930 GB            3680 GB         62 %           18001            18003
20.01.2013   3340 GB      2110 GB      5660 GB            3540 GB         62 %           18623            18627
23.01.2013   3220 GB      2240 GB      5560 GB            3320 GB         59 %           18410            18414
27.01.2013   3310 GB      2150 GB      5750 GB            3600 GB         62 %           18668            18676
30.01.2013   3150 GB      2310 GB      5610 GB            3300 GB         58 %           18045            18031

UnoptimizedSize equals the real size of the data on the volume, i.e. the space the files would occupy without deduplication.

[Screenshot: 2-4-2013 10-03-58 AM]

After about three weeks of warm-up time, dedup started being very efficient on January 13. Since then, data has been written and deleted daily, but it seems there was no performance or free space bottleneck in that time.