Jan 16 2011
6 Reasons NOT to Use Microsoft DFS Replication
Based on advice from a Microsoft techie, I decided to set up DFS (Distributed File System) on a few remote sites about a year ago. I was sceptical at first, and the product has turned out to be so disappointing that I decided to warn people who haven’t made up their minds yet.
Here are 6 good reasons NOT to use DFS Replication
No Defragmentation
After defragmenting, a new replication pass occurs, generating enough traffic to max out your line’s bandwidth for a few days if the folders are big enough.
I haven’t tried it, but I wonder whether the same thing would happen when creating or removing the disk index.
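To gauge how much replication a defrag has queued up, the ntfrsutl utility from the Windows Support Tools can dump the FRS inbound and outbound logs. This is just a quick check, and the output format varies between builds:
rem Files queued to be sent to replication partners
ntfrsutl outlog > outlog.txt
rem Files still to be received from partners
ntfrsutl inlog > inlog.txt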
File Locking
There’s no such thing as inter-site file locking, since there are two (or more) local copies of each file.
Concurrent Access
A drawback that follows from the previous point: data may be overwritten. If someone leaves a file open the whole day and saves it just before leaving the office, all changes made from another replication site in the meantime are lost.
DFS is therefore only advisable for write-protected files, or files that are modified from a single site.
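One hedged way to enforce that single-writer pattern is to leave NTFS permissions alone (FRS replicates them along with the files) and expose every replica except the designated writing site through a read-only share. The share name and path below are made up, and on Windows Server 2003 you may have to set the share permissions through Computer Management if your build of net share does not support /GRANT:
rem On every read-only replica member (hypothetical share name and path)
net share Public=D:\ReplicatedData\Public /GRANT:Everyone,READ
rem On the one member where changes are allowed
net share Public=D:\ReplicatedData\Public /GRANT:Everyone,CHANGE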
Failover
If one of the DFS servers on a site becomes unavailable, some users are still referred to that server and may lose access to their files. Wasn’t failover one of DFS’s goals?
Of course, users can still select a DFS target manually, but most of them probably don’t know the trick. The problem doesn’t affect every folder, since a DFS server is picked at random for each of them. There’s also a referral cache that defaults to 1800 seconds, i.e. 30 minutes.
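When a client is stuck on a dead target, the dfsutil tool from the Support Tools can show and flush its referral cache so the next access picks a live server. A minimal sketch, run on the client:
rem List the cached referrals and the currently active target
dfsutil /pktinfo
rem Flush the referral cache to force a new referral
dfsutil /pktflush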
Solution: upgrade to Windows Server Enterprise, which provides clustering and raises the price of your license.
Diagnostic Tools
No graphical tool (normally a Microsoft speciality) is available to monitor what’s being replicated.
There are, admittedly, the connstat.cmd and iologsum.cmd scripts written in Perl (really!?) that ship with the Microsoft Support Tools. You first need to relocate them to a path with no spaces, or you’ll get the following error:
Can't open perl script "C:\Program": No such file or directory
Microsoft posted a bug report on their website back in 2007 rather than fixing it!
You can also try replacing “@perl %~dpn0.cmd %*” with “@perl %0 %*” near the end of the script…
The scripts’ usage is not very intuitive, and the information they report is not always relevant in my opinion. Microsoft does provide usage instructions with a practical example, though.
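As a sketch of the relocation workaround (the destination folder is arbitrary, and the Support Tools path assumes a default install):
rem Copy the scripts to a path without spaces before running them
mkdir C:\frstools
copy "C:\Program Files\Support Tools\connstat.cmd" C:\frstools\
copy "C:\Program Files\Support Tools\iologsum.cmd" C:\frstools\
cd /d C:\frstools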
Bandwidth Throttling
Bandwidth throttling isn’t possible, as far as I know, on Windows Server 2003…
On top of that, stopping the DFS service does nothing to halt an ongoing replication; the traffic keeps flowing. I had to run ‘net stop ntfrs’ to stop it. Unfortunately, this also stops SYSVOL replication for Active Directory if you’re on a domain controller.
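A minimal sketch of that workaround (the service check is just a sanity step); remember that on a domain controller the same NtFrs service also replicates SYSVOL:
rem Check the current state of the File Replication Service
sc query ntfrs
rem Stop all FRS replication, DFS replica sets and SYSVOL alike
net stop ntfrs
rem Restart it once the line is free again
net start ntfrs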
In a nutshell, DFS has so many gaps and restrictions that it is difficult to use for anything but content publishing across an enterprise.
Hello,
Thank you for your article. You talk about DFS replication but don’t say whether it is FRS (the old-generation replication) or DFSR (the new generation, available from 2003 R2 I think, but maybe only since 2008).
Comparing the following article with yours: http://redmondmag.com/articles/2011/02/01/dfs-best-practices.aspx, could you tell me whether you still stand by your position?
I have to conduct a study for my internship and have been asked to replicate across two sites, but I find it hard to understand why they want DFS Replication, unless it is for backup, and then there are other options. I’ll dig into it with my boss in the meantime.
I must say that DFS replication remains unclear to me, since it doesn’t seem to support versioning…
Regards,
Julien