Fragmentation is a fact of life: no OS is capable of keeping all files contiguous. This is due to several factors:
Files grow, shrink, or are deleted and recreated, and none of this is predictable. The OS tries to keep the existing pieces of a file in place and only allocates new clusters when needed. These new clusters can lie at a completely different location on the volume, which fragments the file. Now imagine you are hosting a file server where dozens or even hundreds of users work concurrently. Fragmentation is inevitable.
Also, free space itself becomes increasingly fragmented over time as files are deleted and new ones created. This, too, is a key factor in declining file system performance.
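The churn described above can be sketched with a toy first-fit cluster allocator. This is a deliberate simplification (real file systems use far more sophisticated allocation policies), and all names, sizes, and the 64-cluster volume are illustrative assumptions, not any real OS's behavior:

```python
import random

CLUSTERS = 64             # toy volume size in clusters (illustrative assumption)
disk = [None] * CLUSTERS  # None = free cluster; otherwise the id of the owning file

def allocate(file_id, n):
    """First-fit: take the first n free clusters, wherever they happen to be."""
    got = 0
    for i in range(CLUSTERS):
        if disk[i] is None:
            disk[i] = file_id
            got += 1
            if got == n:
                return True
    return False  # volume filled up before n clusters were found

def delete(file_id):
    """Free every cluster owned by file_id, punching holes into the layout."""
    for i in range(CLUSTERS):
        if disk[i] == file_id:
            disk[i] = None

def fragments(file_id):
    """Count the contiguous runs of clusters belonging to file_id."""
    runs, prev = 0, None
    for c in disk:
        if c == file_id and prev != file_id:
            runs += 1
        prev = c
    return runs

random.seed(1)
# Simulate churn: create several files of random sizes back to back...
for fid in range(8):
    allocate(fid, random.randint(2, 6))
# ...then delete two of them, leaving holes in the middle of the volume...
delete(1)
delete(4)
# ...and create a new, larger file. First-fit scatters it into the holes.
allocate(9, 7)
print("file 9 is stored in", fragments(9), "fragment(s)")
```

Because each hole is smaller than the new file, the allocator has to split it across non-adjacent regions, which is exactly the fragmentation the article describes.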
Defragmentation is the only way to regain that performance and keep it at peak level. Remember the days when Microsoft told us that NTFS does not fragment? Nowadays they ship Windows with a lightweight defragmenter, a strong hint that this is mission critical for the OS.
Many hardware manufacturers see no benefit in regular defragmentation, since PCs today are much faster than they were years ago. But the relative performance loss caused by fragmentation remains the same.
Today's applications also access files far more frequently than they did years ago. Even on a single workstation, you will notice the renewed speed after a defragmentation.
This was first published in June 2002