Finished a piece of code that determines the effective timestamp resolution of a file system.

Every file system keeps track of when a file was created and when it was last modified. These timestamps are natural properties of a file, and they are readily available for inspection in Windows Explorer or any file manager of your choice.

The less obvious aspect of timestamping is that timestamps have resolution. For example, FAT tracks file modification with a measly 2-second precision. In comparison, NTFS uses 100 ns (nanosecond) precision.

It is very typical for backup software to rely on timestamps to determine whether a file has been modified and requires copying. But if the source file sits on an NTFS volume and the backup goes onto a FAT disk, then comparing timestamps directly simply won't work, because the FAT timestamp will be rounded up to a 2-second mark.

In other words, timestamp granularity needs to be taken into account when comparing timestamps. The question is how to determine exactly what it is.
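For illustration, a granularity-aware comparison might look like the sketch below. The function name and the specific choice of tolerating the coarser of the two resolutions are my own; the post doesn't spell out Bvckup 2's exact comparison rule.

```python
# Timestamp resolutions in nanoseconds, per the figures above.
NTFS_MTIME_RES_NS = 100              # 100 ns
FAT_MTIME_RES_NS = 2_000_000_000     # 2 s

def same_mtime(src_ns: int, dst_ns: int,
               src_res_ns: int, dst_res_ns: int) -> bool:
    """Treat two modification times as equal if they differ by no
    more than the coarser of the two volumes' resolutions."""
    tolerance = max(src_res_ns, dst_res_ns)
    return abs(src_ns - dst_ns) <= tolerance

# An NTFS mtime and its FAT copy, rounded up to the next 2 s mark:
src = 1_300_000_000   # 1.3 s past some epoch, on a 100 ns grid
dst = 2_000_000_000   # what FAT stored after rounding up to 2 s
print(same_mtime(src, dst, NTFS_MTIME_RES_NS, FAT_MTIME_RES_NS))  # True
```

A naive `src == dst` check here would flag the file as modified on every run and recopy it forever.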

First of all, the granularity of creation and modification times can differ. FAT tracks them at 10 ms and 2 s respectively, while NTFS keeps both at 100 ns.

To complicate matters, there appear to be NAS devices that report their file system as NTFS while in reality having a granularity that is not 100 ns.

There are also NAS devices that mangle timestamps so spectacularly that it can't be explained in any rational way. Like rounding them up to 1 sec and then subtracting 1 ms.

So what Bvckup 2 does is probe the file system and attempt to determine the effective resolution of both timestamps. This involves dropping a temporary file, trying to set its timestamps to this, that and the other, and seeing what they end up at. From that it's possible to deduce the resolution.
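Here is a minimal sketch of the probing idea. Bvckup 2 itself is native Windows code and the post doesn't describe its exact deduction step; this version sets a few deliberately awkward timestamps on a scratch file, reads back what the file system actually stored, and takes the GCD of the readbacks as the effective grid.

```python
import math
import os
import tempfile

def probe_mtime_resolution_ns(directory: str) -> int:
    """Estimate the effective mtime resolution, in nanoseconds,
    of the file system hosting `directory`."""
    fd, path = tempfile.mkstemp(dir=directory)
    os.close(fd)
    try:
        observed = []
        # Odd nanosecond counts that no coarse-grained file system
        # can store exactly; whatever grid it uses will show up as
        # truncation or rounding in the readback.
        for probe_ns in (1_234_567_891_234_567_891,
                         987_654_321_987_654_321,
                         1_111_111_111_111_111_111):
            os.utime(path, ns=(probe_ns, probe_ns))
            observed.append(os.stat(path).st_mtime_ns)
        # Every stored value is a multiple of the grid, so the GCD
        # of the readbacks approximates the resolution.
        return math.gcd(*observed)
    finally:
        os.remove(path)
```

On a nanosecond-capable volume this typically comes back as 1; on NTFS-like storage as 100; on FAT as a multiple of the 2 s grid.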

In cases when such probing fails, the app falls back to guessing the resolution from the file system name. It is also possible to override the resolution values via the config file, just in case.
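The name-based fallback could be as simple as a lookup table. The FAT and NTFS figures below are the ones quoted earlier in the post; the table shape, the function, and the coarse 2 s default for unrecognized file systems are my own assumptions.

```python
# Fallback resolution guesses by file system name, in nanoseconds,
# as (creation, modification) pairs.
FALLBACK_RES_NS = {
    "FAT":   (10_000_000, 2_000_000_000),  # 10 ms created, 2 s modified
    "FAT32": (10_000_000, 2_000_000_000),
    "NTFS":  (100, 100),                   # 100 ns for both
}

def guess_resolution(fs_name: str) -> tuple[int, int]:
    # For unknown names, err on the coarse side: assume a 2 s grid
    # for both timestamps rather than risk missing modifications.
    return FALLBACK_RES_NS.get(fs_name.upper(),
                               (2_000_000_000, 2_000_000_000))
```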

Additionally, the program keeps track of the maximum observed timestamp difference for the backup volume. That is, when replicating timestamps, it makes a note of the values it asks to be set and the values that actually end up recorded for a file or folder.

This difference is used as a lower threshold for the probed and estimated resolutions, and over time it converges to the actual volume resolution.
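That bookkeeping can be sketched like so. The class and its names are hypothetical; only the idea of clamping the estimate from below by the worst observed set-vs-recorded discrepancy comes from the text above.

```python
class VolumeResolution:
    """Tracks the effective timestamp resolution of one backup volume."""

    def __init__(self, probed_res_ns: int):
        self.probed_res_ns = probed_res_ns  # from probing or fs-name guess
        self.max_seen_diff_ns = 0           # worst |requested - recorded| so far

    def note_replication(self, requested_ns: int, recorded_ns: int) -> None:
        # Called after each timestamp copy: compare the value we asked
        # to set with the value the volume actually stored.
        diff = abs(requested_ns - recorded_ns)
        self.max_seen_diff_ns = max(self.max_seen_diff_ns, diff)

    def effective_res_ns(self) -> int:
        # The observed difference acts as a lower threshold for the
        # probed/estimated value, converging on the real resolution.
        return max(self.probed_res_ns, self.max_seen_diff_ns)
```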
Made by Pipemetrics in Switzerland