We are all afraid of losing important data, right? So we back up our stuff to external hard drives (or NAS systems, or the cloud). The problem with external hard drives: they fail, in unexpected ways. If you hear the Click of Death, it's probably too late. Flipped bits, corrupted filesystems, lost files: you'd better not trust your external hard drive. But how can you find out whether all the files living on a hard drive are still readable and uncorrupted?
I wrote a tool, "Backcheck", that does this. It takes two directories, "SRC" and "DEST", and compares them with each other, recursively. For every file in SRC, it checks that the corresponding file in DEST exists, has the same length, and has the same checksum. To calculate the checksums, Backcheck reads both the SRC file and the DEST file in their entirety. That can be slow for big files and directory structures, but hey: you want to be sure that every bit is in its right place, right?
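To make the comparison concrete, here is a minimal Java sketch of that per-file check. It is not Backcheck's actual code: the checksum algorithm (SHA-256 here) is my assumption, and the sketch uses `java.nio.file` APIs from Java 7+, newer than the Java 1.6 baseline the tool itself targets.

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class CompareSketch {
    // Hex-encoded digest of a whole file, read in its entirety.
    static String checksum(Path file) throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        try (InputStream in = Files.newInputStream(file)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                md.update(buf, 0, n);
            }
        }
        StringBuilder sb = new StringBuilder();
        for (byte b : md.digest()) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    // A SRC file matches its DEST counterpart if DEST exists,
    // has the same length, and has the same checksum.
    static boolean matches(Path src, Path dest) throws IOException, NoSuchAlgorithmException {
        return Files.exists(dest)
                && Files.size(src) == Files.size(dest)
                && checksum(src).equals(checksum(dest));
    }

    public static void main(String[] args) throws Exception {
        Path a = Files.createTempFile("src", ".bin");
        Path b = Files.createTempFile("dest", ".bin");
        Files.write(a, "hello".getBytes());
        Files.write(b, "hello".getBytes());
        System.out.println(matches(a, b)); // true: same length, same bytes
        Files.write(b, "hellO".getBytes());
        System.out.println(matches(a, b)); // false: same length, different checksum
    }
}
```

Note the cheap checks (existence, length) come first, so the expensive full read only happens when they pass.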
Backcheck also supports a record/verify mode: Backcheck records each file's checksum and stores it in a record file. Later, you can start Backcheck in verify mode and point it at the record file. Backcheck then checks the original files' checksums against what was stored in the record file. Record/verify is especially useful for archives, that is, directories that you move to your external hard drive and then delete from your main computer's disk.
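The record/verify idea can be sketched in a few lines of Java. Everything here is illustrative, not Backcheck's actual implementation: the record-file format (one tab-separated "checksum, relative path" line per file) and the SHA-256 algorithm are my assumptions, and the sketch again uses Java 7/8 APIs rather than the 1.6 APIs the tool targets.

```java
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class RecordVerifySketch {
    static String checksum(Path file) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        md.update(Files.readAllBytes(file)); // fine for a sketch; real code would stream
        StringBuilder sb = new StringBuilder();
        for (byte b : md.digest()) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    // Record mode: one "checksum<TAB>relative-path" line per file under root.
    static void record(Path root, Path recordFile) throws Exception {
        List<Path> files;
        try (Stream<Path> s = Files.walk(root)) {
            files = s.filter(Files::isRegularFile).collect(Collectors.toList());
        }
        List<String> lines = new ArrayList<>();
        for (Path p : files) {
            lines.add(checksum(p) + "\t" + root.relativize(p));
        }
        Files.write(recordFile, lines, StandardCharsets.UTF_8);
    }

    // Verify mode: recompute each checksum and compare to the recorded one.
    // Returns the relative paths of missing or corrupted files.
    static List<String> verify(Path root, Path recordFile) throws Exception {
        List<String> mismatches = new ArrayList<>();
        for (String line : Files.readAllLines(recordFile, StandardCharsets.UTF_8)) {
            String[] parts = line.split("\t", 2);
            Path p = root.resolve(parts[1]);
            if (!Files.isRegularFile(p) || !checksum(p).equals(parts[0])) {
                mismatches.add(parts[1]);
            }
        }
        return mismatches;
    }
}
```

The key property for archives: once the record file exists, verification needs only the archive copy, because the original files on the main computer are gone.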
Backcheck is a command-line application, which means it's easily scriptable.
It's written in Java, so it should run on any system that supports Java. The minimum required Java version is 1.6. Check it out on GitHub: