Hasher-Server

Chunk-based remote file integrity checking for large transfers.

A Go HTTP server that compares large remote files to local ones by hashing byte ranges — designed for unreliable connections and very large archives.

Go · HTTP · Linux

Problem

  • Large file transfers can become partially corrupted without obvious failure signals.
  • Re-downloading multi‑GB archives is expensive when only small ranges are damaged.

Solution

  • Exposed an HTTP interface that hashes file chunks (byte ranges) and compares results (see the handler sketch after this list).
  • Enabled verification and targeted re-transfer of only the affected ranges.
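
A minimal sketch of how such a range-hashing endpoint could look in Go, assuming a hypothetical /hash route with path, offset, and length query parameters and SHA-256 as the digest (the project's actual route names, parameters, and hash algorithm may differ):

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"io"
	"net/http"
	"os"
	"strconv"
)

// hashRange computes the SHA-256 of `length` bytes starting at `offset`.
func hashRange(path string, offset, length int64) (string, error) {
	f, err := os.Open(path)
	if err != nil {
		return "", err
	}
	defer f.Close()

	h := sha256.New()
	// SectionReader restricts reading to exactly the requested byte range.
	if _, err := io.Copy(h, io.NewSectionReader(f, offset, length)); err != nil {
		return "", err
	}
	return hex.EncodeToString(h.Sum(nil)), nil
}

// Hypothetical endpoint: GET /hash?path=<file>&offset=<bytes>&length=<bytes>
func main() {
	http.HandleFunc("/hash", func(w http.ResponseWriter, r *http.Request) {
		q := r.URL.Query()
		offset, err1 := strconv.ParseInt(q.Get("offset"), 10, 64)
		length, err2 := strconv.ParseInt(q.Get("length"), 10, 64)
		if err1 != nil || err2 != nil {
			http.Error(w, "offset and length must be integers", http.StatusBadRequest)
			return
		}
		sum, err := hashRange(q.Get("path"), offset, length)
		if err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		fmt.Fprintln(w, sum)
	})
	http.ListenAndServe(":8080", nil)
}
```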

Highlights

  • Chunk/range hashing for terabyte-scale workflows
  • Simple HTTP interface for easy integration (see the client sketch after this list)
  • Designed around unreliable networks and partial corruption
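
To illustrate the integration story, here is a sketch of a client that hashes local 64 MiB chunks and compares them against the hypothetical /hash endpoint from the sketch above; the chunk size, file paths, and server address are placeholders, not the project's real defaults:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"io"
	"net/http"
	"os"
	"strings"
)

const chunkSize int64 = 64 << 20 // 64 MiB per chunk; tune to the workload

func main() {
	localPath, remotePath := "archive.tar", "archive.tar" // placeholder paths
	f, err := os.Open(localPath)
	if err != nil {
		panic(err)
	}
	defer f.Close()
	info, err := f.Stat()
	if err != nil {
		panic(err)
	}

	for offset := int64(0); offset < info.Size(); offset += chunkSize {
		length := info.Size() - offset
		if length > chunkSize {
			length = chunkSize
		}

		// Hash the local byte range.
		h := sha256.New()
		if _, err := io.Copy(h, io.NewSectionReader(f, offset, length)); err != nil {
			panic(err)
		}
		local := hex.EncodeToString(h.Sum(nil))

		// Ask the server for the hash of the same range.
		url := fmt.Sprintf("http://server:8080/hash?path=%s&offset=%d&length=%d",
			remotePath, offset, length)
		resp, err := http.Get(url)
		if err != nil {
			panic(err)
		}
		body, _ := io.ReadAll(resp.Body)
		resp.Body.Close()

		if local != strings.TrimSpace(string(body)) {
			fmt.Printf("mismatch at bytes %d-%d: re-transfer this range\n",
				offset, offset+length-1)
		}
	}
}
```

Only the mismatched ranges then need to be fetched again (for example with HTTP Range requests), which is the targeted re-transfer described above.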

Screenshots

Screenshots and a short GIF walkthrough will be added once a stable demo/deploy target is confirmed.

What I learned

  • Range-based workflows benefit from careful input validation and bounds checks (see the validation sketch after this list).
  • Operational usability matters: clear parameters, predictable performance, and logs.
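
As a concrete illustration of the first point, a small validation helper might reject malformed ranges before any file I/O happens; the parameter names follow the hypothetical /hash sketch above and are not the project's actual API:

```go
package hasher

import (
	"fmt"
	"net/http"
	"strconv"
)

// parseRange validates the hypothetical offset/length query parameters against
// the file size before any hashing happens. Negative offsets, non-positive
// lengths, and ranges that run past the end of the file are rejected up front.
func parseRange(r *http.Request, fileSize int64) (offset, length int64, err error) {
	offset, err = strconv.ParseInt(r.URL.Query().Get("offset"), 10, 64)
	if err != nil || offset < 0 {
		return 0, 0, fmt.Errorf("invalid offset")
	}
	length, err = strconv.ParseInt(r.URL.Query().Get("length"), 10, 64)
	if err != nil || length <= 0 {
		return 0, 0, fmt.Errorf("invalid length")
	}
	if offset >= fileSize || length > fileSize-offset {
		return 0, 0, fmt.Errorf("range %d+%d exceeds file size %d", offset, length, fileSize)
	}
	return offset, length, nil
}
```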

Notes

  • This is an open-source personal project (not client work).

Links