catatsuy | 5 months ago
Files are hashed in parallel, so large sets can be processed quickly.
On repeated runs, a cache lets unchanged files skip hashing with a default probability of 90%. This keeps checks lightweight even at scale.
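A minimal sketch of how that might look in Go, assuming the cache keys on a file's size and modification time; the names here (hashAll, cacheEntry, skipProb) are hypothetical and not the tool's actual API:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"io"
	"math/rand"
	"os"
	"sync"
)

// cacheEntry records metadata and the hash from a previous run (assumed cache shape).
type cacheEntry struct {
	Size    int64
	ModTime int64 // mtime in nanoseconds
	Hash    string
}

// hashFile computes the SHA-256 of a file's contents.
func hashFile(path string) (string, error) {
	f, err := os.Open(path)
	if err != nil {
		return "", err
	}
	defer f.Close()
	h := sha256.New()
	if _, err := io.Copy(h, f); err != nil {
		return "", err
	}
	return hex.EncodeToString(h.Sum(nil)), nil
}

// hashAll hashes paths in parallel. Files whose size and mtime match the
// cache are skipped with probability skipProb (0.9 per the comment above),
// so a random ~10% of "unchanged" files still get re-verified each run.
func hashAll(paths []string, cache map[string]cacheEntry, skipProb float64, workers int) map[string]string {
	jobs := make(chan string)
	results := make(map[string]string)
	var mu sync.Mutex
	var wg sync.WaitGroup
	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for path := range jobs {
				info, err := os.Stat(path)
				if err != nil {
					continue // skip unreadable entries
				}
				if prev, ok := cache[path]; ok &&
					prev.Size == info.Size() &&
					prev.ModTime == info.ModTime().UnixNano() &&
					rand.Float64() < skipProb {
					// Metadata unchanged: reuse the cached hash.
					mu.Lock()
					results[path] = prev.Hash
					mu.Unlock()
					continue
				}
				sum, err := hashFile(path)
				if err != nil {
					continue
				}
				mu.Lock()
				results[path] = sum
				mu.Unlock()
			}
		}()
	}
	for _, p := range paths {
		jobs <- p
	}
	close(jobs)
	wg.Wait()
	return results
}

func main() {
	hashes := hashAll(os.Args[1:], map[string]cacheEntry{}, 0.9, 4)
	for p, h := range hashes {
		fmt.Println(h, p)
	}
}
```

The point of skipping only probabilistically, rather than always trusting the metadata, is that files which changed without touching size or mtime are still eventually re-hashed and caught on later runs.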